Topological Quantum Computing Market 2020 Size by Product Analysis, Application, End-Users, Regional Outlook, Competitive Strategies and Forecast to…

New Jersey, United States - Market Research Intellect aggregates the latest research on the Topological Quantum Computing Market to provide a concise overview of market valuation, industry size, SWOT analysis, revenue approximation, and regional outlook for this business vertical. The report addresses the major opportunities and challenges faced by competitors in this industry and presents the existing competitive landscape and the corporate strategies implemented by Topological Quantum Computing market players.

The Topological Quantum Computing market report gathers the key trends influencing the growth of the industry with respect to competitive scenarios and the regions in which the business has been successful. The study also analyzes the various limitations of the industry and uncovers opportunities to establish a growth process. In addition, the report includes comprehensive research on industry changes caused by the COVID-19 pandemic, helping investors and other stakeholders make informed decisions.

Unveiling a brief about the Topological Quantum Computing market competitive scope:

The report includes pivotal details about the manufactured products, in-depth company profiles, remuneration, and other production patterns.

The research study encompasses information pertaining to the market share that every company holds, in tandem with the price pattern graph and the gross margins.

About Us:

Market Research Intellect provides syndicated and customized research reports to clients from various industries and organizations with the aim of delivering functional expertise. We provide reports for all industries including Energy, Technology, Manufacturing and Construction, Chemicals and Materials, Food and Beverage, and more. These reports deliver an in-depth study of the market with industry analysis, the market value for regions and countries, and trends that are pertinent to the industry.

Contact Us:

Mr. Steven Fernandes

Market Research Intellect

New Jersey (USA)

Tel: +1-650-781-4080

See original here:
Topological Quantum Computing Market 2020 Size by Product Analysis, Application, End-Users, Regional Outlook, Competitive Strategies and Forecast to...

Quantum Computing And The End Of Encryption – Hackaday

Quantum computers stand a good chance of changing the face of computing, and that goes double for encryption. For encryption methods that rely on the fact that brute-forcing the key takes too long with classical computers, quantum computing seems like its logical nemesis.

For instance, the mathematical problem that lies at the heart of RSA and other public-key encryption schemes is factoring a product of two prime numbers. Searching for the right pair using classical methods takes approximately forever, but Shor's algorithm can be used on a suitable quantum computer to do the required factorization of integers in almost no time.
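
To make the role of period finding concrete, here is a purely classical toy sketch of the reduction Shor's algorithm uses. The quantum speed-up comes entirely from finding the period r of a^x mod N quickly; the brute-force loop below stands in for that step, and the example numbers are illustrative only:

```python
from math import gcd
from random import randrange

def find_period(a: int, n: int) -> int:
    """Brute-force the order r of a modulo n -- the one step a quantum
    computer performs exponentially faster than this loop."""
    r, value = 1, a % n
    while value != 1:
        value = (value * a) % n
        r += 1
    return r

def shor_factor(n: int) -> tuple[int, int]:
    """Factor n = p * q via the classical reduction from Shor's paper."""
    while True:
        a = randrange(2, n)
        g = gcd(a, n)
        if g > 1:                 # lucky guess already shares a factor
            return g, n // g
        r = find_period(a, n)
        if r % 2:                 # need an even period; try another a
            continue
        x = pow(a, r // 2, n)
        if x == n - 1:            # trivial square root of 1; try again
            continue
        p = gcd(x - 1, n)
        if 1 < p < n:
            return p, n // p

print(shor_factor(15))    # (3, 5)
print(shor_factor(3233))  # (53, 61), a textbook-sized RSA-style modulus
```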

When quantum computers become capable enough, the threat to a lot of our encrypted communication is a real one. If one can no longer rely on simply making the brute-forcing of a decryption computationally heavy, all of today's public-key encryption algorithms are essentially useless. This is the doomsday scenario, but how close are we to this actually happening, and what can be done?

To ascertain the real threat, one has to look at the classical encryption algorithms in use today to see which parts of them would be susceptible to being solved by a quantum algorithm in significantly less time than it would take for a classical computer. In particular, we should make the distinction between symmetric and asymmetric encryption.

With symmetric algorithms, messages are encrypted and decrypted with the same secret key, which has to be shared between communication partners through a secure channel. Asymmetric encryption uses two keys: a private key and a public key. A message encrypted with the public key can only be decrypted with the private key. This enables public-key cryptography: the public key can be shared freely without fear of impersonation because it can only be used to encrypt and not decrypt.
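
As a minimal illustration of the distinction, this sketch uses the third-party Python `cryptography` package (an assumption of this example, not something named in the article) to run one symmetric and one asymmetric round trip:

```python
# pip install cryptography
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# Symmetric: one shared secret key both encrypts and decrypts.
key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ct = AESGCM(key).encrypt(nonce, b"attack at dawn", None)
assert AESGCM(key).decrypt(nonce, ct, None) == b"attack at dawn"

# Asymmetric: the public key encrypts; only the private key decrypts.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
ct = public_key.encrypt(b"attack at dawn", oaep)
assert private_key.decrypt(ct, oaep) == b"attack at dawn"
```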

As mentioned earlier, RSA is one cryptosystem which is vulnerable to quantum algorithms, on account of its reliance on integer factorization. RSA is an asymmetric encryption algorithm, involving a public and a private key, which creates the so-called RSA problem: performing a private-key operation when only the public key is known. This requires finding the e-th roots of an arbitrary number, modulo N, which is currently infeasible to solve classically for RSA key sizes of 1024 bits and above.

Here we see again the thing that makes quantum computing so fascinating: the prospect of quickly solving problems that are intractable for classical computers. Many of these fall in the class NP (non-deterministic polynomial time): problems whose solutions can be verified quickly, even though finding a solution can take impossibly long. For the hardest of them, the NP-complete problems, no efficient exact classical algorithm is known, so classical computers must settle for approximate solutions. An example is the Travelling Salesman Problem (TSP), which asks for the shortest possible route that visits each city in a list exactly once and returns to the origin city.

Even though TSP can be solved exactly with classical computing for smaller numbers of cities (up to tens of thousands), larger instances must be approximated to get within 1% of the optimum, as solving them exactly would require excessively long running times.
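
A short brute-force sketch shows why exact TSP blows up: the number of distinct tours grows factorially, so this toy only stays feasible for roughly a dozen cities (the random coordinates are made up for illustration):

```python
from itertools import permutations
from math import dist
from random import random, seed

seed(1)
cities = [(random(), random()) for _ in range(9)]  # made-up coordinates

def tour_length(order):
    # Sum of leg lengths, wrapping back to the starting city.
    return sum(dist(cities[a], cities[b])
               for a, b in zip(order, order[1:] + (order[0],)))

# Fix city 0 as the start; 8! = 40,320 tours remain. Each added city
# multiplies that count, which is why exact solving quickly becomes
# hopeless and approximation takes over.
best = min(permutations(range(1, len(cities))),
           key=lambda p: tour_length((0,) + p))
print("shortest tour:", (0,) + best,
      "length:", round(tour_length((0,) + best), 3))
```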

Symmetric encryption algorithms are commonly used for live traffic, with only the handshake and the initial establishment of a connection done using (slower) asymmetric encryption as a secure channel for exchanging the symmetric keys. Although symmetric encryption tends to be faster than asymmetric encryption, it relies on both parties having access to the shared secret, instead of being able to use a public key.

Symmetric encryption is commonly combined with forward secrecy (also known as perfect forward secrecy). The idea behind FS is that instead of relying only on the security provided by the initial encrypted channel, one also encrypts the messages before they are sent. This way, even if the keys for the encrypted channel were compromised, all an attacker would end up with is more encrypted messages, each encrypted using a different ephemeral key.

FS tends to use Diffie-Hellman key exchange or similar, resulting in a system comparable to a One-Time Pad (OTP) type of encryption, where each encryption key is used only once. With traditional methods, this means that even after obtaining the private key and cracking a single message, an attacker has to spend the same effort on every other message as on the first one in order to read the entire conversation. This is why many secure chat programs such as Signal, as well as a growing number of HTTPS-enabled servers, use FS.
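
A rough sketch of the ephemeral exchange underlying forward secrecy, again assuming the Python `cryptography` package: both sides generate throwaway X25519 key pairs and derive a per-session symmetric key, so compromising a long-term key later reveals nothing about past sessions:

```python
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

def session_key(my_private, their_public):
    # Derive a 256-bit symmetric key from the shared Diffie-Hellman secret.
    shared = my_private.exchange(their_public)
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"handshake").derive(shared)

# Each session generates fresh (ephemeral) key pairs and discards them after.
alice = X25519PrivateKey.generate()
bob = X25519PrivateKey.generate()
k1 = session_key(alice, bob.public_key())
k2 = session_key(bob, alice.public_key())
assert k1 == k2   # both sides derive the same per-session symmetric key
```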

It was already back in 1996 that Lov Grover came up with Grover's algorithm, which provides a roughly quadratic speed-up for black-box search problems. Specifically, it finds, with high probability, the input to a black box (such as an encryption algorithm) that produced the known output (the encrypted message).

As noted by Daniel J. Bernstein, the creation of quantum computers that can effectively execute Grover's algorithm would necessitate at least the doubling of today's symmetric key lengths. This is in addition to quantum computers breaking RSA, DSA, ECDSA and many other cryptographic systems.
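
The key-doubling advice follows from simple arithmetic: Grover's algorithm needs on the order of the square root of the search space in oracle queries, so a k-bit key retains only about k/2 bits of security against it. A back-of-envelope sketch:

```python
from math import pi, sqrt

for bits in (128, 256):
    n = 2 ** bits
    # Grover needs about (pi/4) * sqrt(N) oracle queries to search N keys.
    grover_queries = (pi / 4) * sqrt(n)
    print(f"{bits}-bit key: ~2^{bits} classical guesses, "
          f"~2^{bits // 2} Grover iterations ({grover_queries:.3e} queries)")
# A 128-bit key drops to roughly 64-bit effective strength; doubling to
# 256 bits restores a comfortable 128-bit margin against Grover.
```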

The observant among us may have noticed that despite some spurious marketing claims over the past years, we are rather short on actual quantum computers today. When it comes to quantum computers that have actually made it out of the laboratory and into a commercial setting, we have quantum annealing systems, with D-Wave being a well-known manufacturer of such systems.

Quantum annealing systems can only tackle a narrow class of optimization problems with a discrete search space, of which the travelling salesman problem is one example. It would, for example, not be possible to run Shor's algorithm on a quantum annealing system. Adiabatic quantum computation is closely related to quantum annealing and therefore equally unsuitable as a general-purpose quantum computing system.

This leaves today's quantum computing research mostly in the realm of simulations, and classical encryption mostly secure (for now).

When can we expect to see quantum computers that can decrypt every single one of our communications with nary any effort? This is a tricky question. Much of it hinges on when we can get a significant number of quantum bits, or qubits, together into something like a quantum circuit model with sufficient error correction to make the results anywhere near as reliable as those of classical computers.

At this point in time one could say that we are still trying to figure out what the basic elements of a quantum computer will look like. This has led to the following quantum computing models:

- The quantum circuit (gate) model
- The measurement-based (one-way) quantum computer
- The adiabatic quantum computer, including quantum annealing
- The topological quantum computer

Of these four models, only quantum annealing has been implemented and commercialized. The others have seen many physical realizations in laboratory settings but aren't up to scale yet. In many ways it isn't dissimilar to the situation classical computers found themselves in throughout the 19th and early 20th centuries, when successive computers moved from mechanical systems to relays and valves, followed by discrete transistors and ultimately (for now) countless transistors integrated into single chips.

It was the discovery of semiconducting materials and new production processes that allowed classical computers to flourish. For quantum computing, the question appears to be mostly a matter of when we'll manage to do the same.

Even if, a decade or more from now, the quantum computing revolution suddenly makes our triple-strength, military-grade encryption look as robust as DES does today, we can comfort ourselves with the knowledge that along with quantum computing we are also learning more and more about quantum cryptography.

In many ways quantum cryptography is even more exciting than classical cryptography, as it can exploit quantum mechanical properties. Best known is quantum key distribution (QKD), which uses quantum communication to establish a shared key between two parties. The fascinating property of QKD is that the mere act of listening in on this communication causes measurable changes. Essentially, this provides unconditional security in distributing symmetric key material, and symmetric encryption is significantly more quantum-resistant than asymmetric encryption.
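
A heavily simplified, classical toy simulation (loosely modeled on the BB84 protocol, which the article does not name) can illustrate the eavesdropping property: an interceptor who measures each qubit in a random basis raises the error rate on the bits where sender and receiver happened to agree, from about 0% to about 25%:

```python
import random

def bb84_error_rate(n_bits=4000, eve_present=False):
    errors = matched = 0
    for _ in range(n_bits):
        alice_bit = random.randint(0, 1)
        alice_basis = random.choice("+x")
        in_flight = alice_bit
        # Eve measures every qubit; a wrong-basis measurement disturbs it,
        # leaving a bit that is now effectively random in Alice's basis.
        if eve_present and random.choice("+x") != alice_basis:
            in_flight = random.randint(0, 1)
        bob_basis = random.choice("+x")
        if bob_basis == alice_basis:      # only matched-basis bits are kept
            matched += 1
            errors += (in_flight != alice_bit)
    return errors / matched

print(f"error rate without Eve: {bb84_error_rate():.1%}")               # ~0%
print(f"error rate with Eve: {bb84_error_rate(eve_present=True):.1%}")  # ~25%
```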

All of this means that even if the coming decades are likely to bring some form of upheaval that may or may not mean the end of classical computing and cryptography with it, not all is lost. As usual, science and technology with it will progress, and future generations will look back on today's primitive technology with some level of puzzlement.

For now, using TLS 1.3 and any other protocols that support forward secrecy, and symmetric encryption in general, is your best bet.

Read the original:
Quantum Computing And The End Of Encryption - Hackaday

Better encryption for wireless privacy at the dawn of quantum computing – UC Riverside

Wireless communications are essential for the widest possible mobile Internet coverage, but the open nature of wireless transmissions makes information security a uniquely challenging issue. The widely deployed methods for information security are based on digital encryption, which in turn requires two or more legitimate parties to share a secret key.

The distribution of a secret key via zero-distance physical contact is inconvenient in general and impossible in situations where too little time is available. The conventional solution to this challenge is to use the public-key infrastructure, or PKI, for secret key distribution. Yet PKI is based on computational hardness assumptions, such as the hardness of factoring, which are increasingly threatened by quantum computing. Some predictions suggest that such a threat could become a reality within 15 years.

In order to provide Internet coverage for every possible spot on the planet, such as remote islands and mountains, low-orbiting satellite communication networks are rapidly being developed. A satellite can transmit or receive streams of digital information to or from terrestrial stations, but the geographical exposure of these streams is large, making them prone to eavesdropping. For applications such as satellite communications, how can we guarantee information security even if quantum computers become readily available in the near future?

Yingbo Hua's Lab of Signals, Systems and Networks in the Department of Electrical and Computer Engineering, which has been supported in part by the Army, aims to develop reliable and secure transmission, or RESET, schemes for future wireless networks. RESET guarantees that secret information is not only received reliably by the legitimate receiver but also kept secure from an eavesdropper with any degree of channel superiority.

In particular, Hua's lab has developed a physical layer encryption method that could be immune to the threat of quantum computing. The lab is actively engaged in further research on this and other related methods.

In the physical layer encryption proposed by Hua's lab, only partial information is extracted from randomized matrices, such as the principal singular vector of each matrix, modulated by a secret physical feature approximately shared by the legitimate parties. The principal singular vector of a matrix is not a reversible function of the matrix. This seems to suggest that even a quantum computer would be unable to undo a computation that is rather easy to perform on a classical computer. If this is true, then the physical layer encryption should be immune to attacks via quantum computing. Unlike the number-theory-based encryption methods, which are vulnerable to quantum attacks, Hua's physical layer encryption is based on continuous encryption functions, which are still under development.
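
A small numpy sketch (illustrative only, not the lab's actual scheme) shows the non-invertibility claim: many different matrices share the same principal singular vector, so the vector alone cannot identify the matrix it came from:

```python
import numpy as np

rng = np.random.default_rng(0)

def principal_singular_vector(M):
    # Right singular vector belonging to the largest singular value.
    return np.linalg.svd(M)[2][0]

A = rng.standard_normal((4, 4))
v = principal_singular_vector(A)

# Scaling A (or any change that preserves the dominant direction)
# yields the same principal singular vector, up to sign:
B = 3.0 * A
w = principal_singular_vector(B)
print(np.allclose(np.abs(w), np.abs(v)))  # True: B != A, same vector
```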

Read the original here:
Better encryption for wireless privacy at the dawn of quantum computing - UC Riverside

What’s New in HPC Research: Astronomy, Weather, Security & More – HPCwire

In this bimonthly feature, HPCwire highlights newly published research in the high-performance computing community and related domains. From parallel programming to exascale to quantum computing, the details are here.

Developing the HPC system for the ASKAP telescope

The Australian Square Kilometre Array Pathfinder (ASKAP) telescope (itself a pilot project for the record-setting Square Kilometre Array planned for construction in the coming years) will enable highly sensitive radio astronomy that produces a tremendous amount of data. In this paper, researchers from the Commonwealth Scientific and Industrial Research Organisation (CSIRO) highlight how they are preparing a dedicated HPC platform, called ASKAPsoft, to handle the expected 5 PB/year of data produced by ASKAP.

Authors: Juan C. Guzman, Eric Bastholm, Wasim Raja, Matthew Whiting, Daniel Mitchell, Stephen Ord and Max Voronkov.

Creating an open infrastructure for sharing and reusing HPC knowledge

In an expert field like HPC, institutional memory and information-sharing are crucial for maintaining and building on expertise, but institutions often lack cohesive infrastructures to perpetuate that knowledge. These authors, a team from North Carolina State University and Lawrence Livermore National Laboratory, introduce OpenK, an open, ontology-based infrastructure aimed at facilitating the accumulation, sharing and reuse of HPC knowledge.

Authors: Yue Zhao, Xipeng Shen and Chunhua Liao.

Using high-performance data analysis to facilitate HPC-powered astrophysics

High-performance data analysis (HPDA) is an emerging tool for scientific disciplines like bioscience, climate science and security, and now it is being used to prepare astrophysics research for exascale. In this paper, written by a team from the Astronomical Observatory of Trieste, Italy, the authors discuss the ExaNeSt and EuroExa projects, which built a prototype of a low-power exascale facility for HPDA and astrophysics.

Authors: Giuliano Taffoni, David Goz, Luca Tornatore, Marco Frailis, Gianmarco Maggio and Fabio Pasian.

Using power analysis to identify HPC activity

"Monitoring users on large computing platforms such as [HPC] and cloud computing systems," write these authors, a duo from Lawrence Berkeley National Laboratory, "is non-trivial." Users can (and have) abused access to HPC systems, they say, but process viewers and other monitoring tools can impose substantial overhead. To that end, they introduce a technique for identifying running programs with 97% accuracy using just the system's power consumption.

Authors: Bogdan Copos and Sean Peisert.

Building resilience and fault tolerance in HPC for numerical weather and climate prediction

In numerical weather and climate prediction (NWP), accuracy depends strongly on available computing power, but the increasing number of cores in top systems is leading to a higher frequency of hardware and software failures for NWP simulations. This report (from researchers at eight different institutions) examines approaches for fault tolerance in numerical algorithms and system resilience in parallel simulations for those NWP tools.

Authors: Tommaso Benacchio, Luca Bonaventura, Mirco Altenbernd, Chris D. Cantwell, Peter D. Düben, Mike Gillard, Luc Giraud, Dominik Göddeke, Erwan Raffin, Keita Teranishi and Nils Wedi.

Pioneering the exascale era with astronomy

Another team, this time from SURF, a collaborative organization for Dutch research, also investigated the intersection of astronomy and the exascale era. This paper, written by three researchers from SURF, highlights a new, OpenStack-based cloud infrastructure layer and Spider, a new addition to SURF's high-throughput data processing platform. The authors explore how these additions help prepare the astronomical research community for the exascale era, in particular with regard to data-intensive experiments like the Square Kilometre Array.

Authors: J. B. R. Oonk, C. Schrijvers and Y. van den Berg.

Enabling EASEY deployment of containerized applications for future HPC systems

As the exascale era approaches, HPC systems are growing in complexity, improving performance but making the systems less accessible for new users. These authors, a duo from the Ludwig Maximilian University of Munich, propose a support framework for these future HPC architectures called EASEY (for Enable exAScale for EverYone) that can "automatically deploy optimized container computations with negligible overhead[.]"

Authors: Maximilian Höb and Dieter Kranzlmüller.

Do you know about research that should be included in next month's list? If so, send us an email at [email protected]. We look forward to hearing from you.

Original post:
What's New in HPC Research: Astronomy, Weather, Security & More - HPCwire

House Introduces the Advancing Quantum Computing Act – Lexology

On May 19, 2020, Representative Morgan Griffith (R-VA-9) introduced the Advancing Quantum Computing Act (AQCA), which would require the Secretary of Commerce to conduct a study on quantum computing. "We can't depend on other countries . . . to guarantee American economic leadership, shield our stockpile of critical supplies, or secure the benefits of technological progress to our people," Representative Griffith explained. "It is up to us to do that."

Quantum computers use the science underlying quantum mechanics to store data and perform computations. The properties of quantum mechanics are expected to enable such computers to outperform traditional computers on a multitude of metrics. As such, there are many promising applications, from simulating the behavior of matter to accelerating the development of artificial intelligence. Several companies have started exploring the use of quantum computing to develop new drugs, improve the performance of batteries, and optimize transit routing to minimize congestion.

In addition to the National Quantum Initiative Act passed in 2018, the introduction of the AQCA represents another important, albeit preliminary, step for Congress in helping to shape the growth and development of quantum computing in the United States. It signals Congress's continuing interest in developing a national strategy for the technology.

Overall, the AQCA would require the Secretary of Commerce to conduct studies spanning four categories related to the impact of quantum computing.

Original post:
House Introduces the Advancing Quantum Computing Act - Lexology

Russian Scientist Gets Award For Breakthrough Research In The Development Of Quantum Computers – Modern Ghana

St. Petersburg State University professor Alexey Kavokin has received the international Quantum Devices Award in recognition of his breakthrough research in the development of quantum computers. Professor Kavokin is the first Russian scientist to be awarded this honorary distinction.

Alexey Kavokin's scientific effort has contributed to the creation of polariton lasers that consume several times less energy than conventional semiconductor lasers. Most importantly, polariton lasers can eventually set the stage for the development of qubits, the basic elements of future quantum computers. These technologies contribute significantly to the development of quantum computing systems.

The Russian scientist's success stems from the fact that the Russian Federation is presently a world leader in polaritonics, a field of science that deals with light-matter quasiparticles, or "liquid light."

"Polaritonics is the electronics of the future," Alexey Kavokin says. "Developed on the basis of liquid light, polariton lasers can put our country ahead of the whole world in the quantum technologies race. Replacing the electric current with light in computer processors alone can save billions of dollars by reducing heat loss during information transfer."

The talented physicist believes that while US giants such as Google and IBM are investing heavily in quantum technologies based on superconductors, Russian scientists are pursuing a much cheaper and potentially more promising path: developing a polariton platform for quantum computing.

Alexey Kavokin heads the Igor Uraltsev Spin Optics Laboratory at St. Petersburg State University, funded by a mega-grant provided by the Russian government. He is also head of the Quantum Polaritonics group at the Russian Quantum Center. Alexey Kavokin is Professor at the University of Southampton (England), where he heads the Department of Nanophysics and Photonics. He is Scientific Director of the Mediterranean Institute of Fundamental Physics (Italy). In 2018, he headed the International Center for Polaritonics at Westlake University in Hangzhou, China.

The Quantum Devices Award was founded in 2000 to recognize innovative contributions to the field of compound semiconductor devices and devices with quantum nanostructures. It is funded by the Japanese section of the steering committee of the International Symposium on Compound Semiconductors (ISCS). The Quantum Devices Award was previously conferred on scientists from Japan, Switzerland, Germany, and other countries, but this is the first time the award has been received by a scientist from Russia.

Due to the coronavirus pandemic, it was decided that the award presentation will be held next year in Sweden.

Read more:
Russian Scientist Gets Award For Breakthrough Research In The Development Of Quantum Computers - Modern Ghana

WISeKey is Adapting its R&D and Extended Patents Portfolio to the Post-COVID 19 Economy with Specific Focus on Post-Quantum Cryptography -…

With more than 25% of its 2019 annual turnover invested in R&D, WISeKey is a significant and recognized contributor to digital trust in an interconnected world. The Company's recent publication and conference presentation about post-quantum cryptography illustrate once again that innovation is at the heart of the Company.

"WISeKey is involved in this NIST PQC (Post-Quantum Cryptography) program with the sole objective of providing future-proof digital security solutions based on existing and new hardware architectures."

Geneva, Switzerland, May 28, 2020: WISeKey International Holding Ltd. ("WISeKey") (SIX: WIHN, NASDAQ: WKEY), a leading global cybersecurity and IoT company, today published a technical article (https://www.wisekey.com/articles-white-papers/) discussing how to guarantee digital security and protect against hackers who will take advantage of the power of quantum information science. This research was presented (video here: https://www.wisekey.com/videos/) during the remote International Workshop on Code-Based Cryptography (CBCrypto 2020, Zagreb, Croatia, May 9-10, 2020).

IoT products are a major component of the fourth industrial revolution, which brings together advances in computational power, semiconductors, blockchain, wireless communication, AI and data to build a vast technology infrastructure that works nearly autonomously.

According to a recent report published by Fortune Business Insights, titled "Internet of Things (IoT) Market Size, Share and Industry Analysis By Platform (Device Management, Application Management, Network Management), By Software & Services (Software Solution, Services), By End-Use Industry (BFSI, Retail, Governments, Healthcare, Others) And Regional Forecast, 2019-2026," the IoT market was valued at USD 190.0 billion in 2018 and is projected to reach USD 1,102.6 billion by 2026, at a CAGR of 24.7% over the forecast period. Huge advances in manufacturing have allowed even small manufacturers to produce relatively sophisticated IoT products. This brings to the surface issues related to patents governing IoT products and communication standards governing devices.

Studies of quantum computing, namely how to use quantum mechanical phenomena to perform computation, began in the early 1980s. The prospects are endless: future computers using this technology will gain incredible computing power. In the hands of hackers, however, such computers become a risk to cybersecurity, since all the cryptographic algorithms used today to secure our digital world would be exposed. This is why the US National Institute of Standards and Technology (NIST) launched a wide campaign in 2016 to find new, quantum-resistant algorithms.

WISeKey's R&D department is deeply involved in this NIST PQC (Post-Quantum Cryptography) program, with the sole objective of providing the market with future-proof digital security solutions based on existing and new hardware architectures. The new article reports one of the Company's current contributions to this safer cyber future: ROLLO-I, a NIST-shortlisted algorithm, was implemented on some of WISeKey's secure chips (MS600x secure microcontrollers, VaultIC secure elements, …) with countermeasures to make them robust against attacks.

Although nobody knows exactly when quantum computers will become massively available, it is certainly going to happen. WISeKey is investing significantly to develop new technologies and win this race.

"With a rich portfolio of more than 100 fundamental individual patents and 20 pending ones in various domains, including the design of secure chips, Near Field Communication (NFC), the development of security firmware and backend software, the secure management of data, the improvement of security protocols between connected objects and advanced cryptography, to mention a few, WISeKey has become a key technology provider in the cybersecurity arena," says Carlos Moreira, Founder and CEO of WISeKey. "This precious asset makes WISeKey the right Digital Trust Partner to deploy the current and future Internet of Everything."

Want to know more about WISeKey's intellectual property? Please visit our website: https://www.wisekey.com/patents/.

About WISeKey

WISeKey (NASDAQ: WKEY; SIX Swiss Exchange: WIHN) is a leading global cybersecurity company currently deploying large-scale digital identity ecosystems for people and objects using Blockchain, AI and IoT, respecting the Human as the Fulcrum of the Internet. WISeKey microprocessors secure the pervasive computing shaping today's Internet of Everything. WISeKey IoT has an install base of over 1.5 billion microchips in virtually all IoT sectors (connected cars, smart cities, drones, agricultural sensors, anti-counterfeiting, smart lighting, servers, computers, mobile phones, crypto tokens, etc.). WISeKey is uniquely positioned at the edge of IoT, as our semiconductors produce a huge amount of Big Data that, when analyzed with Artificial Intelligence (AI), can help industrial applications predict the failure of their equipment before it happens.

Our technology is trusted by the OISTE/WISeKey Swiss-based cryptographic Root of Trust (RoT), which provides secure authentication and identification, in both physical and virtual environments, for the Internet of Things, Blockchain and Artificial Intelligence. The WISeKey RoT serves as a common trust anchor to ensure the integrity of online transactions among objects and between objects and people. For more information, visit www.wisekey.com.

Press and investor contacts:

Disclaimer: This communication expressly or implicitly contains certain forward-looking statements concerning WISeKey International Holding Ltd and its business. Such statements involve certain known and unknown risks, uncertainties and other factors, which could cause the actual results, financial condition, performance or achievements of WISeKey International Holding Ltd to be materially different from any future results, performance or achievements expressed or implied by such forward-looking statements. WISeKey International Holding Ltd is providing this communication as of this date and does not undertake to update any forward-looking statements contained herein as a result of new information, future events or otherwise. This press release does not constitute an offer to sell, or a solicitation of an offer to buy, any securities, and it does not constitute an offering prospectus within the meaning of article 652a or article 1156 of the Swiss Code of Obligations or a listing prospectus within the meaning of the listing rules of the SIX Swiss Exchange. Investors must rely on their own evaluation of WISeKey and its securities, including the merits and risks involved. Nothing contained herein is, or shall be relied on as, a promise or representation as to the future performance of WISeKey.

Originally posted here:
WISeKey is Adapting its R&D and Extended Patents Portfolio to the Post-COVID 19 Economy with Specific Focus on Post-Quantum Cryptography -...

Total Partners with CQC to Improve CO2 Capture – Energy Industry Review

Total is stepping up its research into Carbon Capture, Utilization and Storage (CCUS) technologies by signing a multi-year partnership with UK start-up Cambridge Quantum Computing (CQC). This partnership aims to develop new quantum algorithms to improve materials for CO2 capture. Total's ambition is to be a major player in CCUS, and the Group currently invests up to 10% of its annual research and development effort in this area.

To improve the capture of CO2, Total is working on nanoporous materials called adsorbents, considered to be among the most promising solutions. These materials could eventually be used to trap the CO2 emitted by the Group's industrial operations or those of other players (cement, steel, etc.). The CO2 recovered would then be concentrated and reused or stored permanently. These materials could also be used to capture CO2 directly from the air (Direct Air Capture, or DAC).

The quantum algorithms which will be developed in the collaboration between Total and CQC will simulate all the physical and chemical mechanisms in these adsorbents as a function of their size, shape and chemical composition, and therefore make it possible to select the most efficient materials to develop. Currently, such simulations are impossible to perform with a conventional supercomputer, which justifies the use of quantum calculations.

"Total is very pleased to be launching this new collaboration with Cambridge Quantum Computing: quantum computing opens up new possibilities for solving extremely complex problems. We are therefore among the first to use quantum computing in our research to design new materials capable of capturing CO2 more efficiently. In this way, Total intends to accelerate the development of the CCUS technologies that are essential to achieve carbon neutrality in 2050," said Marie-Noëlle Semeria, Total's CTO.

"We are very excited to be working with Total, a demonstrated thought-leader in CCUS technology. Carbon neutrality is one of the most significant topics of our time and incredibly important to the future of the planet. Total has a proven long-term commitment to CCUS solutions. We are hopeful that our work will lead to meaningful contributions and an acceleration on the path to carbon neutrality," said Ilyas Khan, CEO of CQC.

Total is deploying an ambitious R&D programme, worth nearly USD 1 billion a year. Total R&D relies on a network of more than 4,300 employees in 18 research centres around the world, as well as on numerous partnerships with universities, start-ups and industrial companies. Its investments are mainly devoted to a low-carbon energy mix (40%) as well as to digital, safety and the environment, operational efficiency and new products. It files more than 200 patents every year.

Original post:
Total Partners with CQC to Improve CO2 Capture - Energy Industry Review

Total partners with Cambridge Quantum Computing on CO2 capture – Green Car Congress

Total is stepping up its research into Carbon Capture, Utilization and Storage (CCUS) technologies by signing a multi-year partnership with UK start-up Cambridge Quantum Computing (CQC). This partnership aims to develop new quantum algorithms to improve materials for CO2 capture.

Total's ambition is to be a major player in CCUS, and the Group currently invests up to 10% of its annual research and development effort in this area.

To improve the capture of CO2, Total is working on nanoporous adsorbents, considered to be among the most promising solutions. These materials could eventually be used to trap the CO2 emitted by the Group's industrial operations or those of other players (cement, steel, etc.). The CO2 recovered would then be concentrated and reused or stored permanently. These materials could also be used to capture CO2 directly from the air (Direct Air Capture, or DAC).

The quantum algorithms which will be developed in the collaboration between Total and CQC will simulate all the physical and chemical mechanisms in these adsorbents as a function of their size, shape and chemical composition, and therefore make it possible to select the most efficient materials to develop.

Currently, such simulations are impossible to perform with a conventional supercomputer, which justifies the use of quantum calculations.

Go here to see the original:
Total partners with Cambridge Quantum Computing on CO2 capture - Green Car Congress

Archer in trading halt pending material agreement over quantum computing tech – Stockhead

Super-diversified quantum computing/health tech/battery metals play Archer Materials (ASX:AXE) is in a trading halt as it finalises a material agreement over its 12CQ quantum computing chip technology.

Globally, the race is on to develop quantum computers, which will operate at speeds eclipsing those of classic computers.

The nascent, rapidly growing quantum computing sector has the potential to impact a lot of sectors, offering potential solutions to complex computation, cryptography and simulation problems.

In late 2019, Tractica predicted that total quantum computing market revenue will reach $US9.1 billion ($14.06 billion) annually by 2030, up from $US111.6 million in 2018.

READ: What the heck is quantum computing and is it worth investing in?

But data is stored in qubits (as a classical computer's data is stored in bits), and many quantum computers require their qubits to be cooled to nearly absolute zero to prevent errors occurring.

This is where Archer's tech comes in: it is developing a quantum computer chip that, if successful, will allow quantum computers to be mobile and operate at room temperature.

During the March quarter, Archer kicked off the next stage of the development of its 12CQ project focussed on completing the quantum measurements required to build a working chip prototype.

Archer will remain in a trading halt until the earlier of the material announcement to the market or the commencement of trade on Tuesday, 5 May.

NOW READ: 5 tech trends we'll see more of in 2020 & the small caps that are front and centre

Read the original here:
Archer in trading halt pending material agreement over quantum computing tech - Stockhead

Physicists Criticize Stephen Wolfram’s ‘Theory of Everything’ – Scientific American

Stephen Wolfram blames himself for not changing the face of physics sooner.

"I do fault myself for not having done this 20 years ago," the physicist turned software entrepreneur says. "To be fair, I also fault some people in the physics community for trying to prevent it happening 20 years ago. They were successful." Back in 2002, after years of labor, Wolfram self-published A New Kind of Science, a 1,200-page magnum opus detailing the general idea that nature runs on ultrasimple computational rules. The book was an instant best seller and received glowing reviews: the New York Times called it "a first-class intellectual thrill." But Wolfram's arguments found few converts among scientists. Their work carried on, and he went back to running his software company, Wolfram Research. And that is where things remained, until last month, when, accompanied by breathless press coverage (and a 448-page preprint paper), Wolfram announced a possible path to the fundamental theory of physics based on his unconventional ideas. Once again, physicists are unconvinced, in no small part, they say, because existing theories do a better job than his model.

At its heart, Wolfram's new approach is a computational picture of the cosmos: one where the fundamental rules that the universe obeys resemble lines of computer code. This code acts on a graph, a network of points with connections between them, that grows and changes as the digital logic of the code clicks forward, one step at a time. According to Wolfram, this graph is the fundamental stuff of the universe. From the humble beginning of a small graph and a short set of rules, fabulously complex structures can rapidly appear. "Even when the underlying rules for a system are extremely simple, the behavior of the system as a whole can be essentially arbitrarily rich and complex," he wrote in a blog post summarizing the idea. "And this got me thinking: Could the universe work this way?" Wolfram and his collaborator Jonathan Gorard, a physics Ph.D. candidate at the University of Cambridge and a consultant at Wolfram Research, found that this kind of model could reproduce some aspects of quantum theory and Einstein's general theory of relativity, the two fundamental pillars of modern physics.
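
As a flavor of what such a model looks like in practice, here is a toy sketch (an illustration in the spirit of the description above, not Wolfram and Gorard's actual rules) in which one simple edge-rewriting rule makes a graph grow exponentially:

```python
def apply_rule(edges, fresh):
    """One step of the rule {{x, y}} -> {{x, y}, {y, z}}: every edge
    keeps itself and spawns a new edge to a freshly created node z."""
    out = []
    for x, y in edges:
        out.append((x, y))
        out.append((y, fresh))
        fresh += 1
    return out, fresh

edges, fresh = [(0, 1)], 2          # humble beginning: one edge, two nodes
for step in range(6):
    edges, fresh = apply_rule(edges, fresh)
    print(f"step {step + 1}: {len(edges)} edges, {fresh} nodes")
# One trivial rule already yields exponential growth (64 edges by step 6).
```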

But Wolfram's model's ability to incorporate currently accepted physics is not necessarily that impressive. "It's this sort of infinitely flexible philosophy where, regardless of what anyone said was true about physics, they could then assert, 'Oh, yeah, you could graft something like that onto our model,'" says Scott Aaronson, a quantum computer scientist at the University of Texas at Austin.

When asked about such criticisms, Gorard agrees, to a point. "We're just kind of fitting things," he says. "But we're only doing that so we can actually go and do a systematized search" for specific rules that fit those of our universe.

Wolfram and Gorard have not yet found any computational rules meeting those requirements, however. And without those rules, they cannot make any definite, concrete new predictions that could be experimentally tested. Indeed, according to critics, Wolfram's model has yet to reproduce even the most basic quantitative predictions of conventional physics. "The experimental predictions of [quantum physics and general relativity] have been confirmed to many decimal places, in some cases, to a precision of one part in [10 billion]," says Daniel Harlow, a physicist at the Massachusetts Institute of Technology. "So far I see no indication that this could be done using the simple kinds of [computational rules] advocated by Wolfram. The successes he claims are, at best, qualitative." Further, even that qualitative success is limited: there are crucial features of modern physics missing from the model. And the parts of physics that it can qualitatively reproduce are mostly there because Wolfram and his colleagues put them in to begin with. This arrangement is akin to announcing, "If we suppose that a rabbit was coming out of the hat, then remarkably, this rabbit would be coming out of the hat," Aaronson says. "And then [going] on and on about how remarkable it is."

Unsurprisingly, Wolfram disagrees. He claims that his model has replicated most of fundamental physics already. "From an extremely simple model, we're able to reproduce special relativity, general relativity and the core results of quantum mechanics," he says, "which, of course, are what have led to so many precise quantitative predictions of physics over the past century."

Even Wolfram's critics acknowledge he is right about at least one thing: it is genuinely interesting that simple computational rules can lead to such complex phenomena. But, they hasten to add, that is hardly an original discovery. "The idea goes back long before Wolfram," Harlow says. He cites the work of computing pioneers Alan Turing in the 1930s and John von Neumann in the 1950s, as well as that of mathematician John Conway in the early 1970s. (Conway, a professor at Princeton University, died of COVID-19 last month.) To the contrary, Wolfram insists that he was the first to discover, in the 1980s, that virtually boundless complexity could arise from simple rules. "John von Neumann, he absolutely didn't see this," Wolfram says. "John Conway, same thing."

Born in London in 1959, Wolfram was a child prodigy who studied at Eton College and the University of Oxford before earning a Ph.D. in theoretical physics at the California Institute of Technology in 1979, at the age of 20. After his Ph.D., Caltech promptly hired Wolfram to work alongside his mentors, including physicist Richard Feynman. "I don't know of any others in this field that have the wide range of understanding of Dr. Wolfram," Feynman wrote in a letter recommending him for the first ever round of MacArthur "genius" grants in 1981. "He seems to have worked on everything and has some original or careful judgement on any topic." Wolfram won the grant, at age 21, making him among the youngest ever to receive the award, and became a faculty member at Caltech and then a long-term member at the Institute for Advanced Study in Princeton, N.J. While at the latter, he became interested in simple computational systems and then moved to the University of Illinois in 1986 to start a research center to study the emergence of complex phenomena. In 1987 he founded Wolfram Research, and shortly after he left academia altogether. The software company's flagship product, Mathematica, is a powerful and impressive piece of mathematics software that has sold millions of copies and is today nearly ubiquitous in physics and mathematics departments worldwide.

Then, in the 1990s, Wolfram decided to go back to scientific research, but without the support and input provided by a traditional research environment. By his own account, he sequestered himself for about a decade, putting together what would eventually become A New Kind of Science with the assistance of a small army of his employees.

Upon the release of the book, the media was ensorcelled by the romantic image of the heroic outsider returning from the wilderness to single-handedly change all of science. Wired dubbed Wolfram "the man who cracked the code to everything" on its cover. "Wolfram has earned some bragging rights," the New York Times proclaimed. "No one has contributed more seminally to this new way of thinking about the world." Yet then, as now, researchers largely ignored and derided his work. "There's a tradition of scientists approaching senility to come up with grand, improbable theories," the late physicist Freeman Dyson told Newsweek back in 2002. "Wolfram is unusual in that he's doing this in his 40s."

Wolfram's story is exactly the sort that many people want to hear, because it matches the familiar beats of dramatic tales from science history that they already know: the lone genius (usually white and male), laboring in obscurity and rejected by the establishment, emerges from isolation, triumphantly grasping a piece of the Truth. But that is rarely, if ever, how scientific discovery actually unfolds. There are examples from the history of science that superficially fit this image: think of Albert Einstein toiling away on relativity as an obscure Swiss patent clerk at the turn of the 20th century. Or, for a more recent example, consider mathematician Andrew Wiles working in his attic for years to prove Fermat's last theorem before finally announcing his success in 1995. But portraying those discoveries as the work of a solo genius, romantic as it is, belies the real working process of science. Science is a group effort. Einstein was in close contact with researchers of his day, and Wiles's work followed a path laid out by other mathematicians just a few years before he got started. Both of them were active, regular participants in the wider scientific community. And even so, they remain exceptions to the rule. Most major scientific breakthroughs are far more collaborative: quantum physics, for example, was developed slowly over a quarter-century by dozens of physicists around the world.

"I think the popular notion that physicists are all in search of the eureka moment in which they will discover the theory of everything is an unfortunate one," says Katie Mack, a cosmologist at North Carolina State University. "We do want to find better, more complete theories. But the way we go about that is to test and refine our models, look for inconsistencies and incrementally work our way toward better, more complete models."

Most scientists would readily tell you that their discipline is, and always has been, a collaborative, communal process. Nobody can revolutionize a scientific field without first getting the critical appraisal and eventual validation of their peers. Today this requirement is fulfilled through peer review, a process Wolfram's critics say he has circumvented with his announcement. "Certainly there's no reason that Wolfram and his colleagues should be able to bypass formal peer review," Mack says. "And they definitely have a much better chance of getting useful feedback from the physics community if they publish their results in a format we actually have the tools to deal with."

Mack is not alone in her concerns. "It's hard to expect physicists to comb through hundreds of pages of a new theory out of the blue, with no buildup in the form of papers, seminars and conference presentations," says Sean Carroll, a physicist at Caltech. "Personally, I feel it would be more effective to write short papers addressing specific problems with this kind of approach rather than proclaiming a breakthrough without much vetting."

So why did Wolfram announce his ideas this way? Why not go the traditional route? "I don't really believe in anonymous peer review," he says. "I think it's corrupt. It's all a giant story of somewhat corrupt gaming, I would say. I think it's sort of inevitable that happens with these very large systems. It's a pity."

So what are Wolfram's goals? He says he wants the attention and feedback of the physics community. But his unconventional approach, soliciting public comments on an exceedingly long paper, almost ensures it shall remain obscure. Wolfram says he wants physicists' respect. The ones consulted for this story said gaining it would require him to recognize and engage with the prior work of others in the scientific community.

And when provided with some of the responses from other physicists regarding his work, Wolfram is singularly unenthused. "I'm disappointed by the naivete of the questions that you're communicating," he grumbles. "I deserve better."

The rest is here:
Physicists Criticize Stephen Wolfram's 'Theory of Everything' - Scientific American

QUANTUM COMPUTING INC. : Entry into a Material Definitive Agreement, Creation of a Direct Financial Obligation or an Obligation under an Off-Balance…

Item 1.01 Entry into a Material Definitive Agreement.

On May 6, 2020, Quantum Computing Inc. (the "Company") executed an unsecured promissory note (the "Note") with BB&T/Truist Bank N.A. to evidence a loan to the Company in the amount of $218,371 (the "Loan") under the Paycheck Protection Program (the "PPP") established under the Coronavirus Aid, Relief, and Economic Security Act (the "CARES Act"), administered by the U.S. Small Business Administration (the "SBA").

In accordance with the requirements of the CARES Act, the Company expects to use the proceeds from the Loan exclusively for qualified expenses under the PPP, including payroll costs, mortgage interest, rent and utility costs. Interest will accrue on the outstanding balance of the Note at a rate of 1.00% per annum. The Company expects to apply for forgiveness of up to the entire amount of the Note. Notwithstanding the Company's eligibility to apply for forgiveness, no assurance can be given that the Company will obtain forgiveness of all or any portion of the amounts due under the Note. The amount of forgiveness under the Note is calculated in accordance with the requirements of the PPP, including the provisions of Section 1106 of the CARES Act, subject to limitations and ongoing rule-making by the SBA and the maintenance of employee and compensation levels.

Subject to any forgiveness granted under the PPP, the Note is scheduled to mature two years from the date of first disbursement under the Note. The Note may be prepaid at any time prior to maturity with no prepayment penalties. The Note provides for customary events of default, including, among others, those relating to failure to make payments, bankruptcy, and significant changes in ownership. The occurrence of an event of default may result in the required immediate repayment of all amounts outstanding and/or filing suit and obtaining judgment against the Company. The Company's obligations under the Note are not secured by any collateral or personal guarantees.

Item 2.03 Creation of a Direct Financial Obligation or an Obligation under an Off-Balance Sheet Arrangement of a Registrant.

The discussion of the Loan set forth in Item 1.01 of this Current Report on Form 8-K is incorporated into this Item 2.03 by reference.

Item 9.01. Financial Statements and Exhibits.

Edgar Online, source Glimpses

Visit link:
QUANTUM COMPUTING INC. : Entry into a Material Definitive Agreement, Creation of a Direct Financial Obligation or an Obligation under an Off-Balance...

Wiring the Quantum Computer of the Future: Researchers from Japan and Australia propose a novel 2D design – QS WOW News

The basic units of a quantum computer can be rearranged in 2D to solve typical design and operation challenges. Efficient quantum computing is expected to enable advancements that are impossible with classical computers. Scientists from the Tokyo University of Science, Japan, the RIKEN Centre for Emergent Matter Science, Japan, and the University of Technology, Sydney, have collaborated to propose a novel two-dimensional design that can be constructed using existing integrated circuit technology. This design solves typical problems facing current three-dimensional packaging for scaled-up quantum computers, bringing the future one step closer.

Quantum computing is increasingly becoming the focus of scientists in fields such as physics and chemistry, and of industrialists in the pharmaceutical, airplane, and automobile industries. Globally, research labs at companies like Google and IBM are spending extensive resources on improving quantum computers, and with good reason. Quantum computers use the fundamentals of quantum mechanics to process significantly greater amounts of information much faster than classical computers. It is expected that when error-corrected, fault-tolerant quantum computation is achieved, scientific and technological advancement will occur at an unprecedented scale.

But building quantum computers for large-scale computation is proving to be a challenge in terms of their architecture. The basic units of a quantum computer are the quantum bits, or qubits. These are typically atoms, ions, photons, subatomic particles such as electrons, or even larger elements that simultaneously exist in multiple states, making it possible to obtain several potential outcomes rapidly for large volumes of data. The theoretical requirement for quantum computers is that these are arranged in two-dimensional (2D) arrays, where each qubit is both coupled with its nearest neighbor and connected to the necessary external control lines and devices. When the number of qubits in an array is increased, it becomes difficult to reach qubits in the interior of the array from the edge. The need to solve this problem has so far resulted in complex three-dimensional (3D) wiring systems across multiple planes in which many wires intersect, making their construction a significant engineering challenge. (Video: https://youtu.be/14a__swsYSU)

The team of scientists led by Prof Jaw-Shen Tsai has proposed a unique solution to this qubit accessibility problem by modifying the architecture of the qubit array. "Here, we solve this problem and present a modified superconducting micro-architecture that does not require any 3D external line technology and reverts to a completely planar design," they say. This study has been published in the New Journal of Physics.

The scientists began with a square lattice array of qubits and stretched out each column in the 2D plane. They then folded each successive column on top of the previous one, forming a dual one-dimensional array called a bi-linear array. This put all qubits on the edge and simplified the arrangement of the required wiring system, while keeping the system entirely in 2D. In this new architecture, some of the inter-qubit wiring (each qubit is also connected to all adjacent qubits in the array) does overlap, but because these are the only overlaps in the wiring, simple local 3D structures such as airbridges at the points of overlap are enough, and the system overall remains in 2D. As you can imagine, this simplifies its construction considerably. One plausible way to picture the column folding is sketched below.
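
This hypothetical index mapping (a reconstruction for illustration; the paper's exact layout may differ) sends column c of an n x n lattice to row c mod 2 of a two-row array, so every qubit ends up on an outer edge:

```python
def bilinear_position(row: int, col: int, n: int) -> tuple[int, int]:
    """Map lattice qubit (row, col) of an n x n array to (new_row, new_col)
    in a two-row "bi-linear" layout: columns are laid out end to end and
    alternate columns are folded into the second row."""
    new_row = col % 2               # successive columns alternate rows
    new_col = (col // 2) * n + row  # pairs of columns sit side by side
    return new_row, new_col

n = 4
for col in range(n):
    print(f"column {col} ->", [bilinear_position(r, col, n) for r in range(n)])
# Every qubit now sits in one of two rows, i.e. on an edge, so external
# control lines can reach it without crossing into a third dimension.
```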

The scientists evaluated the feasibility of this new arrangement through numerical and experimental evaluation in which they tested how much of a signal was retained before and after it passed through an airbridge. The results of both evaluations showed that it is possible to build and run this system using existing technology and without any 3D arrangement.

The scientists' experiments also showed that their architecture solves several problems that plague 3D structures: they are difficult to construct, there is crosstalk (signal interference between waves transmitted across two wires), and the fragile quantum states of the qubits can degrade. The novel pseudo-2D design reduces the number of times wires cross each other, thereby reducing the crosstalk and consequently increasing the efficiency of the system.

At a time when large labs worldwide are attempting to find ways to build large-scale fault-tolerant quantum computers, the findings of this exciting new study indicate that such computers can be built using existing 2D integrated circuit technology. "The quantum computer is an information device expected to far exceed the capabilities of modern computers," Prof Tsai states. The research journey in this direction has only begun with this study, and Prof Tsai concludes by saying, "We are planning to construct a small-scale circuit to further examine and explore the possibility."

Continued here:
Wiring the Quantum Computer of the Future: Researchers from Japan and Australia propose a novel 2D design - QS WOW News

Global Quantum Computing Market : Industry Analysis and Forecast (2020-2027) – MR Invasion

The Global Quantum Computing Market was valued at US$ 198.31 Mn in 2019 and is expected to reach US$ 890.5 Mn by 2027, at a CAGR of 28.44% during the forecast period.

The study has analyzed the revenue impact of the COVID-19 pandemic on the sales of market leaders, market followers, and disrupters, and the same is reflected in the analysis.

REQUEST FOR FREE SAMPLE REPORT:https://www.maximizemarketresearch.com/request-sample/27533/

Quantum computing market growth is being driven by factors like the increasing incidence of cybercrime, early adoption of quantum computing technology in the automotive and defense industries, and growing investment by government entities. On the other hand, the presence of substitute technologies and reluctance to adopt new technology are factors limiting the market's growth.

The quantum computing market in the energy & power industry is projected to witness a CAGR of 40% from 2017 to 2023. This growth is primarily attributed to opportunities in the nuclear and renewables sector. Applications like energy exploration, seismic survey optimization, and reservoir optimization are estimated to lead this industry in the quantum computing market.

North America held the largest share of the quantum computing market in 2016. It is a key market as the home ground of major corporations like D-Wave Systems Inc. and 1QB Information Technologies, Inc. Increased research and development (R&D) activity in quantum computing is concentrated in this region, and heavy investment by government entities and technologically advanced players such as International Business Machines Corporation, Microsoft Corporation, Google Inc., and Intel Corporation is driving the growth of the quantum computing market in North America. Industry-level R&D is extending the application areas of quantum computing into industries like energy & power, defense, and chemicals, especially in the US.

Owing to economic interest and the decline of Moore's law of computational scaling, eighteen of the world's biggest corporations and dozens of government organizations are working on quantum processor technologies and quantum software, or partnering with quantum industry startups like D-Wave. Their determination reflects a wider transition taking place at start-ups and academic research labs alike: a move from pure science towards engineering.

The quantum computing market report evaluates the technology, companies/associations, R&D efforts, and potential solutions assisted by quantum computing. It also estimates the impact of quantum computing on other major technologies and solution areas, including AI, chipsets, edge computing, blockchain, IoT, big data analytics, and smart cities. The report offers global and regional forecasts, as well as the outlook for quantum computing's impact on hardware, software, applications, and services.

DO INQUIRY BEFORE PURCHASING REPORT HERE:https://www.maximizemarketresearch.com/inquiry-before-buying/27533/

The objective of the report is to present a comprehensive assessment of the market, and it contains thoughtful insights, facts, historical data, industry-validated market data, and projections with a suitable set of assumptions and methodology. The report also helps in understanding Quantum Computing market dynamics and structure by identifying and analyzing the market segments, and projects the global market size. Further, the report focuses on competitive analysis of key players by product, price, financial position, product portfolio, growth strategies, and regional presence. The report also provides PEST, Porter's, and SWOT analyses to help shareholders prioritize their efforts and investment in emerging segments of the Quantum Computing market in the near future.

Scope of Global Quantum Computing Market:

Global Quantum Computing Market, by Technology:

Superconducting loops technology, Trapped ion technology, Topological qubits technology

Global Quantum Computing Market, by Application:

Simulation, Optimization, Sampling

Global Quantum Computing Market, by Component:

Hardware, Software, Services

Global Quantum Computing Market, by Industry:

Defense, Banking & Finance, Energy & Power, Chemicals, Healthcare & Pharmaceuticals

Global Quantum Computing Market, by Region:

North America, Asia Pacific, Europe, Latin America, Middle East & Africa

Key Players Operating in the Market Include:

D-Wave Systems Inc., 1QB Information Technologies Inc., QxBranch LLC, QC Ware Corp., Research at Google (Google Inc.), International Business Machines Corporation, Lockheed Martin Corporation, Intel Corporation, Anyon Systems Inc., Cambridge Quantum Computing Limited, Rigetti Computing, MagiQ Technologies Inc., Station Q (Microsoft Corporation), IonQ, and quantum computing software start-ups Qbit, Alibaba, Ariste-QB.net, Atos, Q-Ctrl, Qu & Co, Quantum Benchmark, SAP, Turing, and Zapata

MAJOR TOC OF THE REPORT

Chapter One: Quantum Computing Market Overview

Chapter Two: Manufacturers Profiles

Chapter Three: Global Quantum Computing Market Competition, by Players

Chapter Four: Global Quantum Computing Market Size by Regions

Chapter Five: North America Quantum Computing Revenue by Countries

Chapter Six: Europe Quantum Computing Revenue by Countries

Chapter Seven: Asia-Pacific Quantum Computing Revenue by Countries

Chapter Eight: South America Quantum Computing Revenue by Countries

Chapter Nine: Middle East and Africa Quantum Computing Revenue by Countries

Chapter Ten: Global Quantum Computing Market Segment by Type

Chapter Eleven: Global Quantum Computing Market Segment by Application

Chapter Twelve: Global Quantum Computing Market Size Forecast (2019-2026)

Browse Full Report with Facts and Figures of Quantum Computing Market Report at:https://www.maximizemarketresearch.com/market-report/global-quantum-computing-market/27533/

About Us:

Maximize Market Research provides B2B and B2C market research on 20,000 high growth emerging technologies & opportunities in Chemical, Healthcare, Pharmaceuticals, Electronics & Communications, Internet of Things, Food and Beverages, Aerospace and Defense and other manufacturing sectors.

Contact info:

Name: Lumawant Godage

Organization: MAXIMIZE MARKET RESEARCH PVT. LTD.

Email: sales@maximizemarketresearch.com

Contact: +919607065656/ +919607195908

Website: http://www.maximizemarketresearch.com

Here is the original post:
Global Quantum Computing Market : Industry Analysis and Forecast (2020-2027) - MR Invasion

New way of developing topological superconductivity discovered – Chemie.de

Hybrid material nanowires with pencil-like cross section (A) at low temperatures and finite magnetic field display zero-energy peaks (B) consistent with topological superconductivity as verified by numerical simulations (C).

A pencil-shaped semiconductor, measuring only a few hundred nanometers in diameter, is what researchers from the Center for Quantum Devices, Niels Bohr Institute, at the University of Copenhagen, in collaboration with Microsoft Quantum researchers, have used to uncover a new route to topological superconductivity and Majorana zero modes in a study recently published in Science.

The new route that the researchers discovered uses the phase winding around the circumference of a cylindrical superconductor surrounding a semiconductor, an approach they call "a conceptual breakthrough".

"The result may provide a useful route toward the use of Majorana zero modes as a basis of protected qubits for quantum information. We do not know if these wires themselves will be useful, or if just the ideas will be useful," says Charles Marcus, Villum Kann Rasmussen Professor at the Niels Bohr Institute and Scientific Director of Microsoft Quantum Lab in Copenhagen.

"What we have found appears to be a much easier way of creating Majorana zero modes, where you can switch them on and off, and that can make a huge difference"; says postdoctoral research fellow, Saulius Vaitieknas, who was the lead experimentalist on the study.

The new research merges two already-known ideas from the world of quantum mechanics: vortex-based topological superconductors and one-dimensional topological superconductivity in nanowires.

"The significance of this result is that it unifies different approaches to understanding and creating topological superconductivity and Majorana zero modes", says professor Karsten Flensberg, Director of the Center for Quantum Devices.

Looking back in time, the findings can be described as an extension of a 50-year-old piece of physics known as the Little-Parks effect. In the Little-Parks effect, a superconductor in the shape of a cylindrical shell adjusts to an external magnetic field threading the cylinder by jumping to a "vortex state", in which the quantum wavefunction around the cylinder carries a twist of its phase.
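In the standard textbook form of that effect (background we supply for context, not a formula quoted from the paper), single-valuedness of the superconducting order parameter around the shell forces its phase to wind by an integer multiple of 2π, and the energetically preferred winding number tracks the applied flux:

```latex
\oint \nabla\varphi \cdot d\boldsymbol{\ell} = 2\pi n, \quad n \in \mathbb{Z},
\qquad n \approx \frac{\Phi}{\Phi_0}, \qquad \Phi_0 = \frac{h}{2e}
```

Each unit jump in n as the field is swept is precisely the hop between "vortex states" described above.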

Charles M. Marcus, Saulius Vaitiekėnas, and Karsten Flensberg from the Niels Bohr Institute at the Microsoft Quantum Lab in Copenhagen.

What was needed was a special type of material that combined semiconductor nanowires and superconducting aluminum. Those materials were developed at the Center for Quantum Devices over the past few years. The particular wires for this study were special in having the superconducting shell fully surround the semiconductor. These were grown by Professor Peter Krogstrup, also at the Center for Quantum Devices and Scientific Director of the Microsoft Quantum Materials Lab in Lyngby.

The research is the result of the same basic scientific wondering that, throughout history, has led to many great discoveries.

"Our motivation to look at this in the first place was that it seemed interesting and we didn't know what would happen", says Charles Marcus about the experimental discovery, which was confirmed theoretically in the same publication. Nonetheless, the idea may indicate a path forward for quantum computing.


Originally posted here:
New way of developing topological superconductivity discovered - Chemie.de

Enterprise Quantum Computing Market is Projected to Grow Massively in Near Future with Profiling Eminent Players- Intel Corporation, QRA Corp, D-Wave…

New Study Industrial Forecasts on Enterprise Quantum Computing Market 2020-2026: The Enterprise Quantum Computing Market report provides an in-depth review of the expansion drivers, potential challenges, distinctive trends, and opportunities for market participants, equipping readers to fully comprehend the landscape of the Enterprise Quantum Computing market. Major key manufacturers covered in the report are presented alongside market share, stock determinations and figures, sales, capacity, production, price, cost, and revenue. The main objective of the Enterprise Quantum Computing industry report is to supply key insights on competition positioning, current trends, market potential, growth rates, and other relevant statistics.

The Major Players Covered in this Report: Intel Corporation, QRA Corp, D-Wave Systems, Cambridge Quantum Computing, QC Ware, QxBranch, Rigetti, IBM Corporation, Quantum Circuits, Google, Microsoft Corporation, Atos SE, Cisco Systems & More.

To get a holistic SAMPLE of the report, please click: https://www.reportsmonitor.com/request_sample/905067

The global Enterprise Quantum Computing market is brilliantly shed light upon in this report, which takes into account some of the most decisive and crucial aspects anticipated to influence growth in the near future. With important factors impacting market growth taken into consideration, the analysts authoring the report have painted a clear picture of how the demand for Enterprise Quantum Computing could increase during the course of the forecast period. Readers of the report can expect useful guidelines on how to make their company's presence known in the market, thereby increasing its share in the coming years.

Regional Glimpses: The report sheds light on the manufacturing processes, cost structures, and guidelines and regulations. The regions targeted are Europe, United States, Central & South America, Southeast Asia, Japan, China, and India, with their export/import and supply and demand trends, along with cost, revenue, and gross margin. The Enterprise Quantum Computing Market is analyzed on the basis of the pricing of the products, the dynamics of demand and supply, total volume produced, and the revenue produced by the products. The manufacturing is studied with respect to various contributors such as manufacturing plant distribution, industry production, capacity, and research and development.

To get this report at a discounted rate @ https://www.reportsmonitor.com/check_discount/905067

Major points of the Global Enterprise Quantum Computing Market:

1. The market summary for the global Enterprise Quantum Computing market is provided in context of region, share and market size.
2. Innovative strategies used by key players in the market.
3. Other focus points in the Global Enterprise Quantum Computing Market report are upcoming opportunities, growth drivers, limiting factors, restrainers, challenges, technical advancements, flourishing segments and other major market trends.
4. The comprehensive study is carried out by deriving market projections and forecasts for the important market segments and sub-segments throughout the forecast period 2020-2026.
5. The data has been categorized and summarized on the basis of regions, companies, types and applications of the product.
6. The report has studied developments such as expansions, agreements, latest product launches and mergers in this market.

Reasons to buy the report:

The report would help new entrants as well as established players in the Enterprise Quantum Computing market in the following ways:

1. This report segments the Enterprise Quantum Computing market holistically and provides the nearest approximation of the overall, as well as segment-based, market size across different industries, materials, media, and regions.
2. The report would support stakeholders in understanding the pulse of the market and present information on key drivers, constraints, challenges, and opportunities for the growth of the market.
3. This report would help stakeholders become fully aware of their competition and gain more insights to enhance their position in the business. The competitive landscape section covers the competitor ecosystem, along with product launches and developments; partnerships, agreements, and contracts; and acquisition strategies implemented by key players in the market.

View this report with a detailed description and TOC @https://www.reportsmonitor.com/report/905067/Enterprise-Quantum-Computing-Market

If you have any special requirements for this report, please let us know and we can provide a custom report.

Contact Us
Jay Matthews
Direct: +1 513 549 5911 (U.S.), +44 203 318 2846 (U.K.)
Email: sales@reportsmonitor.com

View original post here:
Enterprise Quantum Computing Market is Projected to Grow Massively in Near Future with Profiling Eminent Players- Intel Corporation, QRA Corp, D-Wave...

Will This Quantum Computing Breakthrough Save Bitcoin and Cryptocurrency? – The Daily Hodl

A new computing breakthrough may just save Bitcoin and cryptocurrency from powerful quantum machines that have the potential to breach public-key cryptography.

Researchers are following the development of a new measure known as lattice-based cryptography that promises to make crypto technology more quantum-proof, reports MIT Technology Review.

Lattice-based cryptography may neutralize the massive computational capabilities of quantum computers by hiding data inside complex geometric structures: lattices, grids of infinitely many points spread across thousands of dimensions. The security measure appears to be virtually impenetrable, even to powerful quantum computers, unless one holds the key.
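For a flavor of how lattice schemes hide data, here is a toy learning-with-errors (LWE) sketch, the problem family underlying several lattice proposals. Every parameter below (dimension, modulus, error range) is deliberately tiny and insecure; it only illustrates the idea that recovering a secret behind noisy linear equations is hard, and is not any real post-quantum scheme.

```python
import random

# Toy LWE-style encryption of one bit. Parameters are illustrative and far
# too small to be secure; real schemes use large dimensions and carefully
# chosen error distributions.
q, n, m = 97, 8, 16   # modulus, secret dimension, number of public samples

s = [random.randrange(q) for _ in range(n)]                      # secret key
A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]  # random matrix
e = [random.randrange(-1, 2) for _ in range(m)]                  # small noise
b = [(sum(a * x for a, x in zip(row, s)) + err) % q              # public key (A, b)
     for row, err in zip(A, e)]

def encrypt(bit):
    # Sum a random subset of samples; shift the scalar part by q/2 for a 1.
    subset = [i for i in range(m) if random.random() < 0.5]
    u = [sum(A[i][j] for i in subset) % q for j in range(n)]
    v = (sum(b[i] for i in subset) + bit * (q // 2)) % q
    return u, v

def decrypt(u, v):
    # v - <u, s> equals (noise) or (q/2 + noise); round to the nearer case.
    d = (v - sum(a * x for a, x in zip(u, s))) % q
    return 1 if q // 4 < d < 3 * q // 4 else 0

for bit in (0, 1, 1, 0):
    assert decrypt(*encrypt(bit)) == bit
print("toy LWE round-trips correctly")
```

Without the secret s, an attacker faces noisy modular equations; scaled up to thousands of dimensions, this is the lattice problem believed to resist quantum attack.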

The emergence of quantum computing machines has grabbed headlines over the past few months, as the technology poses a threat to the cryptographic algorithms that keep cryptocurrencies like Bitcoin, as well as the internet at large, secure. The World Economic Forum explains how quantum computers can break current standards of encryption.

"The sheer calculating ability of a sufficiently powerful and error-corrected quantum computer means that public-key cryptography is destined to fail, and would put the technology used to protect many of today's fundamental digital systems and activities at risk."

MIT Technology Review says that while the current iterations are not yet ready for implementation, the solution is promising, especially as a post-quantum future is fast approaching. Ripple CTO David Schwartz says he believes developers have at least eight years until the technology, which leverages the properties of quantum physics to perform fast calculations, becomes sophisticated enough to crack cryptocurrency.

"I think we have at least eight years. I have very high confidence that it's at least a decade before quantum computing presents a threat, but you never know when there could be a breakthrough. I'm a cautious and concerned observer, I would say."

Featured Image: Shutterstock/archy13

Link:
Will This Quantum Computing Breakthrough Save Bitcoin and Cryptocurrency? - The Daily Hodl

How The Post-Covid Future Is Unfolding For Technology Ventures – Forbes

Technology is driving the new entrepreneurial economy

As the world eventually pulls its way out of the Covid-19 crisis, we're going to see a world changed. Some businesses are suffering, while others are soaring. What's it going to take to launch and sustain successful ventures (coming out of garages as well as corporate divisions) over the coming months and years? Many entrepreneurs, venture capitalists and business leaders are also pondering the new directions things will be taking. In the first post in this series, we explored the views of industry innovators on the implications of the Covid-driven rush to digital.

I recently heard from Ubaid Dhiyan, director at Union Square Advisors, who points out how today's innovators are adroitly embracing cloud, artificial intelligence and digital services in new ways. "Cloud-driven disruption of large, established industries originally started in media, retail and content consumption, but is now expanding into experiences as well," he says. Prominent examples include Peloton, Mirror, Tonal, and a slew of other fitness-related apps paired with hardware that emphasize user experience.

Business models are evolving in the wake of Covid, and clearly in the direction of digital. "Businesses that have thrived through the pandemic may not solely be operating in a digital business model, but what all successful businesses have in common is a strong digital culture," says Laura Baldwin, president of O'Reilly Media. "Moving forward, we need to face that the future will be more digitally focused than ever before, and businesses need to start thinking about how to create and implement a digital business model."

Shige Ihara, CEO of NEC X, shared NEC X's perspective as the innovation accelerator for NEC's emerging technologies. "We are seeing several social needs and drivers that have arisen during the pandemic which are stimulating new technology development and areas for potential economic growth," Ihara says. Prominent among these areas are virtual reality and augmented reality. "The market has already seen adoption for military and gaming applications, as well as training for advanced surgery," he explains. "As remote work becomes the new normal, we believe VR/AR is a growth area that will provide new and better interfaces for groupware as well as web conferencing. Startups are already developing the VR/AR platforms and software that will enable these improvements."

Dhiyan also sees potential in VR/AR, as well as in a host of other cutting-edge technologies, including artificial intelligence, quantum computing and robotics. These are forming the basic technology infrastructure that is supporting business approaches being incubated and launched at a time of financial and economic distress, a politically charged climate, meaningful social discord and a gridlocked legislative environment.

"Also, keep an eye on digital currencies," Ihara adds. "New digital currencies such as cryptocurrencies have already appeared on the market, but the potential remains for the emergence of an even larger-scale, inter-nation cybercurrency system. We believe there is a strong need and great opportunity for new technologies and platforms to enable this shift, but the work in this area is far too nascent for predictions on how it will take shape."

At the same time, Baldwin advises innovators against getting too entangled with a particular technology. "With the rapid pace of technological change, what may be impactful today will be replaced by something more impactful in a year," she says. "The most important thing is to continue to follow the trends on new technologies as they come to market, build teams that are nimble and flexible enough to adapt to rapid change, and provide them with the tools to build skills and learn new technologies as they come to market. It's that ability to adapt and learn new technologies so that they can be applied that can have impact."

"The big winners in the times to come will likely be startups and smaller companies who were born digitally enabled," says Baldwin. "The losers will try to return to the old ways and find themselves swiftly left behind in the new normal."

Read the rest here:
How The Post-Covid Future Is Unfolding For Technology Ventures - Forbes

A Brighter Tomorrow > News > USC Dornsife – USC Dornsife College of Letters, Arts and Sciences

From environment to family, transportation to health care, from work and leisure to what we'll eat and how we'll age, USC Dornsife faculty share how they think our future world will look. [11 min read]

As the 19th century drew to a close and a new era dawned, an American civil engineer named John Elfreth Watkins consulted experts at the nation's greatest institutions of science and learning for their opinions on 29 wide-ranging topics. Watkins, who was also a contributor to the Saturday Evening Post, then wrote an extraordinary magazine article based on what these university professors told him.

Published on Page 8 of the December 1900 issue of Ladies' Home Journal, a sister publication of the Post, it was titled "What May Happen in the Next Hundred Years." Watkins opened the article with the words, "These prophecies will seem strange, almost impossible." In fact, many of his far-sighted predictions for the year 2000, which included the invention of digital color photography, television and mobile phones, proved remarkably accurate.

For this issue of USC Dornsife Magazine, we have repeated the experiment by inviting 10 scholars, drawn from USC Dornsife faculty and representing diverse disciplines, to predict what the world will look like in the year 2050 and the year 2100.

A Bluer Planet

Astronauts circling the globe in 80 years may find our blue planet looking quite a bit bluer, says Naomi Levine, assistant professor of biological sciences and Earth sciences.

"The middle of the Pacific or Atlantic oceans are what we call the deserts of the ocean. They're really low in nutrients, and things that live there are usually small. As a result, these areas look very blue because there isn't much there except water," Levine explains. "As the climate warms, we predict that these desert areas are going to expand. So, ocean waters will look bluer from space."

A Brighter Shade of Green

Our planet may also look a bit greener. Travis Williams, professor of chemistry, says that without an active plan for removing the carbon clogging our atmosphere, nature could step in.

"If we don't choose a biomass that's going to utilize higher temperatures and that atmospheric carbon, nature is going to choose on our behalf, and I don't think we're going to like it," he says. To avoid harmful organism explosions like algae blooms, Williams foresees a human-led reforestation of the planet, at a scale several times the size of the Amazon rainforest.

What's On the Menu?

A greening planet could also be due to changes in our agricultural systems. A move away from monoculture farming and a return to an ancient polyculture approach might be on the horizon, says Sarah Portnoy, associate professor (teaching) of Spanish. Portnoy researches indigenous food cultures of Mesoamerica and suggests that in the future we could adopt the milpa food system. "Animals would be grazing on the same land where there are cover crops and squash, corn, beans and all kinds of herbs growing together," she says.

This isn't just a utopian pipe dream. Governments will have to seriously rethink agriculture if they want to reduce rising rates of chronic disease such as obesity, especially among the poor. "The agriculture that is supported by the government now is skewed toward crops like soybeans and wheat. Our food system is geared to the cheapest calories," Portnoy says.

The high-calorie, processed foods produced from these monoculture, subsidized crops are less expensive than fruits and vegetables, but do little for our health. Unless we reprioritize which crops get government cash, we can expect disparities in health between economic classes to continue. By 2050, only the privileged might be able to afford strawberries or carrots.

Food supplies will alter in other ways as well, thanks to climate change. The bluer oceans will be less friendly to bigger marine organisms, which means fewer large fish to harvest.

"When you change ocean temperatures, it changes what types of organisms can grow, and that cascades up the food web," says Levine. Sushi chefs in 2050 might dish up more avocados and scallops than tuna rolls. This could work for future diners, Portnoy thinks. "There's a move toward being a lot more intrepid as an eater, and toward plant-based diets," she says.

One Big, Happy Family

Starting off your day in 2050 could mean wheeling your toddler to the state-funded neighborhood day care center. Birth rates are currently plummeting across the industrialized world and governments may soon need to tackle the problem as a public health priority, says Darby Saxbe, associate professor of psychology and director of the USC Center for the Changing Family.

"We'll realize that, when the birth rate goes down, that affects our future workforce," she says. "When we're not able to replace our population, it ultimately becomes a national security issue." Child care benefits, family leave and subsidized, part-time work schedules for parents could be the government's strategy to encourage a new baby boom.

We may be well into the digital age, but you might not find too many iPads in the nurseries of the future. Increased awareness of the pitfalls of screen time could change our approach to parenting via device. The original scions of social media themselves now admit to limiting their own children's time online, observes Saxbe. "In fact, in some of the more expensive private schools in Los Angeles, you have to sign a no-screen-time pledge."

The keywords there might be expensive and private. A movement away from childhood spent online could leave behind children from poorer families as technology becomes cheaper and the cost of human labor rises. It will likely soon be less expensive to instruct classrooms of kids via lessons on tablets than by engaging a human teacher.

"You might end up with a two-class system," Saxbe warns. "You have more kids having a digital childhood that's a little less regulated, especially in neighborhoods where it's not safe to play outside. Wealthier families are going to be able to afford more hands-on child care and more hands-on educational activities, instead of leaving kids alone with their technology."

However, technology can still benefit the family in the coming decades. In fact, Saxbe believes this is a largely untapped opportunity with great potential. Silicon Valley technologists, primarily childless young men, still haven't tackled devices like the breast pump or baby monitor, which could both use a redesign.

"Has there been a real focus on innovation and investment when it comes to things that serve parents and families yet?" asks Saxbe. "I think there's a big market there."

Working 10-4

After dropping your child off at day care, you head to work. You likely wont be putting the keys in the ignition of your own car, though. Kyla Thomas, sociologist at the USC Dornsife Center for Economic and Social Research and director of LABarometer, a quarterly internet-based survey of approximately 1,800 L.A. county residents, says that by 2030 commuters will probably rely more on public transit and shared, autonomous vehicles to get around.

Public transportation will be faster and more convenient, and increased density in neighborhoods will mitigate sprawl. Parking will be more expensive and harder to find. By 2100, Thomas says, private car ownership will be a thing of the past.

Hopping out of your driverless commuter van, you clock in at the office for your six-hour work day. Patricia Grabarek, lecturer with USC Dornsife's Online Master of Science in Applied Psychology program, believes that the traditional 40-hour work week could get phased out by 2050.

"We are in the midst of a job revolution that's on the scale of the Industrial Revolution," Grabarek says. "The entire nature of work will change."

Automation promises to replace many jobs, and streamline others. Combine this with the growing emphasis on work-life balance, embodied by current millennials pushing for workplace flexibility, and we could see our work week lighten in load.

"Our leaders are recognizing the problem that employees are burning out. People are working too much and they are not as productive as they could be. Bosses will start modeling better behaviors for their employees," Grabarek says. After-hours emails could soon be banned, as is already the case in France and Germany.

This doesn't mean we'll all be aimlessly underemployed, however. "There is a fear that automation will eliminate jobs but, in the past, we've always replaced the jobs that we've lost. Innovators will come out and replace them with new jobs we can't even come up with now," she says.

No matter how advanced computers become, human curiosity remains superior. Automation will be good at analyzing data, Grabarek says, but the questions will still originate with human researchers.

It's Quitting Time

Finished with work for the week, you're off to start the weekend. One item not likely to be on the agenda? Attending a traditional religious service.

"In the United States, there's a trend away from institutionalized religion and toward highly individualized spirituality," says Richard Flory, associate professor (research) of sociology and senior director of research and evaluation at the USC Dornsife Center for Religion and Civic Culture. "People just aren't interested in institutions anymore, and nothing seems to be stepping forward to replace that interface between the individual and society."

Churches and temples could find new life as condos, bars or community centers, with religion relegated to a decorative background.

Rather than kneeling in prayer, people might find themselves downing a psychedelic drug to reach personal spiritual enlightenment. Movements that center around hallucinogens such as ayahuasca, a psychoactive tea from the Amazon, have gained traction in recent years, Flory notes.

Of course, there might just be an app for it all. Consciousness hacking aims to use science to bypass years of devotion to a spiritual practice and give everyone the hard-won benefits of such a practice instantly. "In the future, I could see having some sort of implanted device to get to this level of consciousness," Flory says.

Reading the Tea Leaves

You may also use your leisure time to crack open a good book, one with a slightly different texture. As climate change threatens our traditional resources, more sustainable alternatives such as seaweed could step in as a paper substitute, predicts Mark Marino, professor (teaching) of writing and a scholar of digital literature.

By 2100, literature could be written across the heavens instead.

"Roboticist poets will create autonomous micro-texts that will be able to swarm into collectives, self-organize, aggregate and adapt," says Marino. "Bevies of these nano-rhy-bots will create superstructures that can write epics on the Great Wall of China, on the surface of Mars or in the bloodstream of their readers."

Better Living Through Quantum Computing

Aging in the New Age may mean more nontraditional family units. "Older adults prefer to age and die at home, but what happens when you don't have a big family network to support that? It may mean people might be more invested in friend networks, or the idea of chosen family," says Saxbe. Cue The Golden Girls theme song.

Sean Curran, associate professor of gerontology and biological sciences, believes that a focus on increasing our health span, the period of life during which one is free from serious disease, rather than simply elongating our life spans, will improve the quality of our longer lives as we age.

"The goal is to have a personalized approach to aging that takes into account an individual's genetics, environment and life history," explains Curran. The assisted living facility of the future will be patient-centered, with each resident having a personalized prescription to maintain optimal health.

Eli Levenson-Falk, assistant professor of physics and astronomy, predicts that quantum computing could unlock the development of those drugs.

Quantum computers solve problems much more swiftly and with higher information density than today's computers. Although the technology is still in its infancy, Levenson-Falk predicts that by 2050, practical quantum technologies will be used commercially by major drug companies for research and development.

Enormously complicated computational tasks, like simulating a chemical's molecular structure, are much more achievable through this technology.

"The idea is that with a quantum computer you can sort of emulate nature," he explains. "We might have the canonical example for this by 2050: the physical shape of a protein molecule."

Predicting this shape is nearly impossible with a classical computer, Levenson-Falk says.

"Measuring it is difficult and requires you to predict the shape first. With a good quantum simulator, we can emulate the protein and just let quantum mechanics do the processing for us, then measure the result at the end."

The Quantum Age

Indeed, quantum computing might solve questions that relate to the very fabric of the universe. Or at least get us closer to the answers.

"Dark energy, dark matter, quantum gravity and the quantum-classical transition are the principal problems existing in physics today. Quantum technologies are the best bet to solve the last one," says Levenson-Falk. "Quantum sensors will probably also be used to help detect dark matter, or at least falsify some theories. And there are some proposals for using quantum technologies to poke at quantum gravity."

We cannot, of course, predict our shared future with 100 percent accuracy, but one thing we can be sure of is that it will be filled with new challenges and opportunities to create a better tomorrow. Although advances in technology will certainly help determine our future, how equitably those advances are shared in our interconnected world will also play a dominant role in shaping it.

"This is a tale of two societies: You could either see things get better and more supportive for families, or you might see two-class stratification," Saxbe warns.

As the future unspools, we are given both the invaluable gift and the tremendous responsibility of deciding how we want it to look. Whether our world in 2100 takes on the dystopian qualities of Blade Runner or embodies the utopian, egalitarian ideals of Star Trek remains in the terrestrial hands of those already building that future.

Read more:
A Brighter Tomorrow > News > USC Dornsife - USC Dornsife College of Letters, Arts and Sciences

Deep tech may stumble on insufficient computing power – Livemint

It appears that many of the "deep tech" algorithms the world is excited about will run into physical barriers before they reach their true promise. Take Bitcoin. A cryptocurrency based on blockchain technology, it has a sophisticated algorithm that grows in complexity as very few new Bitcoin are minted, through a digital process called "mining". For a simple description of Bitcoin and blockchain, you could refer to an earlier Mint column of mine.

Bitcoin's assurance of validity is achieved by its "proving" algorithm, which is designed to continually increase in mathematical complexity (and hence the computing power needed to process it) every time a Bitcoin is mined. Individual miners are continually doing work to assess the validity of each Bitcoin transaction and confirm whether it adheres to the cryptocurrency's rules. They earn small amounts of new Bitcoin for their efforts. The complexity of getting several miners to agree on the same history of transactions (and thereby validate them) is managed by the same miners, who try to outpace one another to create a valid "block".
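The race to create a valid block is, at its core, a hash-guessing contest. Here is a minimal proof-of-work sketch; it is our simplification, since real Bitcoin double-hashes an 80-byte block header with SHA-256 against an adaptive 256-bit target rather than counting leading hex zeros:

```python
import hashlib

# Toy proof-of-work: guess nonces until the hash clears a difficulty target.
def mine(block_data: str, difficulty: int) -> int:
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

# Each extra zero digit multiplies the expected number of guesses by 16;
# that built-in escalation is where the energy bill comes from.
print("nonce:", mine("toy block: Alice pays Bob 1 BTC", difficulty=4))
```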

The machines that perform this work consume huge amounts of energy. According to Digiconomist.net, each transaction uses almost 544 kWh of electrical energy, enough to provide for the average US household for almost three weeks. The total energy consumption of the Bitcoin network alone is about 64 TWh, enough to provide for all the energy needs of Switzerland. The website also tracks the carbon footprint and electronic waste left behind by Bitcoin, which are both startlingly high. This exploitation of resources is unsustainable in the long run, and directly impacts global warming. At a more mundane level, the costs of mining Bitcoin can outstrip the rewards.
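A quick sanity check of that household comparison; the roughly 10,700 kWh/year figure for an average US household is our assumption, not a number from the article:

```python
# 544 kWh per transaction vs. an assumed ~10,700 kWh/year US household.
per_tx_kwh = 544
household_kwh_per_day = 10_700 / 365                     # about 29.3 kWh/day
print(f"{per_tx_kwh / household_kwh_per_day:.0f} days")  # ~19 days, ~3 weeks
```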

But cryptocurrencies are not the world's only hogs of computing power. Many Artificial Intelligence (AI) deep learning "neural" algorithms also place crushing demands on the planet's digital processing capacity.

A neural network" attempts to mimic the functioning of the human brain and nervous system in AI learning models. There are many of these. The two most widely used are recursive neural networks, which develop a memory pattern, and convolutional neural networks, which develop spatial reasoning. The first is used for tasks such as language translation, and the second for image processing. These use enormous computing power, as do other AI neural network models that help with deep learning".

Frenetic research has been going into new chip architectures to handle the ever-increasing complexity of AI models more efficiently. Today's computers are "binary", meaning they depend on the two simple states of a transistor bit, which can be either on or off, and thus either a 0 or 1 in binary notation. Newer chips try to achieve efficiency through other architectures, which will ostensibly help binary computers execute algorithms more efficiently. These chips are designed as graphics-processing units, since they are more capable of dealing with AI's demands than the central processing units that are the mainstay of most devices.

In a parallel attempt to get beyond binary computing, firms such as D-Wave, Google and IBM are working on a different class of machines called quantum computers, which make use of the so-called "qubit", with each qubit able to hold 0 and 1 values simultaneously. This enhances computing power. The problem with these, though, is that they are far from seeing widespread adoption. First, they are not yet sophisticated enough to manage today's AI models efficiently, and second, they need to be maintained at temperatures close to absolute zero (−273° Celsius). This refrigeration, in turn, uses up enormous amounts of electrical energy.
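"Holding 0 and 1 simultaneously" has a precise form: a qubit's state is a two-component complex vector, and gates are unitary matrices acting on it. A minimal single-qubit example in the standard formalism (generic background, not tied to any vendor's hardware):

```python
import numpy as np

# A qubit state is a two-component complex vector; |amplitude|^2 gives the
# probability of measuring 0 or 1. Gates are unitary matrices.
zero = np.array([1, 0], dtype=complex)        # the |0> state
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

state = H @ zero                              # equal superposition of 0 and 1
print("P(0), P(1) =", np.abs(state) ** 2)     # [0.5, 0.5]

# Simulating n qubits classically needs a 2**n-amplitude vector -- the
# exponential blow-up that quantum hardware represents natively.
```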

Clearly, advances in both binary chip design and quantum computing are not keeping pace with the increasing sophistication of deep tech algorithms.

In a research paper, Neil Thompson of the Massachusetts Institute of Technology and others analyse five widely-used AI application areas and show that advances in each of these fields of use come at a huge cost, since they are reliant on massive increases in computing capability. The authors argue that extrapolating this reliance forward reveals that current progress is rapidly becoming economically, technically and environmentally unsustainable.

Sustained progress in these applications will require changes to their deep learning algorithms and/or moving away from deep learning to other machine learning models that allow greater efficiency in their use of computing capability. The authors further argue that we are currently in an era where improvements in hardware performance are slowing, which means that this shift away from deep neural networks is now all the more urgent.

Thompson et al argue that the economic, environmental and purely technical costs of providing all this additional computing power will soon constrain deep learning and a range of applications, making the achievement of key milestones impossible, if current trajectories hold.

We are designing increasingly sophisticated algorithms, but we don't yet have computers that are sophisticated enough to match their demands efficiently. Without significant changes in how AI models are built, the usefulness of AI and other forms of deep tech is likely to hit a wall soon.

Siddharth Pai is founder of Siana Capital, a venture fund management company focused on deep science and tech in India


Originally posted here:
Deep tech may stumble on insufficient computing power - Livemint