Quix checks off another condition to build universal quantum computer – Bits&Chips

Researchers using Quix Quantum's technology have successfully demonstrated the on-chip generation of so-called Greenberger-Horne-Zeilinger (GHZ) states, a critical component for the advancement of photonic quantum computing. The Dutch startup, which focuses on photonics-based quantum computing, hails the result as a breakthrough that validates the company's roadmap toward building a scalable universal quantum computer.

The creation of GHZ states is necessary for photonic quantum computers. In a matter-based quantum computer, qubits are stationary, typically positioned on a specialized chip. By contrast, a photonic quantum computer uses flying qubits of light to process and transmit information. This information is constantly passed from one state to another through a process called quantum teleportation. GHZ states, entanglements spanning three photonic qubits, are the crucial resource that enables the computer to maintain this information.
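For readers unfamiliar with GHZ states, here is a minimal statevector sketch, simulated with plain NumPy rather than photonics (a generic textbook circuit, not Quix's hardware method): one Hadamard and two CNOTs turn |000⟩ into the GHZ state (|000⟩ + |111⟩)/√2.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I2 = np.eye(2)

def kron(*ops):
    # Tensor product of single-qubit operators, qubit 0 leftmost.
    out = np.array([[1.0]])
    for op in ops:
        out = np.kron(out, op)
    return out

def cnot(c, t, n=3):
    # Build an n-qubit CNOT (control c, target t) as a permutation matrix.
    dim = 2 ** n
    U = np.zeros((dim, dim))
    for basis in range(dim):
        bits = [(basis >> (n - 1 - k)) & 1 for k in range(n)]
        if bits[c] == 1:
            bits[t] ^= 1
        U[sum(b << (n - 1 - k) for k, b in enumerate(bits)), basis] = 1
    return U

state = np.zeros(8)
state[0] = 1.0                        # start in |000>
state = kron(H, I2, I2) @ state       # superpose qubit 0
state = cnot(0, 1) @ state            # spread entanglement to qubit 1
state = cnot(1, 2) @ state            # ...and to qubit 2
print(np.round(state, 3))             # amplitude 1/sqrt(2) on |000> and |111>
```

Measuring any one qubit of this state instantly fixes the other two, which is what makes GHZ states a usable resource for teleportation-based photonic computing.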

"This milestone demonstrates the capability of photonic quantum computers to generate multi-photon entanglement in a way that advances the roadmap toward large-scale quantum computation. The generation of GHZ states is evidence of the transformative potential of Quix Quantum's photonic quantum computing technology," commented Quix CEO Stefan Hengesbach.

Quix's next challenge is now making many of these devices. "When comparing one GHZ state to a million GHZ states, think of it as the spark needed to create a blazing fire. The more GHZ states a photonic quantum computer contains, the more powerful it becomes," added Chief Scientist Jelmer Renema.

More here:
Quix checks off another condition to build universal quantum computer - Bits&Chips

NIST quantum-resistant algorithms to be published within weeks, top White House advisor says – The Record from Recorded Future News

Update, May 24: Includes correction from NIST about the number of algorithms to be released.

The U.S. National Institute of Standards and Technology (NIST) will release post-quantum cryptographic algorithms in the next few weeks, a senior White House official said on Monday.

Anne Neuberger, the White House's top cyber advisor, told an audience at the Royal United Services Institute (RUSI) in London that the release of the algorithms was a "momentous moment," as they marked a major step in the transition to the next generation of cryptography.

The transition is being made in apprehension of what is called a cryptographically relevant quantum computer (CRQC), a device theoretically capable of breaking the encryption that's at the root of protecting both corporate and national security secrets, said Neuberger. NIST made a preliminary announcement of the algorithms in 2022.

Following publication, a spokesperson for NIST told Recorded Future News it was planning to release three finalized algorithms this summer and not four, as Neuberger had said in London.

Conrad Prince, a former official at GCHQ and now a distinguished fellow at RUSI, told Neuberger that during his previous career there had consistently been concern about hostile states gaining the capability to decrypt secure messages, although that capability had been estimated as roughly a decade away for the last 20 years.

Neuberger said the U.S. intelligence community's estimate is similar: the early 2030s for when a CRQC would be operational. But the time-frame is relevant, said the White House advisor, because there is national security data that is collected today and, even if decrypted eight years from now, can still be damaging.

Britain's NCSC has warned that contemporary threat actors could be collecting and storing intelligence data today for decryption at some point in the future.

"Given the cost of storing vast amounts of old data for decades, such an attack is only likely to be worthwhile for very high-value information," stated the NCSC. "As such, the possibility of a CRQC existing at some point in the next decade is a very relevant threat right now."

Neuberger added: "Certainly there's some data that's time sensitive, you know, a ship that looks to be transporting weapons to a sanctioned country, probably in eight years we don't care about that anymore."

Publishing the new NIST algorithms is a protection against adversaries collecting the most sensitive kinds of data today, Neuberger added.

A spokesperson for NIST told Recorded Future News: "The plan is to release the algorithms this summer. We don't have anything more specific to offer at this time."

But publishing the algorithms is not the last step in moving to a quantum-resistant computing world. The NCSC has warned it is actually just the second step in what will be a very complicated undertaking.

Even if any one of the algorithms proposed by NIST achieves universal acceptance as something that is unbreakable by a quantum computer, it would not be a simple matter of just swapping those algorithms in for the old-fashioned ones.

Part of the challenge is that most systems that currently depend on public-key cryptography for their security are not necessarily capable of running the resource-heavy software used in post-quantum cryptography.

Ultimately, the security of public-key cryptographic systems relies on the mathematical difficulty of factoring very large numbers into their prime components, something that traditional computers find exhaustingly difficult.

However, research by American mathematician Peter Shor, published in 1994, proposed an algorithm that could be run on a quantum computer to find these prime factors with far more ease, potentially undermining some of the key assumptions about what makes public-key cryptography secure.
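The "far more ease" comes from a reduction: Shor's algorithm needs the quantum computer only to find the period of a^x mod N, after which ordinary classical arithmetic extracts the factors. Here is a sketch of that classical post-processing; the period is brute-forced below, which is precisely the exponentially hard step a quantum computer replaces.

```python
from math import gcd

def classical_period(a, N):
    # Brute-force the order of a mod N; this is the step a quantum
    # computer speeds up exponentially via phase estimation.
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_factor(N, a):
    # Classical post-processing from Shor's 1994 reduction: an even
    # period r with a^(r/2) != -1 (mod N) yields nontrivial factors.
    r = classical_period(a, N)
    if r % 2:
        return None
    half = pow(a, r // 2, N)
    if half == N - 1:
        return None
    return gcd(half - 1, N), gcd(half + 1, N)

print(shor_factor(15, 7))   # (3, 5): the period of 7 mod 15 is 4
```

Not every choice of a works (the function returns None for odd periods), which is why the full algorithm retries with random bases.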

The good news, according to the NCSC, is that while advances in quantum computing continue to be made, "the machines that exist today are still limited, and suffer from relatively high error rates in each operation they perform," the agency stated.

But the NCSC warned that in the future "it is possible that error rates can be lowered such that a large, general-purpose quantum computer could exist, but it is impossible to predict when this may happen."


Alexander Martin is the UK Editor for Recorded Future News. He was previously a technology reporter for Sky News and is also a fellow at the European Cyber Conflict Research Initiative.

See original here:
NIST quantum-resistant algorithms to be published within weeks, top White House advisor says - The Record from Recorded Future News

Alice & Bob’s Cat Qubit Research Published in Nature – HPCwire

PARIS and BOSTON, May 23, 2024 Alice & Bob, a global leader in the race for fault-tolerant quantum computing, today announced the publication of its foundational research in Nature, showcasing significant advancements in cat qubit technology.

The study, "Quantum control of a cat-qubit with bit-flip times exceeding ten seconds," realized in collaboration with the QUANTIC Team (Mines Paris-PSL, École Normale Supérieure and INRIA), demonstrates an unprecedented improvement in the stability of superconducting qubits, marking a critical milestone toward useful fault-tolerant quantum computing.

The researchers have significantly extended bit-flip times from milliseconds to tens of seconds, thousands of times better than any other superconducting qubit type.

Quantum computers face two types of errors: bit-flips and phase-flips. Cat qubits exponentially reduce bit-flips, which are analogous to classical bit flips in digital computing. As a result, the remaining phase-flips can be addressed more efficiently with simpler error correcting codes.
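To illustrate the claimed efficiency gain, here is a hedged Monte Carlo sketch (a generic error-correction illustration, not Alice & Bob's actual code): once bit-flips are negligible, the remaining phase-flips can be handled by a plain three-qubit repetition code, whose majority vote turns a per-qubit error probability p into a logical error rate near 3p².

```python
import random

def logical_error_rate(p, trials=200_000, seed=0):
    # With bit-flips exponentially suppressed (the cat-qubit premise),
    # the dominant error is a phase-flip, which a 3-qubit repetition
    # code corrects by majority vote. Decoding fails only when two or
    # more of the three qubits flip in the same round.
    rng = random.Random(seed)
    failures = 0
    for _ in range(trials):
        flips = sum(rng.random() < p for _ in range(3))
        if flips >= 2:          # majority vote decodes incorrectly
            failures += 1
    return failures / trials

p = 0.05
print(p, logical_error_rate(p))  # logical rate falls near 3*p**2
```

For p = 0.05 the logical rate lands around 0.007, an order of magnitude below the raw error rate, which is why suppressing one error type in hardware lets the remaining type be cleaned up with a much simpler code.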

The researchers used Alice & Bob's Boson 3 chipset for this record-breaking result, which features a cat qubit design named TomCat. TomCat employs an efficient quantum tomography (measurement) protocol that allows for the control of quantum states without the use of a transmon, a common circuit used by many quantum companies but one of the major sources of bit-flips for cat qubits. This design also minimizes the footprint of the qubit on the chip, removing drivelines, cables, and instruments, making this stable qubit scalable. Recently, Alice & Bob made publicly available their new Boson 4 chipset, which reaches over 7 minutes of bit-flip lifetime. The results from this Nature publication can therefore be reproduced by users on Boson 4 over Google Cloud.

Although Alice & Bob's latest Boson chips are getting closer to the company's bit-flip protection targets, Alice & Bob plans to advance its technology further. The next iterations will focus on boosting cat qubit phase-flip time and readout fidelity to reach the requirements of the company's latest architecture for delivering a 100-logical-qubit quantum computer.

Key advances highlighted in the research include:

About Alice & Bob

Alice & Bob is a quantum computing company based in Paris and Boston whose goal is to create the first universal, fault-tolerant quantum computer. Founded in 2020, Alice & Bob has already raised 30 million in funding, hired over 95 employees and demonstrated experimental results surpassing those of technology giants such as Google or IBM. Alice & Bob specializes in cat qubits, a pioneering technology developed by the company's founders and later adopted by Amazon. Demonstrating the power of its cat architecture, Alice & Bob recently showed that it could reduce the hardware requirements for building a useful large-scale quantum computer by up to 200 times compared with competing approaches. Alice & Bob's cat qubit is available for anyone to test through cloud access.

Source: Alice & Bob

Read this article:
Alice & Bob's Cat Qubit Research Published in Nature - HPCwire

How Nvidia co-founder plans to turn Hudson Valley into a tech powerhouse greater than Silicon Valley – New York Post

A co-founder of chip maker Nvidia is bankrolling a futuristic quantum computer system at Rensselaer Polytechnic Institute and wants to turn New York's Hudson Valley into a tech powerhouse.

Curtis Priem, 64, donated more than $75 million so that the Albany-area college could obtain the IBM-made computer, the first such device on a university campus anywhere in the world, the Wall Street Journal reported.

The former tech executive and RPI alum said his goal is to establish the area around the school, based in Troy, into a hub of talent and business as quantum computing becomes more mainstream in the years ahead.

"We've renamed Hudson Valley as Quantum Valley," Priem told the Journal. "It's up to New York whether they want to become Silicon State, not just a valley."

The burgeoning technology uses subatomic quantum bits, or qubits, to process data much faster than conventional binary computers. The devices are expected to play a key role in the development of advanced AI systems.

Priem will reportedly fund the whopping $15 million per year required to rent the computer, which is kept in a building that used to be a chapel on RPIs campus.

RPI President Martin Schmidt told the newspaper that the school will begin integrating the device into its curriculum and ensure it is accessible to the student body.

Representatives for IBM and RPI did not immediately return The Post's request for comment.

An electrical engineer by trade, Priem co-founded Nvidia alongside its current CEO Jensen Huang and Chris Malachowsky in 1993. He served as the company's chief technology officer until retiring in 2003.

Priem sold most of his stock in retirement and used the money to start a charitable foundation.

He serves as vice chair of the board at RPI and has reportedly donated hundreds of millions of dollars to the university.

Nvidia has surged in value as various tech firms rely on its computer chips to fuel the race to develop artificial intelligence.

The company's stock has surged 95% to nearly $942 per share since January alone. Nvidia's market cap exceeds $2.3 trillion, making it the world's third-most valuable company, behind Microsoft and Apple.

In November 2023, Forbes estimated that Priem would be one of the world's richest people, with a personal fortune of $70 billion, if he hadn't sold off most of his Nvidia shares.

Go here to see the original:
How Nvidia co-founder plans to turn Hudson Valley into a tech powerhouse greater than Silicon Valley - New York Post

Exploring new frontiers with Fujitsu’s quantum computing research and development – Fujitsu

Fujitsu and RIKEN have already successfully developed a 64-qubit superconducting quantum computer at the RIKEN-RQC-Fujitsu Collaboration Center, which was jointly established by the two organizations (*1). Our interviewee, researcher Shingo Tokunaga, is currently participating in a joint research project with RIKEN. He majored in electronic engineering at university and worked on microwave-related research topics. After joining Fujitsu, he worked in a variety of software fields, including network firmware development as well as platform development for communication robots. Currently, he is applying his past experience in the Quantum Hardware Team at the Quantum Laboratory to embark on new challenges.

In what fields do you think quantum computing can be applied?

Shingo: Quantum computing has many potential applications, such as finance and healthcare, but especially in the quantum chemistry calculations used in drug development. If we can use it for these calculations, we can realize efficient and high-precision simulations in a short period of time. Complex calculations that traditionally take a long time to solve on conventional computers are expected to be solved quickly by quantum computers. One such example is finding solutions for combinatorial optimization problems such as molecular structure patterns. The spread of the novel coronavirus has made the development of vaccines and therapeutics urgent, and in such situations where rapid responses are needed, I believe the time will come when quantum computers can be utilized.

Fujitsu is collaborating with world-leading research institutions to advance research and development in all technology areas, from quantum devices to foundational software and applications, with the aim of realizing practical quantum computers. Additionally, we are also advancing the development of hybrid technologies (*2) for quantum computers and high-performance computing technologies, represented by the supercomputer Fugaku, which will be necessary for large-scale calculations until the full practicality of quantum computers is achieved.

What themes are you researching? What are your challenges and goals?

Shingo: One of the achievements of our collaborative research with RIKEN is the construction of a 64-qubit superconducting quantum computer. Superconducting quantum computers operate by manipulating quantum bits on quantum chips cooled to under 20 mK using ultra-low-temperature refrigerators, driving them with microwave signals of around 8 GHz, and reading out the state of the bits. However, since both bit operations and readouts are analog operations, errors are inherent. Our goal is to achieve higher fidelity in the control and readout of quantum bits, providing an environment where quantum algorithms can be executed with high computational accuracy, ultimately solving our customers' challenges.

What role do you play in the team?

Shingo: The Quantum Hardware Team consists of many members responsible for tasks such as designing quantum chips, improving semiconductor manufacturing processes, designing and constructing components inside refrigerators, and designing and constructing control devices outside refrigerators. I am responsible for building control devices and controlling quantum bits. While much attention is often given to the development of the main body of quantum computers or quantum chips, it is by controlling and reading quantum bits with high precision that we deliver the results of the development team to users, and that's my role.

How do you control quantum bits, and in what sequence or process?

Shingo: The first step is the basic evaluation of the quantum chip, followed by calibration for controlling the quantum bits. First, we receive the quantum chip from the manufacturing team and perform performance measurements. To evaluate the chip, it is placed inside the refrigerator; after closing the refrigerator's multilayered insulating cover, the inside is vacuumed and cooling begins. It usually takes about two days to cool from room temperature to 20 mK. In the basic evaluation, we confirm parameters such as the resonance frequency of the quantum bits and the coherence time called T1 (the time it takes an excited qubit to relax back to its ground state). Then, we perform calibration for quantum bit operations and readouts. Bit operations and readouts may not always yield the desired results, because there are interactions between the bits. The bit to be controlled may be affected by neighboring bits, so it is necessary to control based on the overall situation of the bits. Therefore, we investigate why the results did not meet expectations, consult with researchers at RIKEN, and make further efforts to minimize errors.
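The T1 evaluation step described above can be sketched numerically (synthetic data and assumed values, not Fujitsu's actual tooling): the excited-state population decays as exp(-t/T1), so T1 falls out of a straight-line fit to the log of the measured population versus delay.

```python
import numpy as np

# Hypothetical T1 extraction: fit the exponential decay of the
# excited-state population vs. readout delay.
rng = np.random.default_rng(1)
t = np.linspace(0, 200e-6, 40)          # delay times (s), assumed values
T1_true = 50e-6                         # "true" relaxation time (assumed)
p_excited = np.exp(-t / T1_true) + rng.normal(0, 0.01, t.size)

# A linear fit to log(population) gives -1/T1 as the slope.
mask = p_excited > 0.05                  # drop the noise-dominated tail
slope, _ = np.polyfit(t[mask], np.log(p_excited[mask]), 1)
T1_est = -1 / slope
print(f"T1 estimate: {T1_est * 1e6:.1f} us")
```

In practice each population point is itself an average over many repeated prepare-wait-measure cycles, which is part of why calibration is time-consuming.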

How do you approach the challenge of insufficient accuracy in bit operations and readouts?

Shingo: There are various approaches we can try, such as improving semiconductor processes, implementing noise reduction measures in control electronics, and changing the method of microwave signal irradiation. Our team conducts studies on the waveform, intensity, phase, and irradiation timing of the microwave signals necessary to improve the accuracy of quantum bit control. Initially, we try existing methods described in papers on our quantum chip and then work to improve accuracy further from there.

What other areas do you focus on or innovate in, outside of your main responsibilities? Can you also explain the reasons for this?

Shingo: I am actively advancing tasks that contribute to further improving the performance of quantum computer hardware. The performance of a newly created quantum chip can only be evaluated by cooling it in a refrigerator and conducting measurements. Based on these results, it is important to determine what is needed to improve the performance of quantum computer hardware and provide feedback to the quantum chip design and manufacturing teams.

For Fujitsu, the development of quantum computers marks a first-time challenge. Do you have any concerns?

Shingo: I believe that venturing into unknown territories is precisely where the value of a challenge lies, presenting opportunities for new discoveries and growth. Fujitsu is tackling quantum computer research and development by combining various technologies it has cultivated over the years. I aim to address challenges one by one and work towards achieving stable operation. Once stable operation is achieved, I hope to conduct research on new control methods.

What kinds of activities are you undertaking to accelerate your research on quantum computers?

Shingo: Quantum computing is an unknown field even for me, so I am advancing development while consulting with researchers at RIKEN, our collaborative research partner. I aim to build a relationship of give and take, so I actively strive to cooperate wherever I can contribute to RIKEN's research.

What is your outlook for future research?

Shingo: Ultimately, our goal is to utilize quantum computers to solve societal issues, but quantum computing is still in its early stages of development. I believe it is the urgent responsibility of our Quantum Hardware Team to provide application development teams with many qubits and quantum gates of high fidelity. In particular, improving the fidelity of two-qubit gate operations is a challenge in the field of control, and I aim to work on it. Additionally, I want to explore the development of a quantum platform that allows customers to maximize their utilization of quantum computers.

We use technology to make people's lives happier. As a result of this belief, we have created various technologies and contributed to the development of society and our customers. At the Fujitsu Technology Hall located in the Fujitsu Technology Park, you can visit mock-ups of Fujitsu's quantum computers, as well as experience the latest technologies such as AI.

Mock-up of a quantum computer exhibited at the Fujitsu Technology Hall

Read more:
Exploring new frontiers with Fujitsu's quantum computing research and development - Fujitsu

Microsoft says it’s cracked the code on an important quantum computing problem – The Verge

Microsoft says it's figured out how to improve error rates in quantum computing, bringing quantum computing closer to a commercial state.

The company worked in collaboration with quantum computing hardware maker Quantinuum to improve the performance of the qubit, the basic unit of quantum computing. Qubits work by holding two states at once (instead of just a one or a zero, it's both), but they aren't very stable, making it easy for them to lose data. Researchers can now create several logical qubits, qubits that remain stable while holding these states.

Krysta Svore, vice president of advanced quantum development at Microsoft, told The Verge in an interview that because qubits are prone to errors, researchers needed to find a way to stabilize them.

"We need reliable quantum computing, and not just in theory; we need to demonstrate that it can work in practice," Svore says. "I like to think of it as putting noise-cancelling headphones on the qubits."

She says these more reliable qubits help quantum computing graduate from level one, the foundational stage in which qubits are prone to mistakes and are usually referred to as "noisy," to the next level, where scientists can run more calculations correctly and scale up the technology for more commercial use.

Other quantum computing experts welcomed Microsoft and Quantinuum's advancement. Henry Yuen, associate professor of computer science at Columbia and a theoretical computer scientist, tells The Verge via email this may just be the beginning of more discoveries that make quantum computing easier.

"We're far from the final destination, but the signposts are getting more frequent and are indicating that some major milestones are coming up soon," Yuen says. "I'm sure there will be bigger and better demonstrations of quantum fault tolerance coming soon."

Microsoft brought its qubit-virtualization system, which Svore says abstracts groups of physical qubits together, to Quantinuum's quantum computer to create virtual logical qubits.

With it, users could create qubits with a longer fault tolerance, or time without encountering an error. The team created four reliable logical qubits from only 30 physical qubits. Previously, the scientific consensus was that hundreds of physical qubits were needed to make a couple of logical qubits that didn't fail, and they would have taken decades to create.

The teams ran 14,000 calculations without losing the quantum state and improved the error rate by a factor of 800 over physical qubits. Svore says the system could detect and fix errors without destroying the logical qubit, keeping the string of calculations going.
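Detecting and fixing errors without destroying the encoded state is the essence of syndrome measurement. A classical analogue gives the flavor (an illustration only, not Microsoft's qubit-virtualization system): a three-bit repetition code's two parity checks locate a single flip without ever reading the data bits themselves.

```python
# Classical analogue of non-destructive error detection: two parity
# checks (the "syndrome") identify which bit flipped without revealing
# the encoded value itself.
def syndrome(bits):
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

def correct(bits):
    s = syndrome(bits)
    # Each single-flip location produces a unique syndrome pattern.
    flip_at = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(s)
    if flip_at is not None:
        bits[flip_at] ^= 1
    return bits

assert correct([1, 0, 0]) == [0, 0, 0]   # flip on bit 0 found and fixed
assert correct([0, 0, 1]) == [0, 0, 0]   # flip on bit 2 found and fixed
assert correct([1, 1, 1]) == [1, 1, 1]   # clean codeword left untouched
```

Quantum codes extend this idea with ancilla qubits so the parity checks can be measured without collapsing the logical state, which is what lets a long string of calculations survive intermittent errors.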

Microsoft is now figuring out how to bring this capability to Azure Quantum Elements, its platform for scientists to use AI, high-performance computing, and quantum computing to run scientific experiments.

Yuen says that while he thinks the term "quantum virtualization" may be Microsoft's branding for error-correcting code, its findings could be scalable for other quantum computing companies to try on their own.

Quantum computing has always seemed like a far-off innovation, despite the idea and experimentation being around for decades. Companies such as IBM, Microsoft, and Google have been trying for years to make quantum computing reliable, safe, cost-effective, and, more importantly, useful.

Quantinuum chief product officer Ilyas Khan and senior director of offering management Jenni Strabley said in a blog post that they plan to continue improving the system to create more reliable logical qubits.

"In the short term, with a hybrid supercomputer powered by a hundred reliable logical qubits, we believe that organizations will be able to start to see scientific advantages and will be able to accelerate valuable progress toward some of the most important problems that mankind faces, such as modeling the materials used in batteries and hydrogen fuel cells or accelerating the development of meaning-aware AI language models," Quantinuum said in its post.

Now, with Microsoft and Quantinuum's work, it's up to others to see if they can replicate the same thing.

See more here:
Microsoft says it's cracked the code on an important quantum computing problem - The Verge

Glimpse of next-generation internet – Harvard Office of Technology Development

May 20th, 2024

By Anne Manning, Harvard Staff Writer Published in the Harvard Gazette

A close-up photo of the diamond silicon-vacancy center.

It's one thing to dream up a next-generation quantum internet capable of sending highly complex, hacker-proof information around the world at ultra-fast speeds. It's quite another to physically show it's possible.

That's exactly what Harvard physicists have done, using existing Boston-area telecommunication fiber, in a demonstration of the world's longest fiber distance between two quantum memory nodes. Think of it as a simple, closed internet carrying a signal encoded not by classical bits like the existing internet, but by perfectly secure, individual particles of light.

The groundbreaking work, published in Nature, was led by Mikhail Lukin, the Joshua and Beth Friedman University Professor in the Department of Physics, in collaboration with Harvard professors Marko Lončar and Hongkun Park, who are all members of the Harvard Quantum Initiative. The Nature work was carried out with researchers at Amazon Web Services.

The Harvard team established the practical makings of the first quantum internet by entangling two quantum memory nodes separated by an optical fiber link deployed over a roughly 22-mile loop through Cambridge, Somerville, Watertown, and Boston. The two nodes were located a floor apart in Harvard's Laboratory for Integrated Science and Engineering.


Quantum memory, analogous to classical computer memory, is an important component of a quantum computing future because it allows for complex network operations and information storage and retrieval. While other quantum networks have been created in the past, the Harvard team's is the longest fiber network between devices that can store, process, and move information.

Each node is a very small quantum computer, made out of a sliver of diamond that has a defect in its atomic structure called a silicon-vacancy center. Inside the diamond, carved structures smaller than a hundredth the width of a human hair enhance the interaction between the silicon-vacancy center and light.

The silicon-vacancy center contains two qubits, or bits of quantum information: one in the form of an electron spin used for communication, and the other in a longer-lived nuclear spin used as a memory qubit to store entanglement, the quantum-mechanical property that allows information to be perfectly correlated across any distance.

(In classical computing, information is stored and transmitted as a series of discrete binary signals, say on/off, that form a kind of decision tree. Quantum computing is more fluid, as information can exist in stages between on and off, and is stored and transferred as shifting patterns of particle movement across two entangled points.)
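The "stages between on and off" can be made concrete with a two-line amplitude sketch (plain NumPy, purely illustrative): a qubit carries two complex amplitudes, and measurement yields 0 or 1 with probabilities given by their squared magnitudes.

```python
import numpy as np

theta = np.pi / 3                      # an arbitrary "in-between" angle
state = np.array([np.cos(theta / 2),   # amplitude of |0> ("off")
                  np.sin(theta / 2)])  # amplitude of |1> ("on")

probs = np.abs(state) ** 2             # Born rule: measurement probabilities
print(probs)                           # [0.75, 0.25] for theta = pi/3
```

Entanglement extends this picture: two such qubits share a joint set of amplitudes, so measuring one fixes the statistics of the other, no matter the distance between nodes.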

Map showing path of two-node quantum network through Boston and Cambridge. Credit: Can Knaut via OpenStreetMap

Using silicon-vacancy centers as quantum memory devices for single photons has been a multiyear research program at Harvard. The technology solves a major problem in the theorized quantum internet: signal loss that can't be boosted in traditional ways.

A quantum network cannot use standard optical-fiber signal repeaters, because simple copying of quantum information as discrete bits is impossible, which makes the information secure but also very hard to transport over long distances.

Silicon-vacancy-center-based network nodes can catch, store, and entangle bits of quantum information while correcting for signal loss. After the nodes are cooled to close to absolute zero, light is sent through the first node and, by nature of the silicon-vacancy center's atomic structure, becomes entangled with it, and so is able to carry the information.

"Since the light is already entangled with the first node, it can transfer this entanglement to the second node," explained first author Can Knaut, a Kenneth C. Griffin Graduate School of Arts and Sciences student in Lukin's lab. "We call this photon-mediated entanglement."

Over the last several years, the researchers have leased optical fiber from a company in Boston to run their experiments, fitting their demonstration network on top of the existing fiber to indicate that creating a quantum internet with similar network lines would be possible.

"Showing that quantum network nodes can be entangled in the real-world environment of a very busy urban area is an important step toward practical networking between quantum computers," Lukin said.

A two-node quantum network is only the beginning. The researchers are working diligently to extend the performance of their network by adding nodes and experimenting with more networking protocols.

The paper is titled "Entanglement of Nanophotonic Quantum Memory Nodes in a Telecom Network." The work was supported by the AWS Center for Quantum Networking's research alliance with the Harvard Quantum Initiative, the National Science Foundation, the Center for Ultracold Atoms (an NSF Physics Frontiers Center), the Center for Quantum Networks (an NSF Engineering Research Center), the Air Force Office of Scientific Research, and other sources.

Harvard Office of Technology Development enabled the strategic alliance between Harvard University and Amazon Web Services (AWS) to advance fundamental research and innovation in quantum networking.

Tags: Alliances, Collaborations, Quantum Physics, Internet, Publication

Press Contact: Kirsten Mabry | (617) 495-4157

See the rest here:
Glimpse of next-generation internet - Harvard Office of Technology Development

D-Wave’s Qubits 2024 Quantum Computing Conference Announced for June 17-18 in Boston – HPCwire

PALO ALTO, Calif., April 9, 2024 D-Wave Quantum Inc., a leader in quantum computing systems, software, and services and the world's first commercial supplier of quantum computers, today announced that its Qubits 2024 quantum computing conference will take place in Boston on June 17 and 18, 2024.

Themed "Success, Powered by Quantum," the conference will demonstrate how D-Wave, partners, and customers such as Momentum Worldwide (part of Interpublic Group), Los Alamos National Lab, Zapata AI and others are achieving tangible outcomes with D-Wave's innovative annealing quantum computing technology.

The two-day conference will focus on the impact of D-Wave's quantum-powered technologies as they address highly complex problems in areas such as supply chain logistics, manufacturing, government, and life sciences. It will be packed with demonstrations of new quantum solutions, product updates, the company's latest scientific accomplishments, and customer applications both in development and in production.

"We are thrilled to bring Qubits to the global innovation hub of Boston this year, where we will share the incredible momentum we're seeing as our quantum technologies cross the chasm from experimentation to operational use," said Dr. Alan Baratz, CEO of D-Wave. "This is the must-attend event of the year for anyone looking to understand how today's quantum technology is transforming business, especially as it merges with AI to fuel the next generation of groundbreaking applications."

For those unable to attend in person, D-Wave will offer a free livestream of the first day's morning talks, allowing participants worldwide to engage with the conference content virtually.

To register for either the live or virtual event, visit: http://www.qubits.com.

About D-Wave Quantum Inc.

D-Wave is a leader in the development and delivery of quantum computing systems, software, and services, and is the world's first commercial supplier of quantum computers and the only company building both annealing quantum computers and gate-model quantum computers. Our mission is to unlock the power of quantum computing today to benefit business and society. We do this by delivering customer value with practical quantum applications for problems as diverse as logistics, artificial intelligence, materials sciences, drug discovery, scheduling, cybersecurity, fault detection, and financial modeling. D-Wave's technology has been used by some of the world's most advanced organizations, including Mastercard, Deloitte, Davidson Technologies, ArcelorMittal, Siemens Healthineers, NEC Corporation, Pattison Food Group Ltd., DENSO, Lockheed Martin, Forschungszentrum Jülich, University of Southern California, and Los Alamos National Laboratory.

Source: D-Wave

See more here:
D-Wave's Qubits 2024 Quantum Computing Conference Announced for June 17-18 in Boston - HPCwire

The 3 Best Quantum Computing Stocks to Buy in April 2024 – InvestorPlace

Source: Bartlomiej K. Wroblewski / Shutterstock.com

Quantum computing will bring about the next computing revolution, one that could overshadow the current artificial intelligence (AI) craze. There are certain kinds of problems that are effectively impossible or prohibitively inefficient for conventional, classical computers to solve, but not for quantum computers. This has led many investors to seek out the best quantum computing stocks to buy.

Novel quantum computers could be a game-changer for current cryptography methods and could also allow for the introduction of completely private communication. Complex problems in optimization, machine learning and simulation will also become solvable with quantum computing.

Investors who are already looking for the next market sensation are considering a few names in the nascent quantum computing space. Wall Street has caught wind of some of the quantum computing names that could make successful plays in the long term. Below are three such quantum computing stocks.

Source: JHVEPhoto / Shutterstock.com

International Business Machines (NYSE:IBM), one of the most established companies in the tech industry, has been working on quantum computers since the early 2000s. As early as 2001, IBM researchers were applying quantum computing techniques to cryptography problems. IBM's quantum computer consists of superconducting qubits that operate at near-zero temperatures. The tech giant also offers a cloud-based quantum computing service called IBM Quantum Experience, which allows customers and researchers to access its quantum hardware and software through the cloud rather than spending a lot of cash to buy a physical quantum computer.

In recent years, IBM has endured single-digit revenue growth, including in 2023, but the tech giant continued to beat estimates in its recent Q4 2023 earnings report. Both revenue and earnings came in above Wall Street analysts' projections, and IBM also found itself with more free cash flow than it had anticipated. More breakthroughs in quantum computing could spur revenue growth in the future. IBM does not expect to have a practical quantum computer until the end of the decade, which makes it a compelling long-term hold.

Source: Amin Van / Shutterstock.com

IonQ (NYSE:IONQ) happens to be the first pure play among publicly traded quantum computing stocks, and it will be the only pure-play quantum computing player to make this list. The company is a leader in trapped-ion quantum computing, which uses electrically charged atoms to store and manipulate qubits.

To date, the company claims to have built the world's most powerful quantum computer, which has achieved a quantum capacity of 32 qubits. IonQ planned to launch modular quantum computers by the end of 2023. To make the computing power of its quantum computers more accessible, IonQ has made it available to customers and developers through the large cloud platforms.

IonQ ended 2023 with another successful quarter. Fourth-quarter earnings results saw the quantum computing firm generate full-year revenue well above the high end of its guidance range, and the same was true for bookings. IonQ also announced the production of its Enterprise Forte quantum computer at its Seattle manufacturing facility. Deliveries of these quantum systems are slated for the end of 2024.

IonQ's shares are down almost 25% on a year-to-date basis, which could make for a good entry point for new investors or for those willing to increase their investment. Quantum computing, like generative AI, has the potential to be the next big thing in technology, and IonQ is at the forefront of the space. You can see why it made our list of the best quantum computing stocks to buy.

Source: salarko / Shutterstock.com

Flush with cash, Alphabet (NASDAQ:GOOG, GOOGL) has made a variety of investments in computing technologies over the past two decades. The company has been developing quantum computers since 2006 and achieved a milestone in 2019 when it demonstrated quantum supremacy: the ability of a quantum computer to perform a task that would be impractical for a classical computer.

Google's quantum computer, called Sycamore, used 54 qubits to perform a calculation in 200 seconds that would take a supercomputer far longer to complete. Although some researchers have claimed to be able to match what Google's Sycamore did using a conventional supercomputer, Google is continuously working on improving its quantum hardware, software and algorithms. And the results are promising: the new version of Sycamore can reportedly perform calculations that would take supercomputers 47 years to complete.

In order to bring about more use cases for quantum computing, Google has launched a three-year competition with a $5 million prize for researchers who can come up with new quantum algorithms that solve existing problems humanity faces. This kind of investment could help steer the new sector in the right direction.

In its Q4 2023 earnings report, cloud continued to be the company's growth engine, growing 26% year over year (YOY). In the long term, quantum computing could be an even bigger growth engine for Google. If you are looking for the best quantum computing stocks to buy, start here.

On the date of publication, Tyrik Torres did not have (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed in this article are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.

Tyrik Torres has been studying and participating in financial markets since he was in college, and he has a particular passion for helping people understand complex systems. His areas of expertise are semiconductor and enterprise software equities. He has work experience in both investing (public and private markets) and investment banking.

Read the original post:
The 3 Best Quantum Computing Stocks to Buy in April 2024 - InvestorPlace

Microsoft makes major quantum computing breakthrough development of most stable qubits might actually make the … – TechRadar

Unlike traditional computing that uses binary bits, quantum computing uses quantum bits or 'qubits', enabling simultaneous processing of vast amounts of data, potentially solving complex problems much faster than conventional computers.

In a major step forward for quantum computing, Microsoft and Quantinuum have unveiled the most reliable logical qubits to date, boasting an error rate 800 times lower than physical qubits.

This groundbreaking achievement involved running over 14,000 individual experiments without a single error, which could make quantum computing a viable technology for various industries.

Microsoft says the successful demonstration was made possible by applying its innovative qubit-virtualization system (coupled with error diagnostics and correction) to Quantinuum's ion-trap hardware. Jason Zander, EVP of Strategic Missions and Technologies at Microsoft, says, "This finally moves us out of the current noisy intermediate-scale quantum (NISQ) level to Level 2 Resilient quantum computing."

The potential of this advancement is enormous. As Zander says, "With a hybrid supercomputer powered by 100 reliable logical qubits, organizations would start to see the scientific advantage, while scaling closer to 1,000 reliable logical qubits would unlock commercial advantage."

Quantum computing holds enormous promise for solving some of society's most daunting challenges, including climate change, food shortages, and the energy crisis. These issues often boil down to complex chemistry and materials science problems, which classical computing struggles to handle but which would be far easier for quantum computers to manage.

The task now, Microsoft says, is to continue improving the fidelity of qubits and enable fault-tolerant quantum computing. This will involve transitioning to reliable logical qubits, a feat achieved by merging multiple physical qubits to protect against noise and sustain resilient computation.

While the technology's potential is immense, its widespread adoption will depend on its accessibility and cost-effectiveness. For now, though, Microsoft and Quantinuum's breakthrough marks a significant step towards making quantum computing a practical reality.

Go here to see the original:
Microsoft makes major quantum computing breakthrough development of most stable qubits might actually make the ... - TechRadar

Microsoft and Quantinuum report a way to turn down the noise in quantum computing – GeekWire

Quantinuum scientists make adjustments to a beam-line array used to deliver laser pulses in quantum computers. (Quantinuum Photo)

Microsoft and Quantinuum say they've demonstrated a quantum computing system that can reduce the error rate for data processing by a factor of 800.

"Today signifies a major achievement for the entire quantum ecosystem," Jason Zander, Microsoft's executive vice president for strategic missions and technologies, said in a blog posting about the achievement.

Quantum computing could solve certain types of problems, ranging from data encryption and system optimization to the development of new synthetic materials, on a time scale that would be unachievable using classical computers. "Scaled quantum computers would offer the ability to simulate the interactions of molecules and atoms at the quantum level beyond the reach of classical computers, unlocking solutions that can be a catalyst for positive change in our world," Zander said.

The secret to success lies in quantum bits, or qubits, that can represent multiple values until the results of a computation are read out. Qubits typically make use of exotic materials, such as superconducting circuits, diamonds with defects or laser-cooled ions.

One big challenge is that qubits tend to be noisy, that is, susceptible to perturbations that introduce errors. For years, researchers have been hunting for ways to maintain the fidelity of qubits and correct any errors that arise. Such strategies typically involve linking up multiple physical qubits to represent a single logical qubit.

Just a couple of years ago, Microsoft researchers were saying that a quantum computer would need at least a million physical qubits in order to demonstrate an advantage over classical computers. But that's because it was thought that thousands of physical qubits would be required to produce a single logical qubit. If fewer physical qubits are required for error correction, that would make it easier to build useful quantum computers.

The newly reported demonstration addresses that challenge: Microsoft and Quantinuum said they created four highly reliable logical qubits from just 30 physical qubits.
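The arithmetic behind why this matters is straightforward. A rough sketch (the 30-and-4 figures are from the demonstration; the roughly 1,000-to-1 overhead is the earlier estimate mentioned above):

```python
# Physical-to-logical qubit overhead in the Microsoft/Quantinuum demonstration
physical_qubits = 30
logical_qubits = 4

overhead = physical_qubits / logical_qubits  # physical qubits per logical qubit
print(overhead)  # 7.5

# Under the earlier assumption of ~1,000 physical qubits per logical qubit,
# the same four logical qubits would have required:
old_estimate = logical_qubits * 1000
print(old_estimate)  # 4000
```

At 7.5 physical qubits per logical qubit instead of thousands, a machine with a given number of physical qubits yields far more usable logical qubits, which is why the result shortens the road to useful quantum computers.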

"With this system, we ran more than 14,000 individual experiments without an error," Zander said.

In a technical blog posting, Microsoft's Dennis Tom and Krysta Svore wrote that they used a qubit-virtualization system to improve the reliability of Quantinuum's ion-trap hardware by a factor of 800. Tom is general manager of Azure Quantum, and Svore is Microsoft's vice president of advanced quantum development.

"An 800x improvement in error rate corresponds to a 29 dB improvement of signal, which is the same as that achieved with a high-quality noise-canceling headset," Tom and Svore said.

The comparison is particularly apt: "Activating the noise-canceling function on the headphones to listen to music, while removing most of the environmental noise, is akin to applying our qubit-virtualization system," the researchers said.
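The quoted decibel figure is just the standard power-ratio conversion, dB = 10 · log10(ratio), applied to the 800x improvement. A quick check:

```python
import math

# Ratio between physical-qubit and logical-qubit error rates
# reported by Microsoft and Quantinuum.
improvement = 800

# Decibels for a power ratio: dB = 10 * log10(ratio)
db = 10 * math.log10(improvement)

print(round(db, 1))  # 29.0, matching the ~29 dB figure in the blog post
```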

Microsoft takes a hybrid approach to cloud-based computing, which combines the strengths of classical supercomputing and quantum processing. Zander said the application of Microsoft's qubit-virtualization system "moves us out of the current noisy intermediate-scale quantum (NISQ) level to Level 2 Resilient quantum computing."

"Advanced capabilities based on these logical qubits will be available in private preview for Azure Quantum Elements customers in the coming months," he said.

Microsoft is already looking ahead to the next level.

"At Level 3, we expect to be able to solve some of our most challenging problems, particularly in fields like chemistry and materials science, unlocking new applications that bring quantum at scale together with the best of classical supercomputing and AI, all connected in the Azure Quantum cloud," Zander said.

Microsoft isn't the only tech company reporting progress on the quantum frontier.

See the original post:
Microsoft and Quantinuum report a way to turn down the noise in quantum computing - GeekWire

Quantum rush: Denver-Boulder area aims to be the Silicon Valley of the future – NBC 6 South Florida

An Atom Computing employee engaged in work on a computer screen.

This story is part of CNBC's quarterly Cities of Success series, which explores cities that have transformed into business hubs with an entrepreneurial spirit that has attracted capital, companies and employees.

Imagine a world where computers solve problems billions of times faster than today's machines can, ushering in a new era of scientific discovery.

That's the promise of quantum technology, and a fierce race is underway to unlock its potential. In the shadow of the Rocky Mountains, the Denver-Boulder region is emerging as a global leader in this revolution.

Atom Computing is based in the San Francisco area, but CEO Rob Hays told CNBC in a recent interview why his quantum company chose the city of Boulder for its new $100 million facility: the region's thriving ecosystem.

"The future looks really bright for us here. We've built two of the largest quantum computers on the planet," Hays said in CNBC's primetime special "Cities of Success: Denver & Boulder," which airs April 11 at 10 p.m. ET. "The fact that we've been able to do that in 18 months is pretty remarkable."

In Denver, Maybell Quantum, another key player in the industry, is building a super refrigerator that chills atoms to extreme temperatures three times colder than the coldest part of Antarctica.

"It's 10 millikelvin," said Maybell Quantum CEO Corban Tillemann-Dick. That equates to roughly negative 459.65 degrees Fahrenheit.

Why so cold? The frigid conditions are essential for quantum computers to operate. The supercooled environment helps minimize even the tiniest vibrations that can disrupt a quantum chip's delicate subatomic calculations.
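As a sanity check on the temperature claim, the standard Kelvin-to-Fahrenheit conversion (°F = K × 9/5 − 459.67) puts 10 millikelvin a hair above absolute zero:

```python
def kelvin_to_fahrenheit(k):
    # Standard conversion: degrees F = kelvin * 9/5 - 459.67
    return k * 9 / 5 - 459.67

# 10 millikelvin = 0.010 K
print(round(kelvin_to_fahrenheit(0.010), 2))  # -459.65

# Note: converting 10 whole kelvin instead gives -441.67 F,
# a figure sometimes seen in coverage of this refrigerator.
print(round(kelvin_to_fahrenheit(10), 2))  # -441.67
```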

Just like semiconductors fueled powerful computers and networking devices that made today's complex internet possible, Tillemann-Dick said the next big thing could be quantum technology.

"This technology is going to be as important to the next 100 years as semiconductors [were to] the internet or cellular technology. It'll transform everything from medicine to defense to agriculture," he said.

The CEO said he envisions data centers filled with rows of quantum computers tackling the world's most pressing problems.

"There will come a time not too far in the future you will walk into a data center and there will be thousands of [quantum computers] lined up just like you have servers today, working on workloads from all over the world to solve these critical problems," he said.

Physicist Richard Feynman is credited with pioneering the idea of quantum computing in the 1980s. It's come a long way since then. According to McKinsey, the four industries poised to see the biggest boost from quantum computing (automotive, chemicals, financial services and life sciences) are expected to reach $1.3 trillion in value by 2035.

Helping Colorado in the boom, the Biden-Harris administration recently designated the Denver-Aurora region as one of 31 "Tech Hubs" in the United States. This designation is part of a program to invest in regions with high potential for growth in key technology sectors.

Leading the charge to solidify Colorado's position as a quantum leader is Elevate Quantum Colorado, a private-public consortium of more than 100 organizations including the University of Colorado Boulder and other higher education institutions, state and local governments, federal labs and private companies.

"The idea is to create Silicon Valleys where there aren't Silicon Valleys today against the most important technologies of our time," said Zachary Yerushalmi, Elevate Quantum Colorado's CEO.

Yerushalmi noted that federal designation positions the state to become one of only a handful of quantum hubs nationwide.

"We competed against 400 applicants across the nation, and we're fortunate to be selected as one of three," Yerushalmi explained. "This is where things really get hot: we're competing for $70 million from the federal government."

Only a handful of hubs will be selected to receive the funding, and Yerushalmi says he's optimistic about their chances, expecting a decision later this year.

Meanwhile, Colorado Gov. Jared Polis, a firm believer in quantum's potential, is upping the stakes. In February, his administration unveiled plans to invest an additional $74 million into the quantum industry over five to nine years if Colorado is one of the regions selected to receive federal funding.

"I'm bullish on quantum tech," Polis told CNBC in a recent interview. "I think its time has come."

More:
Quantum rush: Denver-Boulder area aims to be the Silicon Valley of the future - NBC 6 South Florida

The 3 Most Undervalued Quantum Computing Stocks to Buy in April 2024 – InvestorPlace

The quantum computing industry is still in its relatively early stages, which means many companies in this space could be trading at attractive valuations compared to their long-term growth potential. This provides opportunities for the most undervalued quantum computing stocks to buy in April.

One reason for undervaluation is the high degree of uncertainty surrounding the timeline for widespread commercialization and adoption of quantum computing. Some big brands, such as Microsoft (NASDAQ:MSFT) and IBM (NYSE:IBM), might not be considered undervalued given that they are blue-chips in their own right. Investors should explore more speculative names for the best bargains.

Additionally, many prominent quantum computing players are still pre-revenue or in the early stages of generating meaningful sales. Traditional valuation metrics like price-to-earnings may not yet be applicable, leaving investors to rely more on future growth projections, which can be difficult to assess accurately.

However, the most undervalued quantum computing stocks to buy in April could be long term winners. Here are three companies to consider.

IonQ (NYSE:IONQ) focuses exclusively on quantum computing, offering quantum computing systems across major public cloud services. With a market cap of around $1.9 billion, it's small enough to ride the ups and downs of the market while still being robust enough to withstand volatility. It also means there could be plenty of upside, which may make it undervalued.

For 2024, IONQ has set its revenue expectations between $37 million and $41 million, with bookings projected to range from $70 million to $90 million. However, the company anticipates an adjusted EBITDA loss of approximately $110.5 million. That loss helps explain why the stock trades at a discount, but in the long term its prospects are attractive.

In terms of valuation, it trades at a forward price-to-sales multiple of about 47. Still, this is relatively low compared to analysts' long-term revenue growth projections, which is another hint that it trades below its intrinsic value.
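That multiple can be roughly reconstructed from the figures above: forward price-to-sales is simply market cap divided by expected revenue. A back-of-the-envelope check, using the ~$1.9 billion market cap and the midpoint of the 2024 revenue guidance:

```python
market_cap = 1.9e9       # IonQ market cap cited in the article (~$1.9B)
forward_revenue = 39e6   # midpoint of the $37M-$41M 2024 guidance

forward_ps = market_cap / forward_revenue
print(round(forward_ps, 1))  # ~48.7
```

The result lands in the same ballpark as the ~47x quoted; small shifts in the market cap or the revenue estimate used move the multiple by a few points.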

Rigetti Computing (NASDAQ:RGTI) specializes in developing quantum integrated circuits and a cloud platform for quantum algorithms.

RGTI could be one of the most undervalued quantum computing stocks on this list, as its market cap is just $219 million at the time of writing, so there's substantial room for it to head higher. Its forward price-to-sales ratio of 13 underlines this undervalued nature.

Financially, there's also some evidence that RGTI could be undervalued and that the company's best is yet to come.

In the fourth quarter of 2023, RGTI reported revenues of $3.4 million, a decrease from $6.1 million in 2022. The gross margin stood at 75%, declining from 87% in the fourth quarter of 2022. The net loss for Q4 2023 improved to $12.6 million, or $0.09 per share, from a net loss of $22.9 million, or $0.19 per share.

Source: Tada Images / Shutterstock.com

Amazon (NASDAQ:AMZN), with its AWS Braket service, provides a platform for experimenting with quantum computing. I think AMZN could be one of the most undervalued FAANG stocks and one of the most underappreciated players in the quantum computing industry.

AWS Braket is designed to speed up scientific research and software development for quantum computing. It particularly stands out with the launch of IonQ Aria, the first Quantum Processing Unit (QPU) on Braket to feature built-in error mitigation techniques.

On the financial front, Amazon demonstrated robust performance in the fourth quarter, reporting record operating profits of $13.2 billion, a substantial increase from the previous year's $2.7 billion. Amazon's revenue also surged 14% year-over-year to $169.9 billion.

With many developers already familiar with AWS tools and certifications, Amazon has a significant leg up over its competitors, making it a strong contender for the top spot in the quantum computing industry.

On the date of publication, Matthew Farley did not have (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.

Matthew started writing coverage of the financial markets during the crypto boom of 2017 and was also a team member of several fintech startups. He then started writing about Australian and U.S. equities for various publications. His work has appeared in MarketBeat, FXStreet, Cryptoslate, Seeking Alpha, and the New Scientist magazine, among others.

Go here to see the original:
The 3 Most Undervalued Quantum Computing Stocks to Buy in April 2024 - InvestorPlace

Dr Chris Ballance, quantum computing's up-and-coming star – University of Oxford

Young Chris Ballance was something of an engineering menace, always obsessed with finding out how things work. From the age of six, he was using screwdrivers to take apart toys that didn't work and try to put them back together. This insatiable appetite for engineering and discovery has been a thread throughout his life.

When he pursued physics in his undergraduate studies, the field of quantum computing scratched an itch for Ballance, because it was something truly novel that had the promise to actually make a difference. "Something that in a few years can go from a glimmer of hope all the way through to defining the state of the art, something that nobody else has done before: I found that incredibly exciting."

Since obtaining his PhD at Oxford in 2014, Ballance has been at the forefront of developing new techniques and technologies to manipulate qubits at sufficient scale to build useful quantum computers. He hasn't stopped pushing the boundaries of quantum computing during his research, setting new world records, including the highest-performance quantum logic gates, the longest qubit memory coherence time, and the fastest and highest-performance quantum network.

Intriguingly, it was always clear to Dr Ballance that at some point his work would evolve into a spin-out company. "Even though I couldn't have vocalised that at that point, I knew that success for me wouldn't be just sitting in a lab thinking this could be incredibly exciting. I knew I would want to follow the work all the way through to making an impact on people's lives."

In 2019, Dr Ballance co-founded his company Oxford Ionics with his colleague of many years, Dr Tom Harty. Together, they had been working at the forefront of quantum computing for almost a decade at Oxford University Physics, where they both earned their PhDs, and where Dr Ballance retains a lead research role pushing new boundaries in one of the most exciting areas of physics and innovation.

"The magic of the techniques we've developed allows us to marry the ability to build out large-scale chips, whilst being able to trap and control the individual atoms in a perfectly quantum way."

Dr Chris Ballance

Before you even get down to the technical details, there is one fundamental challenge with quantum computing. As Dr Ballance explains quite simply: "Nature doesn't like to be quantum."

Most have heard of Schrödinger's cat, who lives in a box and is famously both dead and alive until we open the box and check. However, these seemingly absurd quantum phenomena are never seen in real life. Cats are very firmly either dead or alive, not both.

Dr Ballance says: "When you're building a quantum computer, you're really trying to build Schrödinger's cat atom by atom, and maintain it in a quantum state."

The unique power of quantum computing is that its fundamental building blocks, the qubits, can harness these quantum superpositions and be in multiple states at once. Classical computer bits, on the other hand, are distinctly either a zero or a one.
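The bit-versus-qubit distinction can be sketched numerically: a qubit's state assigns an amplitude to each of the basis states |0⟩ and |1⟩, and measurement probabilities are the squared magnitudes of those amplitudes. A minimal illustration in plain Python (no quantum library, just the textbook maths):

```python
import math

# A classical bit is exactly 0 or 1. A qubit in an equal superposition
# carries amplitude 1/sqrt(2) on each of |0> and |1>.
amp0 = 1 / math.sqrt(2)
amp1 = 1 / math.sqrt(2)

# On measurement, each outcome occurs with probability |amplitude|^2,
# and the probabilities must sum to 1.
p0, p1 = amp0 ** 2, amp1 ** 2
print(round(p0, 2), round(p1, 2))  # 0.5 0.5
```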

Dr Ballance explains: "The magic of the techniques we've developed allows us to marry the ability to build out large-scale chips, whilst being able to trap and control the individual atoms in a perfectly quantum way."

The quantum states are so well controlled that they have a coherence time of minutes before they collapse, compared to other technologies that only achieve micro- or milliseconds. This is essential if these states are to last long enough to be useful to us, for instance in solving problems. As Dr Ballance says: "With this approach, you can put the system in a quantum superposition state, go and have a cup of tea and come back, and after 10 minutes or more they are still there."

"It is tremendously exciting to build the workplace of one's dreams. We have created a culture that is based around allowing people to be very flexible and achieve their best work."

Dr Chris Ballance

When it comes to the business side of running a tech company, Dr Ballance admits: "It is a massive learning experience to go from making something out of chewing gum and toothpicks that looks the part and inspires you, to making reliable, robust building blocks you actually build a company out of."

Fortunately, Oxford Ionics' mission of building the world's best quantum computers is an incredibly powerful attractor, such that the company now has "a collection of some of the best people around the world on this."

The team of around 50 is set to grow to more than 80 by the end of the year. That includes scientific experts on the foundational theory, people who have built the world's best chips, and software engineers, not to mention those with expertise in business, finance, and marketing.

"Our view at Oxford Ionics is always that the best perk you can possibly have working in this space is the amazing, inspirational people around you," Dr Ballance maintains. "If you have that, then you don't need anything else."

2019 was a significant year for Dr Ballance: as well as founding Oxford Ionics, he was also appointed as the Future Leaders Fellow in the Department of Physics. When asked how he juggles these two roles, Dr Ballance argues that they are two sides of the same coin. "You can't do one without the other. It is a privilege to be in a position where I can do both."

I did ask Dr Ballance what he likes to get up to outside the lab, but it was bold of me to assume he has any free time. "I have three children, so at the moment my time is spent chasing them around swimming pools and parks and up trees," he chuckles.

In a beautiful circle-of-life moment, Dr Ballance is now in his own father's shoes. "My father used to have to check under my bed for cogs and other pieces of toys, and then try and work out where they had come from. I find myself having to do the same with my children, and only allow them access to screwdrivers under supervision." Chip off the old block.

The world of quantum computing is very new and exciting, and entirely foreign to most of us. The big thing we are all curious to find out is what quantum computers can actually do, and how they will affect our lives. Dr Ballance remains humbly but delightedly ignorant.

"As with all forms of new technology and computing, what we have seen time and time again is that the killer application is not one you've anticipated," he admits.

"Probably the most valuable applications of quantum computing are the ones that we haven't come across yet. So, the thing I am most looking forward to is giving people access to these new forms of computer and seeing what they can do with them."

Dr Chris Ballance

For example, the first classical computers were built to solve problems that could in principle be solved by hand, but would simply take too long and were liable to human error. This is a far cry from where computing is now, with internet banking, animated films, and social media: applications no one could have ever predicted back in the 20th century.

The same is true for quantum computing. We already have a list of things we think quantum computers will allow us to do, from materials discovery and drug development to better aerodynamic modelling or financial portfolio optimisation. But this might be just the tip of the iceberg.

Dr Ballance theorises: "Probably the most valuable applications are the ones that we haven't come across yet, but will come with the second and third revolutions. So, the thing I am most looking forward to is giving people access to these new forms of computer and seeing what they can do with them."

Beyond Oxford Ionics, Dr Ballance thinks that the UK is in a well-considered position. "Our country was one of the first to set up a national quantum strategy way back in 2014, which has since set an example for the EU and the US."

"Now the UK has started properly investing, there is a wonderful crop of fledgling quantum companies like ours," he explains, animatedly. "The question is whether the technology in 5-10 years' time stays in the UK or if, like many other technologies, it ends up getting disseminated across lots of other countries. The UK's investment in quantum is great, and it needs to be done with sufficient conviction to make sure it continues."

Quantum computing is already starting to take off internationally as well. Dr Ballance and his colleagues regularly attend international summits which are increasingly attracting more than just researchers. Big Pharma companies and world-leading banks are often present too, keen to come and ascertain the benefits that quantum computing could bring to them.

"One of the great things about being a scientist is going around and telling everyone all the amazing work you are doing," he grins. "It is really wonderful to watch the field grow and have more and more people brought in."

"When it comes to quantum computing, the difficulties of working out how the different pieces integrate together are good old-fashioned engineering challenges that can be solved with good old-fashioned engineering techniques."

Dr Chris Ballance

In 1991, when Dr Ballance was just a child, the first ideas of quantum algorithms were beginning to be explored at Oxford. Then in 2010, when he began his PhD, the science was ready for Dr Ballance and his team to generate the highest-performing qubits and the best entanglement of any physical system, achieving error rates low enough to solve practical problems. And now, the systems have been so well iterated, developed, and refined that he can build chips with routinely high performance.

"It has all snowballed from a few small research grants for a few small bits of weird theory, 40 years before the impact was really felt," he says.

This idea of blue-sky research is a story we see playing out time and time again. Stuff that seemed completely out there 20 years ago eventually translates into cool experimental science, which in another 20 years transforms into fully fledged companies and industries.

He highlights the vital importance of early-stage funding to get these ideas off the ground and generate these industries. "There's no way of skipping that long-term investment if we want pioneers of new technology to get their ideas into the world."

It is immensely gratifying for Dr Ballance to see the work that he has believed in for the last 15 years reach an inflection point and begin to make a tangible difference. He believes the phrase "it's an overnight success that took 10 years" is definitely applicable.

A tremendous amount of blue-sky research over the past two decades is now taking off, and over the next few years quantum computing will go from being a mere scientific curiosity to an everyday piece of the computing landscape.

You can find out more about Oxford Ionics on their website.

You can discover more about the pioneering research by Dr Ballance and others at Oxford University's Physics Department on their website here.

See original here:
Dr Chris Ballance, quantum computings up-and-coming star - University of Oxford

D-Wave Gaining Momentum with Quantum Computing Innovation – yTech

Summary: D-Wave Quantum Inc. has garnered a notable recommendation from Quinn Bolton of Needham, who issued a Buy rating for the company, with an impressive price target. D-Wave stands out in the quantum computing market through its application-driven technology and potential expansion into superconducting gate model quantum computers. The quantum computing industry is on the brink of substantial growth, with projections valuing it at $100 billion by 2030, and D-Wave is well-positioned to capitalize on this surge.

Quantum computing may sound like a subject torn from the pages of a science fiction novel, but it is a rapidly developing field with real-world applications, and D-Wave Quantum Inc. is leading the charge. The company's dedication to leveraging quantum annealing technology for commercial use has earned it a Buy rating from Needham analyst Quinn Bolton, pointing to a price target that underscores confidence in D-Wave's market value and approach.

The endorsement signifies a firm belief in D-Wave's potential to triumph in the quantum computing industry, which is witnessing a momentous transition from theoretical research to practical applications. According to Bolton's analysis, the company is not only pioneering on the technology front but is also showing an innovative business approach by targeting commercial markets where quantum computing can have immediate impact.

D-Wave's focus includes areas such as optimization, artificial intelligence, materials science, and logistics. This strategic alignment with industry needs positions the company as a key player in a realm that is forecast to be worth as much as $100 billion by the decade's end.

However, there are hurdles to overcome in the industry. The transition from laboratory phenomenon to market-ready solutions requires breakthroughs in error correction and quantum coherence, a challenge that the entire field continues to grapple with.

Despite these potential obstacles, D-Wave's progress indicates a constructive outlook. As the company explores adding superconducting gate-model quantum computers to its portfolio, it is looking toward a future where various industries could benefit from the unprecedented computational prowess quantum technology offers.

The journey of D-Wave Quantum Inc. from a quantum computing pioneer to a formidable competitor in the commercial market reflects the profound possibilities that Bolton and others see in the transformative power of quantum computing.

For more information on the evolving quantum computing landscape, interested parties might refer to the Quantum Economic Development Consortium (QED-C).

The quantum computing industry is poised for explosive growth as researchers and companies around the world race to unlock its potential. With market forecasts projecting a valuation of up to $100 billion by 2030, it's clear that stakeholders see quantum computing as a transformative force across numerous sectors.

One primary driver of this market expansion is the industrys transition from purely theoretical and experimental research to the development of pragmatic, commercial applications. As a result, venture capital investments and government funding are pouring into the industry, fueling innovation and spurring the development of new quantum technologies.

Companies like D-Wave Quantum Inc. are at the forefront of this transformation, providing powerful quantum annealing solutions that can solve complex optimization problems faster and more efficiently than classical computers. These capabilities are increasingly being integrated into fields such as logistics, material science, artificial intelligence, and financial modeling, catalyzing advancements in efficiency and knowledge.

Market Challenges and Industry Issues

Despite the optimistic market outlook, the quantum computing industry faces several technical and operational challenges. One of the most significant is maintaining quantum coherence and correcting errors, problems that arise from the fragile nature of quantum states and the difficulty of maintaining them over extended periods. Quantum error correction is vital to developing reliable quantum computers that can operate without succumbing to environmental noise and other disruptions.

Moreover, the current quantum computing field faces a talent shortage. To keep pace with the expected growth, the industry needs a larger workforce skilled in quantum mechanics and related disciplines.

Another important consideration is cybersecurity. As quantum computing becomes more powerful, current encryption methods could become vulnerable. Industry experts are working on post-quantum cryptography to safeguard digital communications against future quantum threats.

As D-Wave Quantum Inc. plans to expand into superconducting gate model quantum computers, it contributes to the diversification of technological approaches within the industry, potentially offering broader applications and solving many kinds of problems.

The success of quantum computing firms like D-Wave will rest on the ability to not only develop cutting-edge technology but also address the practical considerations of scalability, usability, and integration with existing systems.

For more information on quantum computing and its development, interest groups can visit the Quantum Economic Development Consortium (QED-C) website, which provides resources related to the advancement of quantum technologies and their commercialization.


Excerpt from:
D-Wave Gaining Momentum with Quantum Computing Innovation - yTech

Enhanced Control in Quantum Computing Through Innovative Pulse Design – yTech

Summary: Researchers at UCLA's Center for Quantum Science and Engineering have made strides in optimizing the accuracy of quantum systems through the design of advanced control pulses. By experimenting with composite and adiabatic pulses for single-qubit gates, Kajsa Williams and Louis-S. Bouchard considerably improved the resistance of these systems to control errors, facilitating progress in the field of quantum computing.

Quantum computing, despite its potential, faces significant challenges in maintaining accuracy over extended periods of operation due to errors that arise in complex computations. Researchers from UCLA have contributed a solution to this problem by devising composite and adiabatic pulses that demonstrate elevated tolerance to errors in the controlling fields.

Kajsa Williams and Louis-S. Bouchard's research, presented in Intelligent Computing, explored these innovative design approaches. Their work utilized Qiskit software and the IBM Quantum Experience to simulate and validate their pulse designs on superconducting qubits. Although the proposed pulse designs did not display advantages in containing leakage or seepage compared to conventional ones, they excelled in robustness to control-field discrepancies, delivering a nearly tenfold improvement in reliability.

The researchers leveraged Python programming to fine-tune their adiabatic pulse parameters and subsequently executed them on the IBM Quantum Experience platform. Through randomized benchmarking, they confirmed the high robustness of their adiabatic full passage pulses, which are only somewhat longer than standard pulses, thereby maintaining practicality in quantum operations. This advancement paves the way for expanding the scope of quantum computing applications, as it mitigates error accumulation, a prominent hurdle in current quantum technologies.
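The article does not reproduce the authors' actual pulse parameters, but the core idea behind composite pulses can be sketched with the classic Wimperis BB1 sequence, used here purely as a stand-in for the paper's designs. A bare rotation whose drive amplitude is miscalibrated by a fraction eps misses its target; BB1 wraps the same flawed rotation in three extra correction pulses so the amplitude errors cancel to high order:

```python
import numpy as np

# Pauli matrices for a single qubit.
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
I2 = np.eye(2, dtype=complex)

def rot(alpha, phi):
    """Rotation by angle alpha about the axis (cos phi, sin phi, 0) on the Bloch sphere."""
    axis = np.cos(phi) * X + np.sin(phi) * Y
    return np.cos(alpha / 2) * I2 - 1j * np.sin(alpha / 2) * axis

def fidelity(u, v):
    """Gate fidelity |Tr(u^dagger v)| / 2, insensitive to global phase."""
    return abs(np.trace(u.conj().T @ v)) / 2

def plain_pulse(theta, eps):
    # A bare rotation whose angle is off by the fractional amplitude error eps.
    return rot((1 + eps) * theta, 0.0)

def bb1_pulse(theta, eps):
    # Wimperis BB1 composite pulse: the target rotation followed by three
    # correction rotations. Every pulse suffers the same amplitude error eps,
    # yet the sequence cancels that error to high order.
    phi = np.arccos(-theta / (4 * np.pi))
    seq = [rot((1 + eps) * theta, 0.0),
           rot((1 + eps) * np.pi, phi),
           rot((1 + eps) * 2 * np.pi, 3 * phi),
           rot((1 + eps) * np.pi, phi)]
    u = I2
    for g in seq:
        u = g @ u  # apply the pulses in time order
    return u

target = rot(np.pi, 0.0)   # ideal X gate (a pi pulse)
eps = 0.1                  # 10% miscalibration of the drive amplitude
f_plain = fidelity(target, plain_pulse(np.pi, eps))
f_bb1 = fidelity(target, bb1_pulse(np.pi, eps))
print(f"plain pulse fidelity: {f_plain:.6f}")
print(f"BB1 composite pulse fidelity: {f_bb1:.6f}")
```

Even with a 10% amplitude error, the composite sequence stays far closer to the ideal gate than the bare pulse, which is the same robustness-to-control-error tradeoff the UCLA work optimizes (at the cost of a longer pulse).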

The Quantum Computing Industry

Quantum computing is a burgeoning industry with the potential to revolutionize various fields by providing computational power far exceeding that of classical computers. IBM, Google, Microsoft, and many other tech giants, as well as startups like Rigetti Computing and IonQ, are actively investing in quantum computing research and development.

The global quantum computing market is projected to grow significantly in the coming decades. Market research reports indicate an increase from a valuation of around several hundred million dollars in the early 2020s to a multi-billion-dollar industry by as early as the end of the decade. This growth is fueled by the promise of quantum computing to tackle tasks that are currently infeasible for classical computers, such as complex material science simulations, optimization problems in logistics, and potentially creating new breakthroughs in drug discovery and development.

Challenges in Quantum Computing

However, the field of quantum computing also faces substantial challenges. Among them is the issue of maintaining qubit coherence for sufficient durations to perform meaningful computations, as well as dealing with quantum error correction. Qubits, the fundamental units of quantum computation, are susceptible to various types of errors due to decoherence and noise, which makes them lose their quantum properties. This is where the work of researchers such as Williams and Bouchard becomes crucial, as their improvements in pulse design increase the fault tolerance of quantum systems.

Market Forecasts and Industry Applications

The advancements in control pulse design are expected to play a vital role in sustaining the projected market growth of the quantum computing industry. Enhanced precision and robustness can lead to more reliable quantum computers, which can then be employed across a variety of sectors including cybersecurity, where they could be used for cracking or securing cryptographic protocols; financial services, for complex optimization and prediction models; and materials science, for discovering new materials with exotic properties.

Moreover, the development of quantum algorithms designed to run on improved hardware could accelerate discovery in sciences like physics, by simulating and understanding quantum phenomena much more precisely, or in chemistry, by accurately simulating molecular interactions for drug discovery.

Issues related to the Quantum Computing Industry and Products

The quantum computing industry must overcome significant technical hurdles before these technologies can be widely adopted. Beyond enhancing system stability and error tolerance, there are other issues, such as the ultra-low temperatures at which most superconducting qubits currently operate, which necessitate complex cryogenic infrastructure. Furthermore, work is ongoing to create more accessible programming models and languages that make quantum computing approachable to a wider range of developers and researchers.

Despite the inevitable challenges, the industry is poised for growth, and the work of researchers like those at UCLA's Center for Quantum Science and Engineering is creating a strong foundation for future advancements. Such progress supports the confidence of market forecasts that anticipate significant expansion and utility of quantum computing across various domains of industry and research in the years to come.


See the original post here:
Enhanced Control in Quantum Computing Through Innovative Pulse Design - yTech

Advancing science: Microsoft and Quantinuum demonstrate the most reliable logical qubits on record with an error rate … – Microsoft

Quantinuum scientists making adjustments to a beam line array used to deliver laser pulses in H-Series quantum computers. Photo courtesy of Quantinuum.

Today signifies a major achievement for the entire quantum ecosystem: Microsoft and Quantinuum demonstrated the most reliable logical qubits on record. By applying Microsoft's breakthrough qubit-virtualization system, with error diagnostics and correction, to Quantinuum's ion-trap hardware, we ran more than 14,000 individual experiments without a single error. Furthermore, we demonstrated more reliable quantum computation by performing error diagnostics and corrections on logical qubits without destroying them. This finally moves us out of the current noisy intermediate-scale quantum (NISQ) level to Level 2 Resilient quantum computing.

This is a crucial milestone on our path to building a hybrid supercomputing system that can transform research and innovation across many industries. It is made possible by the collective advancement of quantum hardware, qubit virtualization and correction, and hybrid applications that take advantage of the best of AI, supercomputing, and quantum capabilities. With a hybrid supercomputer powered by 100 reliable logical qubits, organizations would start to see scientific advantage, while scaling closer to 1,000 reliable logical qubits would unlock commercial advantage.

Advanced capabilities based on these logical qubits will be available in private preview for Azure Quantum Elements customers in the coming months.


Many of the hardest problems facing society, such as reversing climate change, addressing food insecurity and solving the energy crisis, are chemistry and materials science problems. However, the number of possible stable molecules and materials may surpass the number of atoms in the observable universe. Even a billion years of classical computing would be insufficient to explore and evaluate them all.

That's why the promise of quantum is so appealing. Scaled quantum computers would offer the ability to simulate the interactions of molecules and atoms at the quantum level, beyond the reach of classical computers, unlocking solutions that can be a catalyst for positive change in our world. But quantum computing is just one layer for driving these breakthrough insights.

Whether it's to supercharge pharma productivity or pioneer the next sustainable battery, accelerating scientific discovery requires a purpose-built, hybrid compute platform. Researchers need access to the right tool at the right stage of their discovery pipeline to efficiently solve every layer of their scientific problem and drive insights to where they matter most. This is what we built with Azure Quantum Elements, empowering organizations to transform research and development with capabilities including screening massive data sets with AI, narrowing down options with high-performance computing (HPC), or improving model accuracy with the power of scaled quantum computing in the future.

We continue to advance the state of the art across all these hybrid technologies for our customers, with today's quantum milestone laying the foundation for useful, reliable, and scalable simulations of quantum mechanics.

In an article I wrote on LinkedIn, I used a leaky boat analogy to explain why fidelity and error correction are so important to quantum computing. In short, fidelity is the value we use to measure how reliably a quantum computer can produce a meaningful result. Only with good fidelity will we have a solid foundation to reliably scale a quantum machine that can solve practical, real-world problems.

For years, one approach to fixing this leaky boat has been to increase the number of noisy physical qubits alongside techniques to compensate for that noise, but this falls short of real logical qubits with superior error rates. The main shortcoming of most of today's NISQ machines is that the physical qubits are too noisy and error-prone to make robust quantum error correction possible. Our industry's foundational components are not good enough for quantum error correction to work, which is why even larger NISQ systems are not practical for real-world applications.

The task at hand for the entire quantum ecosystem is to increase the fidelity of qubits and enable fault-tolerant quantum computing so that we can use a quantum machine to unlock solutions to previously intractable problems. In short, we need to transition to reliable logical qubits, created by combining multiple physical qubits into one, to protect against noise and sustain a long (i.e., resilient) computation. We can only obtain this with careful hardware and software co-design. With high-quality hardware components and breakthrough error-handling capabilities designed for that machine, we can get better results than any individual component could give us. Today, we've done just that.

"Breakthroughs in quantum error correction and fault tolerance are important for realizing the long-term value of quantum computing for scientific discovery and energy security. Results like these enable continued development of quantum computing systems for research and development." Dr. Travis Humble, Director, Quantum Science Center, Oak Ridge National Laboratory

That's why today is such a historic moment: for the first time on record as an industry, we're advancing from Level 1 Foundational to Level 2 Resilient quantum computing. We're now entering the next phase of solving meaningful problems with reliable quantum computers. Our qubit-virtualization system, which filters and corrects errors, combined with Quantinuum's hardware, demonstrates the largest gap between physical and logical error rates reported to date. This is the first demonstrated system with four logical qubits that improves the logical error rate over the physical one by such a large margin.

As importantly, we are also now able to diagnose and correct errors in the logical qubits without destroying them, a capability referred to as active syndrome extraction. This represents a huge step forward for the industry, as it enables more reliable quantum computation.

With this system, we ran more than 14,000 individual experiments without a single error. You can read more about these results here.

"Quantum error correction often seems very theoretical. What's striking here is the massive contribution Microsoft's mid-stack software for qubit optimization is making to the improved step-down in error rates. Microsoft really is putting theory into practice." Dr. David Shaw, Chief Analyst, Global Quantum Intelligence

Since 2019, Microsoft has been collaborating with Quantinuum to enable quantum developers to write and run their own quantum code on ion-trap qubit technology, which offers high fidelity, full connectivity, and mid-circuit measurements. Multiple published benchmark tests recognize Quantinuum as having the best quantum volumes, making them well-positioned to enter Level 2.

"Today's results mark a historic achievement and are a wonderful reflection of how this collaboration continues to push the boundaries for the quantum ecosystem. With Microsoft's state-of-the-art error correction aligned with the world's most powerful quantum computer and a fully integrated approach, we are so excited for the next evolution in quantum applications and can't wait to see how our customers and partners will benefit from our solutions, especially as we move towards quantum processors at scale." Ilyas Khan, Founder and Chief Product Officer, Quantinuum

Quantinuum's hardware performs at a physical two-qubit fidelity of 99.8%. This fidelity enables the application of our qubit-virtualization system, with diagnostics and error correction, and makes today's announcement possible. This quantum system, with co-innovation from Microsoft and Quantinuum, ushers us into Level 2 Resilient.

At Microsoft, our mission is to empower every individual and organization to achieve more. We've brought the world's best NISQ hardware to the cloud with our Azure Quantum platform so our customers can embark on their quantum journey. This is why we've integrated artificial intelligence with quantum computing and cloud HPC in the private preview of Azure Quantum Elements. We used this platform to design and demonstrate an end-to-end workflow that integrates Copilot, Azure compute, and a quantum algorithm running on Quantinuum processors to train an AI model for property prediction.

Today's announcement continues this commitment by advancing quantum hardware to Level 2. Advanced capabilities based on these logical qubits will be available in private preview for Azure Quantum Elements in the coming months.

Lastly, we continue to invest heavily in progressing beyond Level 2, scaling to the level of quantum supercomputing. This is why we've been advocating for our topological approach, the feasibility of which our Azure Quantum team has demonstrated. At Level 3, we expect to be able to solve some of our most challenging problems, particularly in fields like chemistry and materials science, unlocking new applications that bring quantum at scale together with the best of classical supercomputing and AI, all connected in the Azure Quantum cloud.

We are excited to empower the collective genius and make these breakthroughs accessible to our customers. For more details on how we achieved today's results, explore our technical blog and register for the upcoming Quantum Innovator Series with Quantinuum.

Tags: AI, Azure Quantum Elements, quantum computing

Follow this link:
Advancing science: Microsoft and Quantinuum demonstrate the most reliable logical qubits on record with an error rate ... - Microsoft

Practical quantum computing is coming in 3 to 5 years, but will be cloud based, NSA official predicts – Nextgov/FCW

Practical quantum computing tools are about 3 to 5 years out from workforce use and will likely be accessed through cloud-based environments, a top National Security Agency official predicted Tuesday at a Palo Alto Networks public sector cybersecurity event.

Neal Ziring, technical director of the NSA's Cybersecurity Directorate, said that quantum computing systems, which use the laws of quantum mechanics to solve certain problems exponentially faster than traditional computers and are still largely theoretical, will likely be accessed via cloud computing platforms rather than on-premises installs, due to cost and practicality considerations.

"Even if a government agency would be willing to have one quantum computer on-prem, I don't think they're going to be willing to have multiple," he said.

The intelligence community faces many of the same data processing challenges as the civilian world, he said, noting that the NSA is very wary of adding complexity where it's not needed.

The cloud aspect would help users mesh together uses for both quantum computers and classical computers, known as hybrid computing, in which the computational elements of both systems are combined for problem solving.

"In the long term, I think we really need to move as a community towards using the quantum algorithms on their own to avoid the complexity and performance overhead," said Ziring, who will soon be transitioning to a management position at the NSA's Research Directorate.

Some steps will still be needed to make his prediction come to fruition, Ziring noted. Those will include further research into quantum circuits, which determine the optimal pathways that quantum particles need to follow to successfully execute operations.

Quantum computing, while a nascent technology in practical terms, is viewed as an emerging paradigm that will likely help the intelligence community and Department of Defense enhance their cybersecurity and logistics capabilities. The White House and intelligence partners have been working to bolster government network defenses that aim to prevent systems from being vulnerable to advanced techniques enabled by the creation of practical quantum computers in the near future.

The NSA, in particular, has set a 2035 deadline for IC systems to be locked into these new standards, known as post-quantum cryptography.

Thought leaders in the federal government are trying to prevent quantum-powered cyber incidents like "record now, decrypt later" attacks, in which an adversary hoovers up encrypted data streams, stores them and, once a powerful enough quantum device exists, decrypts that data for theft or exploitation.

President Joe Biden in 2022 signed a National Security Memorandum directing the U.S. to maintain global leadership in quantum research.

"A quantum computer of sufficient size and sophistication will be capable of breaking much of the public-key cryptography used on digital systems across the United States and the world," an NSA readout said at the time of the signing.

The 2024 defense policy bill has a provision that requires a report on the feasibility of establishing a quantum computing innovation center within the Department of Defense.

For now, the U.S. is still in a good spot to take advantage of quantum, but better partnerships between government, industry and academia will be needed to reap the full benefits of the nascent technology, Ziring said.

Read this article:
Practical quantum computing is coming in 3 to 5 years, but will be cloud based, NSA official predicts - Nextgov/FCW

Error-corrected qubits 800 times more reliable after breakthrough, paving the way for ‘next level’ of quantum computing – Livescience.com

Scientists have created a set of "logical qubits" that have error rates 800 times lower than physical qubits paving the way for useful, fault-tolerant quantum computers in the near future.

Quantum bits, or qubits, are inherently prone to error; this susceptibility is described as being "noisy." Creating logical qubits is one way of solving this. These are collections of physical qubits tied together through quantum entanglement, and they reduce errors by storing the same information in different places. This spreads out the possible points of failure while a calculation is underway.

In a new paper published April 2 to the preprint server arXiv, scientists demonstrated they could perform experiments on four logical qubits made using 30 of the 32 physical qubits in the H2 quantum processor made by Quantinuum, a quantum computing company.

The team, made up of researchers from Quantinuum and Microsoft, ran 14,000 experiments on a basic quantum circuit made up of the logical qubits without generating any errors that weren't detected and corrected.

They hope this technology can be integrated into a future hybrid supercomputer powered by 100 reliable logical qubits, which would be enough to provide organizations with a scientific advantage, Microsoft's EVP for strategic missions and technologies said April 3 in a blog entry.

Related: World's 1st fault-tolerant quantum computer launching this year ahead of a 10,000-qubit machine in 2026

One of the biggest problems with scaling quantum computers, beyond the hardware required to run them, is the extremely high error rates of qubits. Bits in conventional computing have an error rate of 1 in 1 billion billion.


When running experiments on a quantum circuit, however, physical qubits have an error rate of approximately 1 in 100, according to Microsoft. The new logical qubits, by comparison, have an error rate of just 1 in 100,000.

The researchers achieved this improvement by applying a technique called "active syndrome extraction" to Quantinuum's ion-trap qubits and quantum computing architecture, Quantinuum representatives said in a statement.

This technique involves diagnosing and correcting errors while calculations are underway without destroying logical qubits. Because qubits process calculations while they're in a state of quantum superposition between two binary states (representing the 1 and 0 of computing data), you cannot view them without causing decoherence, in which the superposition collapses.

Active syndrome extraction is a process derived from a paper published in September 2018 and works because of the way this kind of logical qubit is composed. A logical qubit includes a small number of physical qubits, referred to as the ancillary code block, that store no calculation data of their own but into which the logical qubit's error information is temporarily copied so it can be read out. By applying this technique, the scientists were able to peek inside the block and then identify and correct errors as they appeared, without disrupting calculations.
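A classical sketch of the same idea uses the three-bit repetition code: parity checks (the "syndrome") pinpoint which bit flipped without ever reading the encoded data directly. This is only an analogue of the quantum protocol, with illustrative function names, not the scheme used on the H2 processor:

```python
def syndrome(data):
    """Measure the parity checks (the 'syndrome') of a 3-bit repetition code.
    Only parities are read out, never the data bits themselves - the
    classical analogue of extracting error info via ancilla qubits."""
    return (data[0] ^ data[1], data[1] ^ data[2])

def correct(data):
    """Flip the single bit implicated by the syndrome, in place."""
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(syndrome(data))
    if flip is not None:
        data[flip] ^= 1
    return data

encoded = [1, 1, 1]       # logical '1' stored redundantly
encoded[2] ^= 1           # a single error strikes: [1, 1, 0]
print(syndrome(encoded))  # (0, 1) -> points at bit 2
correct(encoded)
print(encoded)            # [1, 1, 1] restored
```

The key property carries over to the quantum case: the syndrome reveals where an error occurred but nothing about the stored value, so correction can proceed mid-calculation without collapsing the superposition.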

"Breakthroughs in quantum error correction and fault tolerance are important for realizing the long-term value of quantum computing for scientific discovery and energy security," Travis Humble, director of the Quantum Science Center at Oak Ridge National Laboratory, who was not involved in the current research, said in a statement. "Results like these enable continued development of quantum computing systems for research and development."

Microsoft representatives argue this research represents a shift to what they call "Level 2" quantum computing, in which scientists have low-error quantum hardware that can be scaled up to solve problems reliably. Today's quantum computers are, by comparison, described as "noisy intermediate-scale quantum" (NISQ) machines.

The aim is to get to Level 3 machines and to achieve so-called quantum supremacy: the point at which quantum computers will be more powerful and capable than the fastest supercomputers.

Original post:
Error-corrected qubits 800 times more reliable after breakthrough, paving the way for 'next level' of quantum computing - Livescience.com

Making Sense of the Post-Quantum Payments Landscape – PYMNTS.com

World Quantum Day is coming up in a little over a week, on April 14. But the international event aimed at promoting public awareness and understanding of quantum science isn't the infamous "Quantum Day" that has kept security experts worried since the turn of the century.

That particular day, colloquially known in the cybersecurity space as Q-Day, is the day when quantum technology has advanced to the point where its commercial applications and availability could be used to compromise and fundamentally undermine the encryption protocols that corporations, banks and national governments around the world have relied on for decades to protect sensitive data and information.

The threat is a very real and existential one, as the unraveling of traditional encryption could shatter the world of privacy and security as we know it.

This, as Microsoft and Quantinuum on Wednesday (April 3) announced that they've reached a new quantum computing milestone, one that has made the next phase for solving meaningful problems with reliable quantum computers a reality.

What that means is that Q-Day is already that much closer to becoming its own reality, which will fundamentally transform the finance and payments industries.

As PYMNTS has written, quantum computers are superpowered computers that use principles of quantum mechanics, quite literally phase shifts among subatomic particles, to perform incredibly sophisticated operations using parallel processing capabilities. Long the realm of science fiction, these powerful machines will be here and commercially viable within the next decade, if not sooner.

The fundamental problem is that most of today's encryption relies on the difficulty of certain mathematical problems, such as factoring large numbers or computing discrete logarithms.

Quantum computers will be able to efficiently solve these mathematical problems, many of which would previously have taken billions of years of computing time, in the metaphorical blink of an eye, rendering many widely used encryption algorithms such as RSA (Rivest-Shamir-Adleman, after the surnames of the computer scientists who created it) and ECC (elliptic curve cryptography) vulnerable.
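A toy RSA example in Python makes that dependence concrete: the private key falls out immediately once the modulus is factored, which is exactly the step a large quantum computer running Shor's algorithm would make easy. The tiny primes here are purely illustrative; real keys use primes hundreds of digits long, which is what makes factoring classically infeasible:

```python
from math import gcd

# Toy RSA with deliberately tiny primes.
p, q = 61, 53
n = p * q                  # public modulus
phi = (p - 1) * (q - 1)    # computable only by whoever knows the factors
e = 17                     # public exponent
assert gcd(e, phi) == 1
d = pow(e, -1, phi)        # private exponent - derived from phi

msg = 42
cipher = pow(msg, e, n)    # encrypt with the public key
plain = pow(cipher, d, n)  # decrypt with the private key
print(plain == msg)        # True

# An attacker who can factor n recovers p and q, recomputes phi and d,
# and reads every message - which is why quantum factoring breaks RSA.
```

The same structure explains the "harvest now, decrypt later" threat discussed below: ciphertexts stolen today remain readable forever once the factoring step becomes cheap.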

What that means is that in a post-Q-Day landscape, digital transactions and even entire stock exchanges could be overrun by fraudsters, along with other critical financial infrastructure.

Already, in a move to improve the security of its iMessage app, Apple announced in February that it is upgrading its encryption system to fend off potential quantum computing attacks.

The danger is not just tied to the future. In true quantum form, past data breaches also represent new opportunities in a post-quantum landscape. That's because bad actors who are sitting on troves of illicitly obtained encrypted data will be able to unlock them using quantum computing methods.

As Michael Jabbara, global head of fraud services at Visa, told PYMNTS last March, bad actors are already starting to steal and hold onto encrypted data in preparation for quantum computing tools to enter the market and allow them to decrypt the information.

But while the threat of quantum computing is real, so are the opportunities.

For those taking a rosier view of Q-Day, today's world is already increasingly under attack via digital channels from bad actors. Just look at last month's cyberattack on Change Healthcare and the far-flung ripple effects it had. Using quantum computing for illicit means is just a more expensive way for bad actors to do what they have always done: probe vulnerabilities and look for easy targets.

When it comes to ensuring the security and encryption of future transactions and payments, the National Institute of Standards and Technology (NIST), a federal agency, has already made a selection of post-quantum cryptography algorithms that it recommends for wider use.

"If large-scale quantum computers are ever built, they will be able to break many of the public-key cryptosystems currently in use. This would seriously compromise the confidentiality and integrity of digital communications on the Internet and elsewhere. The goal of post-quantum cryptography (also called quantum-resistant cryptography) is to develop cryptographic systems that are secure against both quantum and classical computers, and can interoperate with existing communications protocols and networks," the agency said.

"The physical world is defined by quantum mechanics. The more effectively we can understand those interactions and then model those interactions, the more efficiently and effectively you can build predictive models," Chris Hume, senior director of business operations for SandboxAQ, told PYMNTS.

"With the algorithms that we're developing combined with the classical computer hardware that's available today, you can build better predictive models, and that's the exciting part. And that's the opportunity at hand," Hume added.

Read more here:
Making Sense of the Post-Quantum Payments Landscape - PYMNTS.com