At its Quantum Summit 2023, IBM took the stage with an interesting spirit: one of almost awe at having things go its way. But the quantum computing of today, the kind that is reshaping IBM's roadmap so deeply on the back of breakthrough upon breakthrough, was hard enough to consolidate. As IBM sees it, the future of quantum computing will hardly be more permissive. IBM announced cutting-edge devices at the event, including the 133-qubit Heron Quantum Processing Unit (QPU), the company's first utility-scale quantum processor, and the self-contained Quantum System Two, a quantum-specific supercomputing architecture. Further improvements to these cutting-edge devices will ultimately be required.
Each breakthrough that afterward becomes obsolete is another accelerating bump against what we might call quantum's "plateau of understanding." We've already crested this plateau with semiconductors, so much so that the latest CPUs and GPUs are reaching practical, fundamental design limits where quantum effects start ruining our math. Conquering the plateau means that utility and understanding are now sufficient for research and development to be somewhat self-sustaining, at least for a Moore's-law-esque while.
IBM's Quantum Summit serves as a bookend of sorts for the company's cultural and operational execution, and its 2023 edition showcased an energized company that feels like it's opening the doors to a "quantum-centric supercomputing era." That vision is built on the company's new Quantum Processing Unit, Heron, which showcases scalable quantum utility at a 133-qubit count and already offers things beyond what any feasible classical system could ever do. Breakthroughs and a revised understanding of its own plans have led IBM to present its quantum vision in two different roadmaps, prioritizing scalability in tandem with useful, minimum-quality products rather than monolithic, hard-to-validate, high-complexity ones.
IBM's newly announced plateau for quantum computing packs in two particular breakthroughs that occurred in 2023. One relates to a groundbreaking noise-reduction algorithm (Zero Noise Extrapolation, or ZNE), which we covered back in July: essentially a system through which you can compensate for noise. For instance, if you know an opponent tends to play more to the left, you can compensate for that, up to a point. There will always be a moment where you overcorrect or cede ground to other disruptions (such as the opponent exploiting the overexposed right side of the court). This is where the concept of qubit quality comes into play: the higher the quality of your qubits, the more predictable both their results and their disruptions become, and the better you know their operational constraints, the more useful work you can extract from them.
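The idea behind ZNE is to deliberately amplify a circuit's noise at several known scale factors, measure the observable at each, and extrapolate the trend back to the zero-noise limit. The toy sketch below illustrates only that extrapolation step, with made-up measurement values; it is not IBM's implementation, which involves running the noise-scaled circuits on actual hardware.

```python
# Toy illustration of zero-noise extrapolation (ZNE): fit the measured
# expectation values against the noise scale factors, then evaluate the
# fit at scale 0. All numbers below are invented for illustration.

def linear_extrapolate_to_zero(noise_scales, measured_values):
    """Least-squares line fit, evaluated at noise scale 0."""
    n = len(noise_scales)
    mean_x = sum(noise_scales) / n
    mean_y = sum(measured_values) / n
    cov = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(noise_scales, measured_values))
    var = sum((x - mean_x) ** 2 for x in noise_scales)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return intercept  # estimated expectation value at zero noise

# Hypothetical expectation values of some observable, measured with the
# circuit's noise artificially scaled by factors of 1x, 2x, and 3x.
scales = [1.0, 2.0, 3.0]
values = [0.80, 0.65, 0.50]  # decays linearly with noise in this toy model
print(linear_extrapolate_to_zero(scales, values))  # ≈ 0.95
```

The extrapolated value (≈0.95) is the algorithm's best guess at what the noiseless hardware would have returned, even though no run was ever actually noise-free.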
The other breakthrough relates to an algorithmic improvement of epic proportions, first pushed to arXiv on August 15th, 2023. Titled "High-threshold and low-overhead fault-tolerant quantum memory," the paper showcases algorithmic ways to reduce qubit requirements for certain quantum calculations by a factor of ten. When what used to cost 1,000 qubits and a complex logic gate architecture sees a tenfold cost reduction, it's likely you'd prefer to end up with 133-qubit-sized chips: chips that crush problems previously meant for 1,000-qubit machines.
Enter IBM's Heron Quantum Processing Unit (QPU) and the era of useful, quantum-centric supercomputing.
The two-part breakthroughs of error correction (through the ZNE technique) and algorithmic performance (alongside qubit gate architecture improvements) allow IBM to now target 1 billion operationally useful quantum gates by 2033. It's an amazing coincidence (one born of research effort and human ingenuity) that we only need to keep 133 qubits relatively happy within their own environment to extract useful quantum computing from them: computing that we wouldn't classically be able to get anywhere else.
The Development and Innovation roadmaps showcase how IBM is thinking about its superconducting qubits: as we've already learned to do with semiconductors, mapping out the hardware-level improvements alongside the scalability-level ones. Because as we've seen through our supercomputing efforts, there's no such thing as a truly monolithic approach: every piece of supercomputing is (necessarily) efficiently distributed across thousands of individual accelerators. Your CPU performs better by knitting together and orchestrating several different cores, registers, and execution units. Even Cerebras' Wafer Scale Engine scales further outside its wafer-level computing unit. No accelerator so far, no unit of computation, has proven powerful enough that we don't need to unlock more of its power by increasing its area or computing density. Our brains and learning ability seem to provide the only known exception.
IBM's modular approach, and its focus on introducing more robust intra-QPU and inter-QPU communication for this year's Heron, shows it's aware of the tightrope it's walking between quality and scalability. The thousands of hardware and scientist hours behind the tunable couplers, one of Heron's signature design elements, which allow parallel execution across different QPUs, are another sign. Pushing one lever harder means other systems have to be able to keep up; IBM also plans on steadily improving its internal and external coupling technology (already developed with scalability in mind for Heron) throughout further iterations, such as Flamingo's four planned versions, which still only scale up to 156 qubits per QPU.
Considering how you're solving scalability problems and the qubit quality × density × ease-of-testing equation, the ticks (the density increases that don't sacrifice quality and are feasible from a testing and productization standpoint) may be harder to unlock. But if one side of development is scalability, the other relates to the quality of whatever you're actually scaling: in this case, IBM's superconducting qubits themselves. Heron itself saw a substantial rearrangement of its internal qubit architecture to improve gate design, accessibility, and quantum processing volumes, not unlike an Intel tock. The planned iterative improvements to Flamingo's design seem to confirm this.
There's a sweet spot for the quantum computing algorithms of today: it seems that algorithms that fit roughly within a 60-gate depth are complex enough to allow for useful quantum computing. Perhaps thinking about Intel's NetBurst architecture and its Pentium 4 CPUs is appropriate here: too deep an instruction pipeline is counterproductive after a point. Branch mispredictions are terrible across computing, be it classical or quantum. And quantum computing as we still have it in our Noisy Intermediate-Scale Quantum (NISQ) era is vulnerable to a more varied disturbance field than semiconductors are (there are world overclocking records where we chill our processors to sub-zero temperatures and pump them with above-standard volts, after all). But perhaps that comparative quantum vulnerability is understandable, given how we're essentially manipulating the fundamental units of existence (atoms and even subatomic particles) into becoming useful to us.
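A rough intuition for that depth ceiling: if every gate layer succeeds with some effective fidelity, the probability that the whole circuit returns a usable result decays exponentially with depth. The numbers below are illustrative assumptions for the sake of the sketch, not Heron's measured error rates.

```python
# Toy model: exponential decay of circuit success with depth.
# A hypothetical effective per-layer fidelity of 99% is assumed;
# real hardware error models are far more nuanced than this.

def circuit_success_probability(per_layer_fidelity: float, depth: int) -> float:
    """Probability that no layer fails, assuming independent layer errors."""
    return per_layer_fidelity ** depth

if __name__ == "__main__":
    p = 0.99  # assumed effective fidelity per gate layer
    for depth in (10, 60, 200):
        print(f"depth {depth}: ~{circuit_success_probability(p, depth):.3f}")
```

With these made-up figures, a depth-60 circuit still succeeds a bit over half the time, while a depth-200 circuit is down near 13%, which is one way to see why depth, and not just qubit count, bounds NISQ-era usefulness.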
Useful quantum computing doesn't simply correlate with an increasing number of available in-package qubits (despite announcements of 1,000-qubit products based on neutral-atom technology, for instance). Useful quantum computing is always stretched thin across its limits, and if it isn't bumping against one fundamental limit (qubit count), it's bumping against another (instability at higher qubit counts), or contending with issues of entanglement coherence and longevity; entanglement distance and capability; correctness of the results; and still other elements. Some of these scalability issues can be visualized within the same framework of efficient data transit between different distributed computing units, such as cores in a given CPU architecture, which can themselves be solved in a number of ways, such as hardware-based information processing and routing techniques (AMD's Infinity Fabric comes to mind, as does Nvidia's NVLink).
The fact that quantum computing is already useful at the 133-qubit scale is also part of the reason why IBM keeps prioritizing quantum computing challenges built around useful algorithms occupying a 100-by-100 gate grid. That quantum is already useful beyond classical computing, even in gate grids comparatively tiny next to what we can achieve with transistors, points to just how different these two computational worlds are.
Then there are the matters of error mitigation and error correction: of extracting ground-truth-level answers to the questions we want our quantum computer to solve. There are also limitations in our way of utilizing quantum interference to collapse a quantum computation at just the right moment, so that we know we will obtain the result we want, or at least something close enough to correct that we can then offset any noise (non-useful computational results, the spread of values between the correct answer and the not-yet-culled wrong ones) through a clever, groundbreaking algorithm.
The above are just some of the elements currently limiting how useful qubits can truly be and how those qubits can be manipulated into useful, algorithm-running computation units. This is usually referred to as a qubit's quality, and we can see how it both does and doesn't relate to the sheer number of qubits available. But since many useful computations can already be achieved with 133-qubit-wide Quantum Processing Units (there's a reason IBM settled on a mere 6-qubit increase from Eagle to Heron, and only scales up to 156 qubits with Flamingo), the company is setting out to keep this optimal qubit width for a number of years of continuous redesigns. IBM will focus on making correct results easier to extract from Heron-sized QPUs by increasing the coherence, stability, and accuracy of these 133 qubits while surmounting the arguably harder challenge of distributed, highly parallel quantum computing. It's a one-two punch again, and one that comes from the bump in speed at climbing ever-higher stretches of the quantum computing plateau.
But there is an admission that it's a barrier IBM still wants to punch through: it's much better to pair 200 units of a 156-qubit QPU (Flamingo's) than 200 units of a 127-qubit one such as Eagle, so long as efficiency and accuracy remain high. IBM CTO Oliver Dial says that Condor, "the 1,000-qubit product," is running locally, up to a point. It was meant to be the thousand-qubit processor, and was as much a part of the roadmap for this year's Quantum Summit as the actual focus, Heron, but it's ultimately not a direction the company thinks is currently feasible.
IBM did manage to yield all 1,000 Josephson junctions within its experimental Condor chip: the thousand-qubit halo product that will never see the light of day as a product. It's running within the labs, and IBM can show that Condor yielded computationally useful qubits. One issue is that at that qubit count, testing such a device becomes immensely expensive and time-consuming. At a basic level, it's harder and more costly to guarantee the quality of a thousand qubits, and their increasingly complex possibility field of interactions and interconnections, than to assure the same requirements in a 133-qubit Heron. Even IBM only means to test around a quarter of the in-lab Condor QPU's area, confirming that the qubit connections are working.
But Heron? Heron is made for quick verification that it's working to spec: that it's providing accurate results, or at least computationally useful results that can then be corrected through ZNE and other techniques. That means you can get useful work out of it already, while it is also a much better time-to-market product in virtually all areas that matter. Heron is what IBM considers the basic unit of quantum computation: good enough and stable enough to outpace classical systems in specific workloads. But that is quantum computing, and that is its niche.
Heron is IBM's entrance into the mass-access era of Quantum Processing Units. Next year's Flamingo builds further on the inter-QPU coupling architecture so that further parallelization can be achieved. The idea is to scale at a base, post-classical utility level and maintain that as a minimum quality baseline. Only when density increases can be achieved in a similarly productive way, with scalability nearly perfect for maintaining quantum usefulness, will IBM scale density and unlock the corresponding jump in computing capability.
There's simply never been the need to churn out hundreds of QPUs yet; the utility wasn't there. The Canaries, Falcons, and Eagles of IBM's past roadmap were never meant to usher in an age of scaled manufacturing. They were prototypes, scientific instruments, explorations; proofs of concept on the road towards useful quantum computing. We didn't know where usefulness would start to appear. But now we do, because we've reached it.
Heron is the design IBM feels best answers that newly created need for a quantum computing chip that actually sits at the forefront of human computing capability: one that can offer what no classical computing system can (in some specific areas). One that can slice through specific but deeper layers of our Universe. That's what IBM means when it calls this new stage the quantum-centric supercomputing one.
Classical systems will never cease to be necessary, both in themselves and in the way they structure our current reality, systems, and society. They also function as a layer that allows quantum computing itself to happen, be it by carrying and storing its intermediate results or by knitting together the final informational state that maps out the correct answer quantum computing provides, one quality step at a time. The quantum-centric bit merely refers to how quantum computing will be the core contributor to developments in fields such as materials science, more advanced physics, chemistry, superconduction, and basically every domain where our classical systems were already presenting a duller and duller edge with which to improve our understanding of their limits.
However, through IBM's approach and its choice of transmon superconducting qubits, a certain difficulty lies in commercializing local installations. Quantum System Two, as the company is naming its new, almost wholesale quantum computing system, has been shown working with different QPU installations (both Heron and Eagle). When asked whether scaling Quantum System Two and similar self-contained products would be a bottleneck to technological adoption, IBM CTO Oliver Dial said that it was definitely a difficult problem to solve, but that he was confident in IBM's ability to reduce costs and complexity further in time, considering how successful the company had already proven in that regard. For now, it's easier for IBM's quantum usefulness to be unlocked at a distance, through the cloud and its quantum computing framework, Qiskit, than it is to achieve it by running local installations.
Qiskit is the preferred medium through which users can actually deploy IBM's quantum computing products in research efforts, much like you could rent X Nvidia A100s' worth of processing power through Amazon Web Services, or even a simple Xbox Series X console through Microsoft's xCloud service. On the day of IBM's Quantum Summit, that freedom also meant access to the useful quantum circuits within IBM-deployed Heron QPUs. And it's much easier to scale access by serving QPUs through the cloud than by delivering a box of supercooled transmon qubits ready to be plugged in and played with.
That's one devil of IBM's superconducting-qubit approach: not many players have the will, funding, or expertise to put a supercooled chamber into local operation and build the required infrastructure around it. These are complex mechanisms housing kilometers of wiring, another focus of IBM's development and tinkering, culminating in last year's flexible ribbon solution, which drastically simplified connections to and from QPUs.
Quantum computing is a uniquely complex problem, and democratized access to hundreds or thousands of mass-produced Herons in IBM's refrigerator-laden fields will ultimately only require, well, a stable internet connection. Logistics are what they are, and IBM's Quantum Summit also took the necessary steps to address some needs within its Qiskit Runtime platform by introducing its official 1.0 version. It's food for thought that the era of useful quantum computing seems to coincide with the beginning of the era of quantum computing as a service. That was fast.
The era of useful, mass-producible, mass-access quantum computing is what IBM is promising. But now there's the matter of scale. And there's the matter of how cost-effective it is to install a Quantum System Two (or Five, or Ten) compared to another qubit approach, be it topological approaches to quantum computing, nitrogen-vacancy-based qubits, ion traps, or others that are an entire architecture away from IBM's approach, such as fluxonium qubits. It's likely that a number of qubit technologies will still make it into the mass-production stage, and even then, we can rest assured that everywhere along the road of human ingenuity lie failed experiments, like Intel's recently discontinued Itanium or AMD's out-of-time approach to x86 computing in Bulldozer.
It's hard to see where the future of quantum takes us, and it's hard to say whether it looks exactly like IBM's roadmap, the same roadmap whose running changes we also discussed here. All roadmaps are a permanently drying painting, both for IBM itself and for the technology space at large. Breakthroughs seem to be happening daily on each side of the fence, and it's a fact of science that the earlier the questions we ask, the more potential there is in the answers. The promising qubit technologies of today will have to stand up to actual interrogation on performance, usefulness, ease and cost of manipulation, quality, and scalability, in ways that now need to be at least as good as what IBM is proposing with its transmon-based superconducting qubits, its Herons and scalable Flamingos, and its (still unproven, but hinted-at) ability to eventually mass-produce useful numbers of useful Quantum Processing Units. All of that even as we remain in this Noisy Intermediate-Scale Quantum (NISQ) era.
It's no wonder that Oliver Dial looked and talked so energetically during our interview: IBM has already achieved quantum usefulness and has started to answer the two most important questions, quality and scalability, through its Development and Innovation roadmaps. And it did so through the collaboration of an incredible team of scientists, delivering results years earlier than expected, Dial happily conceded. In 2023, IBM unlocked useful quantum computing within a 127-qubit Quantum Processing Unit, Eagle, and walked the process of perfecting it towards the revamped Heron chip. That's an incredible feat in and of itself, and it is what allows us to even discuss issues of scalability at this point. It's the reason a roadmap has to shift to accommodate it, and in this quantum computing world, that's a great follow-up question to have.
Perhaps the best question now is: how many things can we improve with a useful Heron QPU? How many locked doors have sprung ajar?