Impact Of Imperfect Timekeeping On Quantum Control And Computing – Hackaday

In classical control theory, both open-loop and closed-loop control systems are commonly used. These systems are well understood and rather straightforward, controlling everything from washing machines to industrial equipment to the classical computing devices that make today's society work. When trying to transfer this knowledge to the world of quantum control theory, however, many issues arise. The most pertinent ones involve closed-loop quantum control and the clocking of quantum computations. Physical limitations on the accuracy and resolution of clocks set hard limits on the accuracy and speed of quantum computing.

The entire argument is covered in two letters to Physical Review Letters: one by Florian Meier et al. titled Fundamental Accuracy-Resolution Trade-Off for Timekeeping Devices (arXiv preprint), and one by Jake Xuereb et al. titled Impact of Imperfect Timekeeping on Quantum Control (arXiv preprint). The short version is that as the clock rate increases, accuracy suffers, with dephasing and other issues becoming more frequent.
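To get a feel for the mechanism, here is a minimal numerical sketch, our illustration rather than the papers' model, of a qubit whose gate durations are set by a jittery clock. Every parameter below is hypothetical; the point is only that phase coherence decays as timing jitter grows.

```python
# Illustrative only: how clock jitter in gate timing dephases a qubit.
# The frequency and jitter values are assumptions, not from the papers.
import numpy as np

rng = np.random.default_rng(seed=0)
omega = 2 * np.pi * 5e9      # assumed qubit precession frequency: 5 GHz
t_gate = 20e-9               # assumed nominal gate duration: 20 ns

for jitter in (0.0, 1e-11, 1e-10):           # clock jitter (std dev, seconds)
    durations = t_gate + rng.normal(0.0, jitter, size=10_000)
    # Average the phase factor over noisy durations; 1.0 means no dephasing.
    coherence = abs(np.mean(np.exp(1j * omega * durations)))
    print(f"jitter {jitter:.0e} s -> residual coherence {coherence:.3f}")
```

With zero jitter the coherence stays at 1.0; at 10 picoseconds it is already visibly degraded, and at 100 picoseconds it is essentially gone, which is the accuracy-versus-speed trade-off in miniature.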

Solving the riddle of closed-loop quantum control is a hard problem, as noted by Daoyi Dong and Ian R. Petersen in 2011. In their paper titled Quantum control theory and applications: A survey, the most fundamental problem with such a closed-loop quantum control system lies in aspects such as the uncertainty principle, which limits the accuracy with which the properties of the system can be known.

In this regard, an accurately clocked open-loop system could work better, except that here we run into other fundamental issues. Even though this shouldn't phase us, as with time solutions may be found to the timekeeping and other issues, it's nonetheless part of the uncertainties that keep causing waves in quantum physics.

Top image: Impact of timekeeping error on quantum gate fidelity & independent clock dephasing (Xuereb et al., 2023)

Visit link:
Impact Of Imperfect Timekeeping On Quantum Control And Computing - Hackaday

The 3 Best Quantum Computing Stocks to Buy in December – InvestorPlace

These quantum computing stocks promise to improve AI and have investors' attention

Source: Bartlomiej K. Wroblewski / Shutterstock.com

Quantum computing is an emerging field of computer science that leverages quantum physics and mathematics. The field's promise is simple: to increase the speed with which computers can do calculations. Thus, stocks in the field are highly attractive to investors in this increasingly digitized world.

The most important thing to understand here is the idea of qubits. Classical computers process information in bits, which are defined as zeros and ones. A qubit is essentially a quantum bit; rather than being fixed at zero or one, it can exist in a superposition of both until it is measured.

Let's look at three quantum computing stocks in the sector.

Source: Shutterstock

Quantum Computing (NASDAQ:QUBT) continues to develop quantum computing technologies and is a relatively inexpensive and high-risk stock. Shares trade at around 90 cents but, based on analyst projections, have the potential to increase to $9. It's important to note that the $9 price target comes from the sole analyst covering the firm.

The company is building what it refers to as quantum reservoir computers. The most important thing for investors is that those computers promise to bring quantum computing capabilities to fields including artificial intelligence. That means that the speed and efficiency of computing will increase while energy consumption will fall dramatically. This then makes QUBT one of those quantum computing stocks to consider.

So, theoretically, quantum computing makes a lot of sense for investors who hope to take advantage of the boom in artificial intelligence. However, practical, real-world limits need to be considered as well. Primarily, I'm referring to financial results. The company is still very young and reported revenues of $50,000 during the third quarter. That led to a loss of $8.3 million.

Source: Shutterstock

Although quantum computing continues to be a relatively young industry, IonQ (NYSE:IONQ) has produced the sixth generation of quantum computers. The company has produced those six quantum computers since its inception in 2015. The company now believes that it is on a path that will lead to commercially scalable operations.

The company continues to concentrate on its trapped ion technology. Briefly, that technology uses ions trapped in a vacuum chamber, where lasers are used to manipulate the state of the ions. This allows the ions to enter a quantum state, performing calculations quicker than in classical computing. This advantage makes IONQ one of those quantum computing stocks to consider.

IonQ benefits from substantial demand. The company initially aimed to achieve $100 million in cumulative bookings within the first three years of commercialization. CEO Peter Chapman reported that the company is on track to achieve that goal by the end of 2023. The company sold two such systems to the US Air Force research lab during the third quarter for $25.5 million.

However, the company could only recognize $6.1 million in revenue during the period due to the accounting for said bookings. That said, revenues increased by 122% in the third quarter.

Source: Asif Islam / Shutterstock.com

Microsoft (NASDAQ:MSFT) and most other Silicon Valley firms are also engaged in quantum computing development. Most of the major tech firms continue to be strong investments, and MSFT stock is no exception.

The company is engaged in quantum computing research, which shouldn't surprise anyone. The company is actively seeking employees for multiple roles within quantum computing research. Microsoft intends to create a scalable quantum computing system.

I know very little about this particular area of research, but it's clear that Microsoft is focused on solving the fault tolerance problem. Fault tolerance refers to the ability to prevent minor errors from spreading rapidly. In quantum computing, when a qubit erroneously takes on the value of one or zero, that can lead to a situation that results in an uncorrectable error.

There is little evidence that Microsoft is ahead of any of the other Silicon Valley firms in this regard. However, the company remains strong overall and will remain an excellent investment.

On the date of publication, Alex Sirois did not have (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed in this article are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.

Alex Sirois is a freelance contributor to InvestorPlace whose personal stock investing style is focused on long-term, buy-and-hold, wealth-building stock picks. Having worked in several industries from e-commerce to translation to education and utilizing his MBA from George Washington University, he brings a diverse set of skills through which he filters his writing.

More:
The 3 Best Quantum Computing Stocks to Buy in December - InvestorPlace

IBM quantum computing updates: System Two and Heron – The Verge

Today, I'm talking with Jerry Chow. He's the director of quantum systems at IBM, meaning he's trying to build the future one qubit at a time.

IBM made some announcements this week about its plans for the next 10 years of quantum computing: there are new chips, new computers, and new APIs. You'll hear us get into more of the details as we go, but the important thing to know upfront is that quantum computers could have theoretically incredible amounts of processing power and could entirely revolutionize the way we think of computers if, that is, someone can build one that's actually useful.

Here's Jerry, explaining the basics of what a quantum computer is:

A quantum computer is basically a fundamentally different way of computing. It relies on the laws of quantum mechanics, but it just changes how information is handled. So instead of using bits, we have quantum bits or qubits.

A regular computer (the quantum folks call them classical computers), like an iPhone or a laptop or even a fancy Nvidia GPU, works by encoding data in bits. Bits basically have two states, which we call zero and one. They're on or they're off.

But the laws of quantum mechanics that Jerry just mentioned mean that qubits behave very, very differently. They can be zero or one, but they might also be a whole lot of things in between.

You still have two states: a zero and a one. But they can also be in superpositions of zero and one, which means that there's a particular probability that when you measure it, it will be zero or one. In terms of how we physically build these, they're not switches anymore, they're not transistors, but they're actually elements that have quantum mechanical behavior.
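As a concrete toy version of what Chow is describing, here's a short numpy sketch (our illustration, not IBM's code): a qubit as two complex amplitudes, with measurement outcomes drawn according to the squared magnitudes.

```python
# Toy model of a qubit: two complex amplitudes whose squared magnitudes
# give the probabilities of measuring 0 or 1 (the Born rule).
import numpy as np

rng = np.random.default_rng(seed=1)
state = np.array([np.sqrt(0.8), np.sqrt(0.2)])   # superposition of |0> and |1>
probs = np.abs(state) ** 2                       # -> [0.8, 0.2]
samples = rng.choice([0, 1], size=1_000, p=probs)
print(np.bincount(samples))                      # roughly [800, 200]
```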

One of my favorite things about all this is that in order to make these new quantum computers work, you have to cool them to within fractions of a degree of absolute zero, which means a lot of companies have had to work very hard on cryogenic cooling systems just so other people could work on quantum chips. Jerry calls early quantum computers science projects, but his goal is to engineer actual products people can use.

You'll hear Jerry talk about making a useful quantum computer in terms of utility, which is when quantum computers start to push against the limits of what regular computers can simulate. IBM has been chasing after utility for a while now. It first made quantum computers available on the cloud in 2016, it's shipped System One quantum computers to partners around the world, and now, this week, it's announcing System Two along with a roadmap for the future. It's Decoder, so I asked Jerry exactly how he and his team sit down and build a roadmap for the next 10 years of applied research in a field that requires major breakthroughs at every level of the product. Oh, and we talked about Ant-Man.

It's a fun one; very few people sit at the bleeding edge all day like Jerry.

Okay. Jerry Chow, director of quantum systems at IBM. Here we go.

This transcript has been lightly edited for length and clarity.

Jerry Chow, you are an IBM fellow and director of quantum systems. Welcome to Decoder.

I'm really excited to talk to you. There's quite a lot to talk about: quantum computing in general, and where it is. But you've got some news to announce today, so I want to make sure we talk about the news right off the bat. What is going on in IBM Quantum?


Yeah, so we have our annual Quantum Summit coming up, where we basically invite our network of members and users to come, and we talk about some of the really exciting news. What we're announcing this year is actually a really exciting upgraded quantum processor. It's called the IBM Quantum Heron. It has 133 qubits. It's the highest-performance processor that we've ever built, and it's going to be available for users to access via our cloud services.

We're also launching IBM Quantum System Two and introducing this as a new architecture for scaling our quantum computers into the future. We're also talking about a 10-year roadmap looking ahead. We, at IBM Quantum, like to sort of call our shots, tell everyone what we're doing because that keeps us honest, keeps everyone in the industry on the same benchmark of seeing what progress is. And we're expanding that roadmap, which we actually first introduced a couple of years ago and have hit all our milestones thus far. But we are extending it out to 2033, pushing forward into this next realm where we really want to drive toward quantum computing at scale.

So you've got a new processor, you've got a new computing architecture in System Two, you've got a longer roadmap. Put that in context for me: we've been hearing about quantum computing for quite a long time. I have stared at a number of quantum computers and been told, "This is the coldest piece of the universe that has ever existed." It's been very entertaining, at the very least. We're only now at the point where we're actually solving real problems with quantum computers.

We're not even at the point of solving real problems.

Not yet. But we are, really excitingly, just this past year, at the point where we're calling this utility-scale quantum computing. We're using 100-plus qubits. We used a processor earlier in the year called Eagle, where we were able to look at a particular problem that you couldn't really solve with brute-force methods using a classical computer, but also it challenged the best classical approximation methods that are used on high-performance computing. So what's interesting there is that now the quantum computer becomes like the benchmark. You almost need it to verify whether your approximate classical methods are working properly. And that just happens when you go over 100 qubits.

At 100 qubits, things all change so that you just can't use, say, GPUs or any kind of classical computers to simulate what's going on accurately. This is why we're in this phase where we call it utility scale, because there's going to be this back and forth between using a quantum as a tool compared with what you can still potentially do in classical. But then there's a long road there where we're going to try to drive value using the quantum to get toward quantum advantage.
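The arithmetic behind that claim is easy to check. A brute-force classical simulation stores one complex amplitude per basis state, 2^n of them for n qubits; here's a rough back-of-the-envelope (our numbers, not IBM's):

```python
# Memory needed to hold a full statevector of n qubits, at 16 bytes per
# complex128 amplitude. The blow-up past ~50 qubits is the whole story.
for n in (30, 50, 100):
    bytes_needed = (2 ** n) * 16
    print(f"{n} qubits: 2^{n} amplitudes ~ {bytes_needed / 2**30:.3g} GiB")
# 30 qubits: ~16 GiB (a laptop); 50 qubits: ~17 million GiB (a data center
# of RAM); 100 qubits: ~1.9e22 GiB, far beyond any conceivable machine.
```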

I think the word utility there threw me off. This is the branch point where the problems you solve with a quantum computer start to become meaningfully different than the problems you could solve with a regular computer.

That's right. We see this really as an inflection point. There are a lot of industries that use high-performance computation already, and they are looking at very, very hard problems that use the Oak Ridge supercomputers and whatnot. And now quantum becomes an additional tool that opens up a new lens for them to look at a different area of compute space that they weren't able to look at before.

So IBM has a huge program in quantum. The other big companies do, too: Microsoft, Google, what have you, they're all investing in this space. Does this feel like a classical capitalist competition, "We're all racing forward to get the first product to market"? Is it a bunch of researchers who know that there's probably a pot of gold at the end of this rainbow, but we're nowhere close to it yet, so we're all kind of friendly? What's the vibe?

I'd say that it's a very exciting time to be in this field. How often do you get to say you're building from the ground floor of a completely new computational architecture? Something that is just fundamentally different from traditional classical computing. And so yeah, I'd say that there's certainly a lot of groundswell, there's a lot of buzz. Sometimes a little too much buzz, maybe. But also I think from the perspective of competition, it helps drive the industry forward.

We, at IBM, have been at the forefront of computation for decades. And so it's in our blood. The ideas of roadmaps and pushing the next big development, the next big innovations in computation, have always been something that is just native to IBM, and quantum is no different. We've been in the game with quantum since the early theoretical foundations, for probably 30 years, 30-plus years. But now we're really starting to bear a lot of that fruit in terms of building the architectures, building the systems, putting out the hardware, developing the framework for how to make it usable and accessible.

Let me give you just a much dumber comparison. We had the CEO of AWS on the show, Adam Selipsky. AWS is furiously competitive with Microsoft Azure and Google Cloud. They are trying to take market share from each other, and they do a lot of innovative things to make better products, but the end goal of that is taking one customer away from Google. You're not there yet, right? There's not market share to be moved around yet?

Certainly not at that scale.

But are there quantum customers that you compete for?

There's certainly a growing quantum community.

[Laughs] It's not a customer; there are people who are interested.


There are people that are interested across the board, from developers, to students, to Fortune 500 companies. We have a lot of interest. So just as an example, we first put systems on the cloud in 2016. We put a very simple five-qubit quantum computer on the cloud. But it reflected a real fundamental shift in how quantum could be approached. Before, you had to be sort of a physicist. You had to be in a laboratory turning knobs. You're taking data, you're running physicist code; you're not programming a computer.

Wow. [Laughs] Shout out to physicists.

Well, I'm a physicist, and you don't want to see my code. [Laughs] But the whole point is that we developed a whole framework around it to actually deploy it and to make it programmable. And think about the early days of computers and all the infrastructure you needed to build in terms of the right assembly language and compilers and the application layers all above that. We've been building that for the last seven years since that first launched. And in that time, we've had over 500,000 users of our platform and of our services.

I'm always curious how things are structured and how decisions are made. That's really what we talk about on the show. And there's a forcing function that comes when it's a business, and there's a growth path. Quantum seems very much like one day it will be a huge business because it will solve problems that regular computers can't. But right now, it's on the very early part of the curve where you're investing a lot into R&D, on an aggressive roadmap, but you're nowhere close to the business yet.

I would say that we're knocking on the door of business value and looking for that business value, because especially when we're in this realm where we know that it can be used as a tool pitted against the best classical computers, there's something there to be explored. A lot of times, even with traditional computers, there are very few proven algorithms from which we drive all the value. A lot of the value that gets driven is done through heuristics, through just trial and error, through having the tool and using it on a particular problem. That's why we see this fundamental game-changer of this inflection point going toward utility-scale systems of over 100 qubits: now this is the tool that we want users to actually go and find business advantage with, find the problems that map appropriately onto these systems for exploration.

So put that in the context of IBM. IBM's a huge company, it's over 100 years old, it does a lot of things. This is probably the most cutting-edge thing IBM is doing, I imagine. I'm guessing you're not going to disagree with me. But it feels like the most cutting-edge thing that most of the Big Tech companies are doing.

How is that structured inside of IBM? How does that work?

So we're IBM Quantum within IBM Research. IBM Research has always been the organic growth engine for all of IBM. It's where a lot of the innovative ideas come in, but overall, a particular strategy within IBM and IBM Research is that we're not just doing research, and then we're going to do development, and then it's going to go on this very linearized product journey. It's all integrated together as we are moving forward. And so therefore, we have the opportunity within IBM Quantum that we're developing products, we're putting it on the cloud, we're integrating with IBM Cloud. We're actually pushing these things forward to build that user base, build that groundswell, before all the various different technology elements are finished. That's sort of this agile methodology of building this from the ground up, but also getting it out early and often to drive excitement and to really build up the other parts of the ecosystem.

So how is IBM Quantum structured? How many people is it? How is it organized?

So we don't speak explicit numbers, but we have several hundred people. And then we have parts of the team which are focused on the actual hardware elements, all the way down to the actual quantum processor and the system around it in terms of making those processors function by cooling them down in the cryogenic system, talking to them with control electronics, talking to them with classical computing. So it all needs to tie together.

Then you have software development teams. We also have a cloud and services team that helps to deliver our offerings as a service. And then we have applications teams looking at the next algorithms, the next novel ways of making use of our quantum services. We also have teams that are more outward-looking for business development, trying to drive adoption, working with various clients to engage with the problems of their interests. We also have a part of our team which runs an offering called the Quantum Accelerator. It's like a consulting arm, working with clients to get quantum-ready, start understanding how their problems can be impacted by quantum computing and start using our systems.

Is that all flat? Every one of those teams reports to you, or is there structure in between?

No, so all those different ones report to our vice president of quantum computing, which is Jay Gambetta. I take care of the systems part. Basically, the wrapping of the processor and how it runs, executing problems for the users, that's the piece that I own.

There's a tension there. It sounds like IBM is designed to attack this tension head-on, which is: We're doing a bunch of pure research in cryogenics to make sure that quantum computing can run, because it has to be really cold to run. Then there's a business development team that's just off and running, doing sales stuff, and at some point they're going to come back and say, "We sold this thing." And the cryogenics team is going to say, "Not yet." Every business has a problem like that. When you're in pure research mode, the "not yet" is a real problem.

How often do you run into that?

We have a very good strategy across the team. We know our core services and what our core product is. And also we have a roadmap. The concept of the roadmap is both great for the R&D side but also great for the client perspective, the business development angle, of seeing what's coming next. From the internal side, we know we've got to continue to drive toward this, and these are our deliverables and these are the new innovations that we need to do. In fact, in our new roadmap that we're releasing, we have that separated. Both a development roadmap, which is more product focused and more like what the end user and clients are going to get. And we have an innovation roadmap to show those things where we're still going to need to turn the crank and figure out what feeds in.

I often say the roadmap is our mantra, and it really is our calling card both internally and externally. Not many people really show a lot of detail in their roadmap, but it serves as a guiding tool for us all.

I was looking at that roadmap, and it is very aggressive. We're at Heron; there are many birds to come, from what I understand. And the goal is that a truly functional quantum computer needs thousands or millions of qubits, right?

We have a transition toward what we are calling quantum at scale, which I think is what you're referring to: when you get to the point where you can run quantum error correction, correct for all the errors that are underlying within these qubits, which are noisy. People throw around that number, millions of qubits, in a way that almost drives fear into the hearts of people. One actually really exciting thing that we've done this past year is we've developed a set of novel error correction codes that brings down that resource count a lot.

So actually, you'll need potentially hundreds of thousands of qubits, 100,000 qubits or so, to build a fault-tolerant quantum error-correction-based quantum computer of a particular size to do some of those problems that we're talking about at scale. And that's part of the roadmap, too. So that's what we're looking at further out, to the Blue Jay system in 2033. So there's certainly a number of birds to get there, but we have concrete ideas for the technological hurdles to overcome to get there.
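For a sense of the arithmetic, here's a crude sketch using generic surface-code-style scaling, not IBM's actual codes: if each logical qubit costs roughly 2d² physical qubits at code distance d, then a code that achieves the same protection at a lower distance slashes the total.

```python
# Hypothetical error-correction overhead: physical qubits ~ 2 * d^2 per
# logical qubit at code distance d. Numbers are illustrative, not IBM's.
def physical_qubits(logical: int, distance: int) -> int:
    return logical * 2 * distance ** 2

for d in (11, 17, 25):
    print(f"distance {d}: {physical_qubits(200, d):,} physical for 200 logical")
# distance 25 needs 250,000 physical qubits; a better code that gets by
# with distance 11 needs ~48,400, the kind of saving Chow describes.
```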

That's the goal. You're going to get to some massively larger scale than you are today. Orders of magnitude. Today the chip has 133 qubits; you need to get to thousands. Some people, terrifyingly, are saying millions.

Part of your strategy is linking the chips together into these more modular systems and then putting control circuitry around them. I'm a person who came up in what you might call the classical computing environment, and that's very familiar. That's a very familiar strategy; we're just going to do more cores. That's what that looks like to me. Lots of companies have run up against a lot of problems here. In that part of the world, there's just Moore's law, and we sit around talking about it all day long. And Nvidia and maybe TSMC have gotten over it this time, and Intel has struggled to get to the next process node and increase the transistor density. Is there an equivalent to Moore's law in quantum that you were thinking about?

Our roadmap is showing that type of progression.

I look at that roadmap, and you are definitely assuming a number of breakthroughs along the way, in the way that Intel just assumed them for years and years and achieved them, and then kind of hit the end of the road.

Even where we are today with Heron, and actually complementary to Heron this year, we also already built a 1,000-qubit processor, Condor. Its explicit goal was to push the limits of how many qubits we could put on a single chip, push the limits of how much architecture we could put in an entire system. How much could we actually cool down in the dilution refrigerators that we know today, the cryogenic refrigerators that we have today? Push the boundaries of everything to understand where things break. And if you look at the early part of our roadmap, the birds are there with various technological hurdles that we've already overcome to get toward this thousand-qubit level. And now those next birds that you see in the rest of the innovation roadmap are different types of couplers, different types of technologies, that are those technological hurdles, like in semiconductors, that allow us to bridge the gap.

Are they the same? Is it the same kind of, "We need to double transistor density," or is it a different set of challenges?


They're different, because with this sort of modular approach, there's some that are like: how many can we place into a single chip? How many can we place into a single package? How many can we package together within the system? So they all require slightly different technological innovations within the whole value chain. But we don't see them as not doable; we see them certainly as things that we will handle over the next few years. We're already starting to test linking between two packages via a cryogenic cable. This is toward our Flamingo demonstration, which we're planning for next year.

Do you get to leverage any of the things that are happening on the process side with classical computers?

Like TSMC hits three nanometers and you get to pull that forward, or is that different?

Not so explicitly to the newest stuff that's happening today in semiconductors. But IBM has been in the semiconductors game for many, many decades. And a lot of the work that we've achieved, with even achieving 100 qubits with Eagle a couple of years ago, was because we had that deep-rooted semiconductor background. So just to give you an example, at 100 qubits, the challenge is how do you actually wire to 100 qubits in a chip? The standard thing you do in semiconductors is you go to more layers, but it's not so easy to do that in these superconducting quantum circuits because the extra layers might mess up the qubits. They might cause them to decohere.

But because of our know-how with packaging, we found the right materials, we found the right way of using our fabrication techniques to implement that type of multilayer wiring and still talk to these 100 qubits. We evolved that further this past year to actually get to 1,000. And so that type of semiconductor know-how is just ingrained and something where, I'd say, the decades of experience matter.

So you're going to build the next-generation quantum computing chip, Heron. It's got 133 qubits. How is that chip manufactured?

Okay. Well, to build the next-generation quantum computing chip, we rely on advanced packaging techniques that involve multiple layers of superconducting metal to package and to wire up various superconducting qubits. With Heron, we're also using a novel tunable coupler architecture, which allows us to have world-record-performing two-qubit gate qualities. And all this is done in a standard fabrication facility that we have at IBM; we package up this chip, and then we have to cool it down in a cryogenic environment.

So silicon goes in one side of the building, Heron comes out the other?

I mean, certainly more steps than that. [Laughs] And there's this know-how of how to do it properly to have high-performing qubits, which we've just built up.

Explain to me what a high-performing qubit is.

Yeah, so the tricky thing with these qubits... There are different ways of building qubits. There are people who use ions and atoms and electrons and things like that, but ours are actually just metal on a substrate; they're circuits. They're much like the circuits that you might see when you look inside of a standard chip. But the thing with these circuits is that you can basically arrange them in a certain way and use the right materials, and you have a qubit that, in this case, for superconducting qubits, resonates at five gigahertz.

If you choose the wrong materials, the lifetimes of these qubits can be extremely short. So when we first started in the field of building superconducting qubits in 1999, superconducting qubits lasted for maybe two nanoseconds, five nanoseconds. Today, we've gotten up to close to a millisecond, hundreds of microseconds to a millisecond. That's already orders of magnitude longer, but it took many years of development. And at the point of a few hundred microseconds, we're able to do all these complex operations that we've been talking about to push this utility scale that we discussed earlier. So that know-how to increase that lifetime comes down to engineering, comes down to understanding the core pieces that generate loss in the materials, and that's something that we certainly have expertise in.
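The practical meaning of those lifetimes is a simple ratio (our back-of-the-envelope, with an assumed 100 ns gate time, not an IBM figure): coherence time divided by gate time caps how many operations you can run before the qubit forgets its state.

```python
# Rough gate budget: how many 100 ns gates fit inside one coherence time.
# The gate duration is an assumed, illustrative figure.
GATE_TIME = 100e-9

for label, t_coh in (("1999-era (5 ns)", 5e-9),
                     ("today (100 us)", 100e-6),
                     ("today, best (1 ms)", 1e-3)):
    print(f"{label}: ~{t_coh / GATE_TIME:,.0f} gates per coherence window")
# 5 ns cannot fit even one gate; a millisecond allows ~10,000 operations,
# which is what makes the utility-scale experiments possible.
```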

Tell me about the industry at large. So IBM has one approach: you said you're using metals on a substrate. You're leveraging all of the semiconductor know-how that IBM has. When you're out in the market and you're looking at all your competitors, Microsoft is doing something else, Google something else. Go through the list for me. What are the approaches, and how do you think they're going?

When we think about competitors, you can think about the platform competitors, who's building the services, but I think what you're pointing to more is the hardware side.

When it comes down to it, there's a simple set of metrics for you to compare the performance of the quantum processors. It's scale: what number of qubits can you get to and build reliably? Quality: how long do those qubits live for you to perform operations and calculations on? And speed: how quickly can you actually run executions and problems through these quantum processors? And that speed part is something where it's an interplay between your quantum processor and your classical computing infrastructure because they talk to one another. You don't control a quantum computer without a classical computer. And so you need to be able to get your data in, data out, and process it on the classical side.

So scale, quality, speed. Our approach with superconducting qubits, to the best of our knowledge, can hit all three of those in a very strong way. Scale: pushed up to over 1,000 qubits. We know that we can build up to 1,000 qubits already with the technologies that we've built. For quality, Heron, which we're releasing, has the best gate quality, the gate qualities that have been shown across a large device. And then speed: in terms of just the execution times, we're on the order of microseconds for some of the clock rates, whereas other approaches can be thousands of times slower.

What are the other approaches in the industry that you see, and where are they beating you and where are you ahead?

So there are trapped ions: basically they're using ions, like caesium and things that you might use for clocks, atomic clocks. They can have very good quality. In fact, there are some results that have tremendous performance across a number of those types of trapped-ion qubits in terms of their two-qubit gate qualities. But they're slow. In terms of the clock rates of getting your operations in, getting your operations out, you do operations to recycle the ion sometimes. And that's where it, I'd say, has a downside.

I'd say, right now, superconducting qubits and trapped ions are the approaches that have the most prominence at the moment, that have been put out in terms of usable services. Atoms have also emerged; it's very similar to the trapped ions. There, they use these fun little things called optical tweezers to hold atoms in little arrays. And there are some exciting results that have been coming out from various atom groups there. But again, it comes down to that speed. Anytime you have these actual atomic items, either an ion or an atom, your clock rates end up hurting you.

Alright, let me make a comparison to semiconductors again. So in semiconductors there was multiple-patterning lithography that everyone chased for a minute, and it hit an end state. And then TSMC had bet really big on EUV, and that let them push ahead. And Intel had to make a big shift over there. You're looking at your roadmap, you're doing superconductors, cryogenics, metals on substrates, and over here some guys are doing optical tweezers on atoms. Is there a thought in your head like, "We better keep an eye on this because that might be the process innovation that we actually need"?

I think overall, in the landscape, we're always keeping track of what's going on. You're always seeing what the latest innovations in the various different technologies are.

Is that even a good comparison to semiconductors in that way?

The whole systems are completely different. The architectures are not so compatible. At some level, with the nodes of your semiconductors, there might be certain kinds of know-how that translate, how you route and lay out, maybe. And here, above a certain layer, there's also going to be commonality in terms of the compute platform, how the quantum circuits are generated. The software layers might be similar, but the actual physical hardware is very different.

It feels like the thing we're talking about is how do you make a qubit? And it's not settled yet. You have an approach that you're very confident in, but there's not a winner in the market.

I mean, we're pretty confident. We're pretty confident in superconducting qubits.

Fair enough. [Laughs] I was just wondering.

It's why we're able to prognosticate 10 years forward, that we see the direction we're going. And to me it's more that there are going to be innovations within that, which are going to continue to compound over those 10 years, that might make it even more attractive as time goes on. And that's just the nature of technology.

You've got to make decisions on maybe the longest timeline of anyone I've had on the show. It's always the chip people who have the longest timelines. I talk to social media CEOs, and it's like their timeline is five minutes from now, like, "What are we going to ban today?" That's the timeline. I talk to chip folks, and your timelines are decades. You just casually mentioned a chip you're going to ship in 2033. That's a long time from now. How do you make decisions on that kind of timeline?

There's the near-term stuff, obviously, and the roadmap serves as that guide. That roadmap is constructed so that all these various things do impact that long-term delivery.

Just walk me through: What does the quantum computing roadmap meeting look like? You're all in a conference room; are you at the whiteboard? Paint the picture for me.


Yeah, that is a great question. I mean, we have a number of us who are sitting there. We certainly know that we have certain types of technical foundations that we know that we need to include into these next-generation chips and systems.

For this roadmap, we said, "We know at some point we need to get quantum error correction into our roadmap." And with that technical lead, we know what the requirements are. So first we said, "Okay, let's put it here. Now let's work backward." It says that we need to do this innovation and this innovation by this date, and this other innovation in the software stack or whatever by this date. And then we say, "Oh shoot, we ran out of time. Let's move back a little bit." And so we do a little bit of that planning, because we also want to do it so that we lay out this roadmap with what we often call no-regrets decisions. We don't want to do things that are just for the near term. We want to really pick strategies that give us this long-term path.

It's why we talk about utility scale so much in terms of what we can do with Herons and soon Flamingos. But everything that we want to build on top of what we can do there will translate to what we can do when we get those systems at scale, including error correction. And in terms of the roadmap planning: we're not done, by the way. We have this overall framework for the 10-year roadmap, and then we need to refine. We've got a lot of details still to work on in terms of what those things are that need to be worked on across the software layer, the compiler layer, the control electronics layer, and certainly at the processor layer.

Is there commercial pressure on this? Again, this is a lot of cost at a big public company. Is the CEO of IBM in that room saying, "When's this going to make money? Move it up"?

I think the point is, our mission is to bring useful quantum computing to the world. I've been working in this area for 20 years now. We've never been this close to being able to build something that is driving real value. And so I think when you look at our team, we are all aligned along that mission. We started with just getting it out there in the cloud in terms of building the community. Now, we fundamentally see this as a tool that will alter how users are going to perform computation. And so there has to be, and I expect there to be, value there. And we've seen how the HPC community has progressed, and we've seen how supercomputing has... You can see what's happening with the uptake of AI and everything. We build it, we will build the community around it, and we'll drive value.

Let's talk about AI for a second. This is a really good example of this. AI demand is through the roof. The industry is hot. We'll see if the products are long-lasting, but there seems to be real consumer demand for them. And that has all translated into a lot of people wanting a lot of Nvidia H100 chips. It's very narrowly focused on one kind of processor. Do you see quantum systems coming into that zone where we're going to run a lot of AI workloads on them? Like future AI workloads.

What's happened in AI is phenomenal, but we're not at the point where the quantum computer is this commodity item where we're just buying tons of chips. You're not fabricating millions of these chips. But we are going to build this supercomputer based on quantum computing, which is going to be exquisitely good at certain types of tasks. And so the framework that I actually see is: already you're going to have your AI compute clusters. The way that people run workloads today, I'm sure they are running some parts on their regular computers, on their own laptops, but parts of the job get fed out to the cloud, to their hyperscalers, and some of them are going to use the AI compute nodes.

We see that also for how quantum will feed in. It'll be another part of that overall cloud access landscape where you're going to take a problem and you're going to break it down. You're going to have parts of it that run on classical computing, parts of it that might run on AI, parts of it that will leverage what we call quantum-centric supercomputing. That's the best place to solve that one part of the problem. Then it comes back, and you've got to stitch all that together. So from the IBM perspective, where we often talk about hybrid cloud, that's the hybrid cloud that connects all these pieces together. And differentiation is there in terms of building this quantum-centric supercomputer within there.

So your quantum-centric supercomputer is in the cloud. We've talked a lot about superconducting now. You need a data center that's very cold. This does not seem like a thing that's going to happen locally, for me, ever, unless LK-99 is real. This isn't going to happen for anyone in their home, outside of an IBM data center, for quite some time.

I would say this. So when I was first working in this area and did my PhD in this area (I worked on superconducting qubits), we required these large canisters, these refrigerators, where we needed to wheel up these huge jugs of liquid helium and fill them every three days to keep them cold. Now, that's a physics experiment. I mean, there have already been innovations in cryogenics such that they're turnkey: you plug them in, they stay running, they can run for years and maintain your payloads at the right temperatures. You're paying electricity, obviously, to keep them cold. But we're seeing innovations there, too, in terms of driving infrastructure-scale cryogenics. Honestly, we're going to evolve the data center of the future, just like data centers today have evolved to handle increased compute resources needed. We will work hand in hand with how to build these quantum data centers, and we're already doing that. So we have a quantum data center up in Poughkeepsie, which hosts the majority of our systems, and we're planning on expanding that further.

Original post:
IBM quantum computing updates: System Two and Heron - The Verge

Airbus and BMW Group launch Quantum Computing Competition to tackle their most pressing mobility challenges. – BMW Press

Santa Clara (CA), 6 December 2023 - Airbus and BMW Group launch a global Quantum Computing Challenge entitled The Quantum Mobility Quest to tackle the most pressing challenges in aviation and automotive that have remained insurmountable for classical computers.

This challenge is the first of its kind, bringing together two global industry leaders to harness quantum technologies for real-world industrial applications, unlocking the potential to forge more efficient, sustainable and safer solutions for the future of transportation.

"This is the perfect time to shine a spotlight on quantum technology and its potential impact on our society. Partnering with an industry leader like BMW Group enables us to mature the technology as we need to bridge the gap between scientific exploration and its potential applications. Were seeking the best-in-class students, PhDs, academics, researchers, start-ups, companies, or professionals in the field, worldwide to join our challenge to create a massive paradigm shift in the way aircraft are built and flown." says Isabell Gradert, Vice President Central Research and Technology at Airbus.

"Following the success of previous editions of Quantum Computing Challenges by BMW Group and Airbus, we are gearing up for a new wave of innovation, exploring the technology's capabilities for sustainability and operational excellence," said Dr. Peter Lehnert, Vice-President, Research Technologies at BMW Group. "The BMW Group is clearly aiming at positioning itself at the crossroads of quantum technology, the global ecosystem, and cutting-edge solutions. By doing so, we strongly believe in major advances when it comes to sustainable materials for batteries and fuel cells, to generating unique and efficient designs, or to enhancing the overall user experience in BMW Group products."

Quantum computing has the potential to significantly enhance computational power and to enable the most complex operations that challenge even today's best computers. In particular, for data-driven industries like the transportation sector, this emerging technology could play a crucial role in simulating various industrial and operational processes, opening up opportunities to shape future mobility products and services.

Challenge candidates are invited to select one or more problem statements: improved aerodynamics design with quantum solvers, future automated mobility with quantum machine learning, more sustainable supply chain with quantum optimisation, and enhanced corrosion inhibition with quantum simulation. Additionally, candidates can put forward their own quantum technologies with the potential to develop native apps yet to be explored in the transportation sector.

The challenge is hosted by The Quantum Insider (TQI) and divided into two parts: a four-month phase where participants will develop a theoretical framework for one of the given statements, and a second phase during which selected finalists will implement and benchmark their solutions. Amazon Web Services (AWS) provides candidates with an opportunity to run their algorithms on its Amazon Braket quantum computing service.
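For readers wondering what that looks like in practice, here is a minimal sketch using the Amazon Braket SDK (our example, not part of the challenge materials): it builds a two-qubit Bell-pair circuit and samples it on the SDK's free local simulator rather than on paid QPU hardware.

```python
# Minimal Amazon Braket example: prepare and sample a Bell pair locally.
# Requires: pip install amazon-braket-sdk
from braket.circuits import Circuit
from braket.devices import LocalSimulator

bell = Circuit().h(0).cnot(0, 1)               # entangle qubits 0 and 1
result = LocalSimulator().run(bell, shots=1000).result()
print(result.measurement_counts)               # ~ {'00': 500, '11': 500}
```

Swapping the LocalSimulator for one of Braket's managed quantum devices changes only the device line, which is what makes the service convenient for benchmarking challenge solutions.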

A jury composed of world-leading quantum experts will team up with experts from Airbus, BMW Group, and AWS to evaluate submitted proposals and award one winning team a €30,000 prize in each of the five challenges by the end of 2024.

Registration opens today, and submissions will be accepted from mid-January through April 30, 2024 here: http://www.thequantuminsider.com/quantum-challenge.

If you have any questions, please contact:

Press and Public Relations Janina Latza, Spokesperson BMW Group IT Tel.: +49 (0)151 601 12650 E-Mail: Janina.Latza@bmw.de

Christophe Koenig, Head of BMW Group IT, Digital and Driving Experience Communications, BMW Group Design, Innovations and Digital Car Communications Tel.: +49-89-382-56097 E-Mail: Christophe.Koenig@bmwgroup.com

Visit link:
Airbus and BMW Group launch Quantum Computing Competition to tackle their most pressing mobility challenges. - BMW Press

POSCO Holdings and QC Ware Revolutionize Battery Simulation with Quantum Computing – PR Newswire

SEOUL, Republic of Korea and PALO ALTO, Calif., Dec. 7, 2023 /PRNewswire/ -- POSCO Holdings and QC Ware Corp. today announced that they are jointly developing revolutionary new techniques for the simulation of battery materials on quantum computers.

POSCO Holdings and QC Ware revolutionize battery simulation with quantum computing.

Proliferation of electric vehicles, growing energy requirements, and the imperative for sustainability are continuing to drive demand for batteries that last longer and require less time to charge. Design of new battery materials involves experimental production and testing, which are both costly and time-consuming. Material simulations could significantly accelerate the design process by predicting the most promising candidates before any experiment is conducted. However, current methods on classical computers suffer from either limited accuracy or excessive computational cost.
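To see why the classical cost becomes prohibitive, here is a toy illustration (ours, not the methods POSCO Holdings and QC Ware are developing): exact diagonalization of a small spin chain, an approach that is exact but operates on matrices of size 2^n by 2^n and so collapses long before realistic material sizes.

```python
# Toy classical baseline (not the POSCO/QC Ware method): exact ground-state
# energy of an n-spin Heisenberg chain via dense diagonalization.
import numpy as np

def heisenberg_ground_energy(n: int) -> float:
    sx = np.array([[0, 1], [1, 0]]) / 2
    sy = np.array([[0, -1j], [1j, 0]]) / 2
    sz = np.array([[1, 0], [0, -1]]) / 2
    dim = 2 ** n
    H = np.zeros((dim, dim), dtype=complex)
    for i in range(n - 1):                      # nearest-neighbour bonds
        for s in (sx, sy, sz):
            op = np.eye(1)
            for site in range(n):               # build the S_i . S_{i+1} term
                op = np.kron(op, s if site in (i, i + 1) else np.eye(2))
            H += op
    return float(np.linalg.eigvalsh(H)[0])      # lowest eigenvalue

print(heisenberg_ground_energy(8))   # instant for 8 spins; 50 spins is hopeless
```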

POSCO Holdings and QC Ware have joined forces on a grant from the Korean government to quantify the utility and advantage of quantum computers for the accurate and efficient simulation of candidate battery materials. The collaboration will concentrate on the simulation of realistic solid-state electrolytes for lithium batteries and benchmark new quantum computing methods against the best approaches currently in use today.

This research is supported by the National Research Foundation of Korea (NRF) of the Ministry of Science and ICT (RS-2023-00257288). Earlier in the year, POSCO Holdings applied for the 'Quantum Advantage Challenge Research based on Quantum Computing' grant under the project titled 'Development of Simulation Technology for Eco-Friendly Material Based on Quantum Computing'.

The collaboration is spearheaded by the AI R&D Laboratories of POSCO Holdings' New Experience of Technology Hub, with the directive to apply new approaches of simulating battery materials to quantum computers.

"With the world moving toward diverse and flexible energy solutions, it is essential to develop more performant batteries to be integrated in future, sustainable energy grids." said Robert Parrish, SVP of Quantum Chemistry at QC Ware Corp. "Computational simulations are playing a growing role in the design of new materials, and this collaboration with POSCO Holdings is essential to QC Ware's mission: developing quantum algorithms that accelerate the timeline to quantum computers impacting real-world use cases."

POSCO Holdings

POSCO Group, which was launched in 1968 as a steel company, switched to a holding company system centered on POSCO Holdings in March last year. Since then, steel, rechargeable battery materials, lithium and nickel, hydrogen, energy, construction/infrastructure, and food (Agri-Bio) have been selected as seven key projects to discover the group's future growth engines and foster its business portfolio. Based on this, POSCO Group will grow into a leading supplier of eco-friendly future materials that ushers in a sustainable future.

QC Ware

QC Ware is a quantum and classical computing software and services company focused on delivering enterprise value through cutting-edge computational technology. With specialization in machine learning and chemistry simulation applications, QC Ware develops for both near-term quantum and state-of-the-art classical computing hardware. QC Ware's team is composed of some of the industry's foremost experts in quantum and classical computing. QC Ware is headquartered in Palo Alto, California, and supports its European customers through its subsidiary in Paris and customers in Asia through its business development office in Tokyo, Japan. QC Ware also organizes Q2B, a global series of conferences for industry, practitioner, and academic quantum computing communities.

SOURCE QC Ware Corp.

Read more here:
POSCO Holdings and QC Ware Revolutionize Battery Simulation with Quantum Computing - PR Newswire

Has IBM cracked the code of quantum computing by solving data errors? – Euronews

One of the main issues in developing the machines is that they often struggle with data errors. But IBM says its chips could make a difference.

Technology giant IBM has reached a major milestone in its quantum ambitions and has unveiled a new chip and machine that it hopes can help solve problems beyond the scope of traditional computers.

The unveiling at an IBM event in New York on Monday comes as companies and countries race to develop quantum machines, which can carry out large numbers of calculations simultaneously and at incredible speeds.

The new chip has more than 1,000 qubits, the quantum equivalent of the digital bits in an ordinary computer.

One of the main issues in developing the machines is that they often struggle with data errors. However, IBM said it has a new method to connect chips inside machines, which can then connect machines, and that combined with a new error-correction code this could produce even more capable quantum machines in 10 years.

The first machine to use them is called Quantum System Two, which uses three so-called "Heron" chips.

"We are firmly within the era in which quantum computers are being used as a tool to explore new frontiers of science," said Dario Gil, IBMs senior vice president and director of research.

"As we continue to advance how quantum systems can scale and deliver value through modular architectures, we will further increase the quality of a utility-scale quantum technology stack and put it into the hands of our users and partners who will push the boundaries of more complex problems".

IBM did not predict when it could go commercial with quantum machines.

At the annual IBM Quantum Summit, the company also unveiled 10 projects that showed off the potential power of quantum computing, such as for drug discovery.

The scale-up Algorithmiq, which is developing quantum algorithms to solve problems in life sciences, was one of them and successfully ran one of the largest-scale error mitigation experiments to date on IBM's hardware. It said the achievement positions it alongside IBM as a front runner to reach quantum utility, referring to a quantum computer's ability to perform reliable computations beyond the capabilities of regular computing methods, for real-world use cases.

"Today represents further validation that Algorithmiq's core error mitigation techniques are powerful and will enable large-scale experiments on specific use cases, leading us well into the quantum utility era for real commercial applications," said Sabrina Maniscalco, co-founder and CEO of Algorithmiq.

"I've dedicated over 20 years of my life to the study of noisy quantum systems, as a professor, and I never thought this type of experiment would be possible so soon," she said in comments to Euronews Next.

Additionally, IBM is pioneering the use of generative AI for quantum code programming through its enterprise AI platform, watsonx.

"Generative AI and quantum computing are both reaching an inflection point, presenting us with the opportunity to use the trusted foundation model framework of watsonx to simplify how quantum algorithms can be built for utility-scale exploration," said Jay Gambetta, Vice President and IBM Fellow at IBM.

"This is a significant step towards broadening how quantum computing can be accessed and put in the hands of users as an instrument for scientific exploration".

Go here to see the original:
Has IBM cracked the code of quantum computing by solving data errors? - Euronews

Quantum Market, Though Small, will Grow 22% and Hit $1.5B in 2026 – HPCwire

Few markets as small as the quantum information sciences market generate as much lively discussion. Hyperion Research pegged the worldwide quantum market at $848 million for 2023 and expects it to reach ~$1.5 billion in 2026, according to its annual quantum computing (QC) market update presented at the Q2B Silicon Valley conference held in Santa Clara this week.
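A quick sanity check of those numbers (our arithmetic, not Hyperion's): growing from $848 million in 2023 to roughly $1.5 billion in 2026 implies a compound annual growth rate of about 21 percent, consistent with the headline figure.

```python
# Implied compound annual growth rate from Hyperion's 2023 and 2026 figures.
cagr = (1500 / 848) ** (1 / 3) - 1
print(f"implied CAGR: {cagr:.1%}")   # -> 20.9%
```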

Bob Sorensen, Hyperion Research's chief quantum analyst, who presented the market update, told HPCwire, "I think that a positive, if not robust, market projection is justified. The QC ecosystem is becoming more sophisticated and granular, with increased opportunities from QC processor suppliers, targeted classical control system vendors, QC systems integrators, software orchestration firms, and a growing base of sector-specific QC applications developers. All that adds up to a more finely-tuned QC solution well suited to the particular requirements of any potential QC end user, making quantum computing a more attractive compute option going forward."

It is sobering that there are so many uncertainties remaining in QC writ large, ranging from figuring out what will be the quantum transistor (e.g., the preferred qubit modality), to implementing needed error correction and scaling up system size, to ultimately building a library of quantum algorithms and applications to fulfill quantum computing's tantalizing promise.

What's not uncertain is the global race among quantum believers, including governments, companies, and academia, all chasing the goal. For example, the U.S. is expected to reauthorize the National Quantum Initiative Act for a second five years sometime this month, and major international organizations from around the world assisted Hyperion in conducting its most recent QC market survey.

Having missed out on the semiconductor revolution, the underpinning of the modern electronics industry, many regions (small and large) are jumping in so as not to miss the quantum revolution. For the moment, the quantum computing ecosystem retains its roughly bi-modal nature, with a few giants and very many smaller companies jostling for sway.

The make-up of the Hyperion survey is a broad reflection of the QC market. Twenty-four respondent companies had total (not just quantum) revenues of more than $10 billion, and 39 had less than $15 million. Only two companies reported more than $50 million in quantum revenue. The long (irregular) tail of 66 companies with under $1 million is more broadly representative of the aspiring QC market.

A relative newcomer to the Hyperion outlook is a more bullish attitude towards deployment of on-premise quantum systems. Both IBM and D-Wave have deployed their systems at user facilities in the past, but no others. Just this year, both QuEra (neutral-atom-based qubits) and IonQ (trapped-ion qubits) announced plans to offer on-premise systems, and HPCwire has talked with at least one quantum industry veteran who's planning a quantum integrator business model to assist in deploying and integrating quantum systems into datacenters.

Sorensen said: "The positive future of QC installations on-premises is clear, at least to me. Despite many of the current advantages of QC access via the cloud (pay-as-you-go options, the ability to switch qubit modalities and vendors easily, and the relatively low capex requirements during the exploratory phase), there will be increasing interest from QC end-user firms that have any number of reasons to use an on-prem QC, including the need to protect proprietary information, speed tightly integrated hybrid quantum/classical algorithms, ensure 24/7 access to a specific machine, and, likely in cases where QC usage is high, secure a lower-cost set-up than a cloud access alternative."

"In addition, many HPC sites are and will be looking to bolster in-house QC expertise, and having a system on site offers more opportunity to do that versus a cloud-based option. That said, issues to be ironed out include buy versus lease, especially at a time when hardware advances are happening quickly; decisions about which quantum modality, architecture, and vendor to commit to; and the ability to effectively integrate an on-premises QC into an existing classical HPC ecosystem," he said.

In keeping with past studies, the top targeted sectors remain steady, although the financial services (FS) sector dropped from the top spot. Prospective QC end-user attitudes about demand drivers are interesting in that they reflect, for example, the growing recognition that the traditional HPC hardware paradigm is stuck. All netted out, QC user budget expectations are also up.

On balance, Sorensen's view of QC prospects is positive.

"The QC sector currently is marked by a wide range of innovation, with many questions about which quantum hardware and software will eventually reign supreme," he said. "However, a sure sign of viable technology, especially one that could drastically redefine something as far-reaching and entrenched as the classical IT sector, is that exploration is taking place across academic, government, and a vast array of commercial entities."

All this does is ensure that every considered quantum option will have its opportunity to shine, but only if it can prove its merits. There will be a range of companies that enter the market, with some departing, some being consolidated, and some pivoting to new opportunities. But as long as the overall scope of innovation stays on an upward trajectory, future prospects for the QC sector are good.

A new consideration is the emergence of LLMs and concern regarding what impact they will have on efforts and funds flowing into the quantum ecosystem. At the moment, the quantum community doesn't seem overly worried. It should also be noted that there are many efforts to harness LLMs as education tools for quantum computing, as well as coding aids that enable developers to write code for quantum computers without having to master quantum-specific tools. Jay Gambetta, VP of IBM Quantum, told HPCwire recently, "We [think] the full power of using quantum computing will be powered by generative AI to simplify the developer experience."

As with all things quantum computing, a measure of caution is smart; Hyperion, for example, couched its outlook as estimates rather than firm forecasts. There are still a lot of moving pieces in the gradually coalescing quantum landscape puzzle.

Here is the original post:
Quantum Market, Though Small, will Grow 22% and Hit $1.5B in 2026 - HPCwire

EU declares aim to become ‘quantum valley’ of the world – TNW

Q-day (the day when quantum computers actually manage to break the internet) may be some time away yet. However, that does not mean that companies and states shouldn't hop on the qubit bandwagon now, so as not to be left behind in the race for a technology that could potentially alter how we think about life, the Universe, and, well, everything.

Spurred on by a discourse that more and more revolves around the concept of digital sovereignty, 11 EU member states this week signed the European Declaration on Quantum Technologies.

The signatories have agreed to align, coordinate, engage, support, monitor, and all those other international collaboration verbs, on various parts of the budding quantum technology ecosystem. They include France, Belgium, Croatia, Greece, Finland, Slovakia, Slovenia, Czech Republic, Malta, Estonia, and Spain. However, the coalition is still missing some quantum frontrunners, such as the Netherlands, Ireland, and Germany, who reportedly opted out due to the short time frame.

"Quantum computing, simulation, communication, and sensing and metrology are all emerging fields of global strategic importance that will bring about a change of paradigm in technological capacities," the declaration begins.


It further states that the bloc's innovators and industry have not yet sufficiently mobilised to take full advantage of this potential as much as in other regions of the world. As such, it stresses the importance of building domestic R&D capacities for quantum technologies, as well as producing devices and systems based on them.

In addition, the bloc needs to invest in the whole quantum stack, from hardware to software, applications, and standards, so as to safeguard strategic assets, interests, autonomy, and security.

The ultimate aim is to create a globally competitive ecosystem that can support a wide range of scientific and industrial applications, identify the industrial sectors where quantum technologies will have high economic and societal impact, and foster quantum innovation in small and large companies alike, from promising startups and scaleups to major industrial players. In short, the aim is to become "the quantum valley of the world," the declaration reads.

Thierry Breton, whose time as Commissioner for the Internal Market has been marked by a big-tech regulation crusade, has declared quantum one of his favourite subjects. We can expect to see even more of a push towards greater collaboration across the bloc should he land the top job of Commission President next year.

Potentially, Breton could get more member states on board to coordinate on a more detailed bloc-wide quantum strategy. With quantum engineering talent notoriously difficult to come by, this could indeed be crucial to keeping Europe from getting left behind in yet another key technology race.

Follow this link:
EU declares aim to become 'quantum valley' of the world - TNW

Getting Ready For The Quantum Computing Era: Thoughts On Hybrid Cryptography – SemiEngineering

Using a classical cryptographic algorithm alongside its quantum safe equivalent.

Once quantum computers, or more specifically Cryptographically Relevant Quantum Computers (CRQCs), become powerful and reliable enough, they will enable adversaries to break current asymmetric encryption, placing important data and assets at risk. New digital signatures and key encapsulation mechanisms (KEMs) are needed, and while considerable progress has been made in recent years to develop new quantum-resistant algorithms, there are still ongoing discussions in the industry about the best way to implement them in the various security protocols that the industry requires.

The concept of hybrid cryptography is to use two or more fundamentally different algorithms that offer similar cryptographic functionality. In the context of Quantum Safe Cryptography more specifically, it refers to using a combination of classical cryptographic algorithms, for example, X25519 elliptic curve key exchange or ECDSA, in combination with Quantum Safe equivalents such as ML-KEM / FIPS 203 and ML-DSA / FIPS 204.

Hybrid cryptography comes in two flavors, sometimes referred to as AND hybrid and OR hybrid. The latter, as the name suggests, means that both algorithms are supported, and protocols can choose which of the two they prefer. This minimizes performance impact and is important to ensure mission continuity during the transition to Quantum Safe algorithms in heterogeneous systems where not all components can transition at the same time.

On the other hand, it also means that communications protected only by classical ECC / RSA cryptography are vulnerable to CRQCs, and communications protected by Quantum Safe algorithms suffer from the much newer, less tested code base for these algorithms. On top of that, OR hybrid applications need to be designed specifically to prevent downgrade attacks. OR hybrid is more often simply subsumed within crypto agility discussions.

More often, when people talk about hybrid cryptography in the context of Quantum Safe algorithms, they refer to the AND hybrid model, where a classical and a Quantum Safe algorithm are combined to ensure security even if one of the algorithms or its implementation is broken. In the case of a key exchange, for example, this means that the session key will be derived in equal parts from a classical method such as X25519 and a Quantum Safe algorithm such as ML-KEM / FIPS 203. One example of this can be found in the provision of NIST SP800-56C Rev 2 that allows concatenation of two session secrets into a combined session secret from which the session key is derived. There are also various RFC proposals, such as draft-tls-westerbaan-xyber768d00, that are actively being worked on to support AND hybrid key exchanges for use in TLS. In terms of signatures, an AND hybrid scheme would only return valid if both the classical and the Quantum Safe signatures are successfully verified.
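To make the AND hybrid key derivation concrete, here is a minimal Python sketch. It is our own illustration, not Rambus or NIST reference code: random byte strings stand in for the X25519 and ML-KEM shared secrets, and a single-block HKDF (RFC 5869) plays the role of the key-derivation function applied to the concatenated secrets, in the spirit of NIST SP800-56C Rev 2.

```python
import hashlib
import hmac
import os

def hkdf_sha256(secret: bytes, info: bytes, length: int = 32) -> bytes:
    """Minimal one-block HKDF (RFC 5869) with SHA-256: extract, then expand."""
    prk = hmac.new(b"\x00" * 32, secret, hashlib.sha256).digest()
    return hmac.new(prk, info + b"\x01", hashlib.sha256).digest()[:length]

# Placeholder shared secrets: in a real protocol these would come from an
# X25519 key exchange and an ML-KEM (FIPS 203) encapsulation, respectively.
classical_secret = os.urandom(32)
quantum_safe_secret = os.urandom(32)

# AND hybrid: concatenate both secrets and run them through one KDF.
# The derived session key stays safe as long as EITHER input secret
# remains unbroken.
session_key = hkdf_sha256(classical_secret + quantum_safe_secret, b"hybrid-demo")
print(session_key.hex())
```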

The Rambus Quantum Safe IP Portfolio allows for the implementation of hybrid cryptography. The Rambus QSE-IP-86 Quantum Safe Engine is a standalone cryptographic core that supports the NIST draft standards FIPS 203 ML-KEM and FIPS 204 ML-DSA and provides SHAKE-128 and SHAKE-256 acceleration. It can be combined with an accelerator for traditional asymmetric cryptography such as the Rambus PKE-IP-85 core that accelerates classic public key cryptography and a TRNG-IP-76 core that generates true random numbers. The Rambus RT-600 family of Root of Trust cores provides a robust integrated solution embedding engines and firmware that support both the full suite of CNSA 1.0 classic and CNSA 2.0 Quantum Safe algorithms (including NIST SP 800-208 XMSS/LMS hash-based verification) that can be used to implement AND hybrid solutions, offering system security management for use cases like secure boot, secure debug, secure firmware upgrade, lifecycle and SKU management, platform attestation and authentication.

Join me for my webinar Protecting Devices and Data in the Quantum Era on January 10, 2024, to learn about all the latest developments in Quantum Safe Cryptography and how you can protect your past, current, and future data in the quantum computing era.

Follow this link:
Getting Ready For The Quantum Computing Era: Thoughts On Hybrid Cryptography - SemiEngineering

IBM quantum roadmap targets inflection point by 2029 – CoinGeek

After reaching impressive milestones in 2023, technology giant IBM (NASDAQ: IBM) has announced its quantum computing roadmap, with plans to increase the capabilities of its systems tenfold.

In a blog post, IBM says it will adopt a 10-year plan, underscored by rapid quantum innovation, to realize its mission of practical use cases for the emerging technology. With a target of 2033, the roadmap unveils multiple generations of processors, each building on the technical achievements of its predecessors.

The roadmap comes on the heels of the launch of the IBM Condor, a 1,121-qubit quantum processor leveraging IBM's proprietary cross-resonance gate technology. Condor's release has been described as an innovation milestone, as it marks the first time IBM has broken the 1,000-qubit barrier.

IBM says it will proceed with the mainstream rollout of Heron, its highest-performing quantum processor, which will be the foundation of the hardware roadmap over the decade.

The roadmap lists several processors to be rolled out in the coming years, targeting 2029 as an inflection point in its quantum computing ambitions. IBM predicts a watershed moment in 2029 via its Starling processor, which can execute 100 million gates, a huge leap from Heron's 5,000 gates.

By the end of the 10-year roadmap, IBM says it will be able to execute 1 billion gates, a nine-order-of-magnitude increase since rolling out its first device back in 2016.

"Then, in 2029, we hit an inflection point: executing 100 million gates over 200 qubits with our Starling processor employing error correction based on the novel Gross code," read the blog post. "This is followed by Blue Jay, a system capable of executing 1 billion gates across 2,000 qubits by 2033."

Rather than focusing all its efforts on innovation, IBM says it will update its offering for utility, providing users with a Qiskit Runtime service to power experiments. The company confirmed a similar upgrade for IBM Quantum Safe and an integration with watsonx for generative AI to push the frontiers of adoption.

"Entering the era of utility opens up new opportunities for enterprises to engage with quantum computing and explore workforce integration," said IBM. "We are expanding our enterprise offerings to continue to advance industry use cases for utility-scale quantum computing."

A worrying trend for the US and China

Despite the U.S. and China taking the lead in quantum computing and other emerging technologies, pundits have pointed to a growing innovation trend in other emerging jurisdictions. In late November, IBM installed the first utility-scale quantum system outside North America at the University of Tokyo, Japan.

China faces a dilemma after Alibaba shut down its quantum computing unit to focus on AI, putting a dent in its plans to become an industry leader.

Experts say the chip embargo placed on China by the U.S. contributed to the shuttering of Alibaba's (NASDAQ: BABA) quantum research arm, with the company pledging to donate its lab equipment to Zhejiang University.


See original here:
IBM quantum roadmap targets inflection point by 2029 - CoinGeek

Scientists created the first programmable, logical quantum processor – Tech Explorist

The primary challenge for practical quantum computing is error suppression, necessitating quantum error correction for extensive processing. However, implementing error-corrected logical qubits, where information is redundantly encoded across multiple physical qubits, presents significant challenges for achieving large-scale logical quantum computing.

A new study by Harvard scientists reports realizing a programmable quantum processor based on encoded logical qubits operating with up to 280 physical qubits. This is a critical milestone in the quest for stable, scalable quantum computing.

This new quantum processor can encode up to 48 logical qubits and execute hundreds of logical gate operations, a vast improvement over prior efforts. This system marks the initial showcase of running large-scale algorithms on an error-corrected quantum computer, signaling the arrival of early fault-tolerant quantum computation that operates reliably without interruption.

Denise Caldwell of the National Science Foundation said, "This breakthrough is a tour de force of quantum engineering and design. The team has not only accelerated the development of quantum information processing by using neutral atoms but opened a new door to explorations of large-scale logical qubit devices, which could enable transformative benefits for science and society as a whole."

A quantum bit, or qubit, is one unit of information in quantum computing. In the world of quantum computing, it is in principle possible to create physical qubits by manipulating quantum particles, be they atoms, ions, or photons.

Harnessing the peculiarities of quantum mechanics for computation is more intricate than merely accumulating a sufficient number of qubits. Qubits are inherently unstable and susceptible to collapsing out of their quantum states.

The true measure of success lies in logical qubits, the "coins of the realm." These are bundles of redundant, error-corrected physical qubits capable of storing information for quantum algorithms. Creating controllable logical qubits, akin to classical bits, poses a significant challenge for the field. It is widely acknowledged that until quantum computers can operate reliably on logical qubits, the technology cannot truly advance.

Current computing systems have demonstrated only one or two logical qubits, and a single quantum gate operation, a unit of code, between them.

The breakthrough by the Harvard team is built upon years of research on a quantum computing architecture called a neutral atom array, pioneered in Lukin's lab. QuEra, a company commercializing this technology, recently entered into a licensing agreement with Harvard's Office of Technology Development for a patent portfolio based on Lukin's group's innovations.

A block of ultra-cold, suspended rubidium atoms is at the heart of the system. These atoms, serving as the system's physical qubits, can move around and form pairs or become entangled during computations.

Entangled pairs of atoms come together to form gates, representing units of computing power. The team had previously showcased low error rates in their entangling operations, establishing the reliability of their neutral atom array system.

In their logical quantum processor, the scientists have now demonstrated parallel, multiplexed control over an entire section of logical qubits using lasers. This approach is more efficient and scalable compared to individually controlling physical qubits.

Paper first author Dolev Bluvstein, a Griffin School of Arts and Sciences Ph.D. student in Lukin's lab, said, "We are trying to mark a transition in the field, toward starting to test algorithms with error-corrected qubits instead of physical ones, and enabling a path toward larger devices."

See the article here:
Scientists created the first programmable, logical quantum processor - Tech Explorist

IBM unveils next-gen 133-qubit Heron quantum processor and its first modular quantum computer – SiliconANGLE News

IBM Corp. today announced the launch of its newest quantum processor, Heron, featuring 133 qubits of computing power, which will serve as the foundation for a new series of processors capable of providing practical utility for science and research.

Alongside the new processor, IBM unveiled the Quantum System Two, the company's first modular quantum computer, powered by Heron, during Quantum Summit 2023, the company's annual quantum computing conference.

The technology giant also announced Condor, a 1,121-qubit processor that is part of IBM's focus on long-term research into developing large-scale quantum computing efforts. In a press briefing, Matthias Steffen, chief quantum architect and IBM fellow, said the device packed 50% more qubit density, with over a mile of flex cabling. The efforts in building the device, he said, unlocked the road to scaling.

Although the processor has a massive number of qubits, Steffen said it has comparable performance to the 433-qubit Osprey device that debuted in 2022. This is because simply stacking qubits doesn't make a processor faster or more powerful; architectural changes are needed. According to Steffen, what IBM learned from Condor, and its previous Eagle quantum processor, paved the way for the tunable architecture breakthrough of the Heron processor.

"Heron is our best-performing quantum processor to date, with up to a five-fold improvement in error reduction compared to our flagship Eagle device," said Steffen. "This was a journey that was four years in the making. It was designed for modularity and scale."

In 2021, IBM debuted the Eagle quantum processor featuring 127 qubits, the first processor to break 100 qubits. Earlier this year, the company demonstrated that quantum processors can serve as practical utility platforms for scientific research, solving chemistry, physics, and materials problems beyond brute-force classical simulation of quantum mechanics. This opened up a variety of new use cases for researchers.

Since that demonstration, researchers and scientists at numerous organizations including the U.S. Department of Energy, the University of Tokyo, Q-CTRL and the University of Cologne have expanded their use of quantum computing to solve bigger and harder real-world problems such as drug discovery and tuning materials science.

"We are firmly within the era in which quantum computers are being used as a tool to explore new frontiers of science," said Dario Gil, IBM senior vice president and director of research. "As we continue to advance how quantum systems can scale and deliver value through modular architectures, we will further increase the quality of a utility-scale quantum technology stack."

The IBM Quantum System Two will become the foundation for IBM's next-generation quantum computing system architecture, powered by three Heron quantum processors. As a unit, it combines a scalable cryogenic refrigeration infrastructure and classical servers with modular qubit control electronics. As a result, it will be able to expand to meet future needs, and IBM plans to use the system to house future generations of quantum processors.

The first Quantum System Two is housed in a facility in Yorktown Heights, New York.

To assist with enabling the use of quantum computing for developers, IBM announced that Qiskit will hit version 1.0 in February. Qiskit is an open-source software development toolkit for quantum computing that includes tools for writing and manipulating quantum programs and running them on the IBM Quantum Platform or a simulator.
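For readers unfamiliar with Qiskit, the sketch below shows the kind of program it expresses: a two-qubit circuit that prepares an entangled Bell state. This is our own minimal illustration, not an official Qiskit Patterns example; on real IBM hardware the circuit would be submitted through the Qiskit Runtime service rather than merely drawn locally.

```python
# Minimal Qiskit sketch: build and display a two-qubit Bell-state circuit.
from qiskit import QuantumCircuit

qc = QuantumCircuit(2, 2)
qc.h(0)                     # put qubit 0 into an equal superposition
qc.cx(0, 1)                 # entangle qubit 1 with qubit 0
qc.measure([0, 1], [0, 1])  # measure both qubits into classical bits

print(qc.draw())            # ASCII diagram of the circuit
```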

Aimed at making it easier for developers and engineers to work with quantum computing, IBM announced Qiskit Patterns, a way for quantum developers to easily create code. It is a set of tools that will allow them to map classical problems, optimize quantum circuits using Qiskit Runtime, and then process the results.

"With Qiskit Patterns and Quantum Serverless you can build, deploy, run, and, in the future, share for other users to use," said Jay Gambetta, vice president of IBM Quantum.

Additionally, in a demonstration, Gambetta revealed that quantum developers will be able to use generative artificial intelligence powered by watsonx to make quantum circuits. Using this tool, a user would only need to write out a description of the quantum problem they want to solve, and a foundation model named Granite, trained with Qiskit data, would do the heavy lifting for them.

"We really see the full power of generative AI to simplify the developer experience," said Gambetta.

Read more:
IBM unveils next-gen 133-qubit Heron quantum processor and its first modular quantum computer - SiliconANGLE News

IBM Is Planning to Build Its First Fault-Tolerant Quantum Computer by 2029 – Singularity Hub

This week, IBM announced a pair of shiny new quantum computers.

The company's Condor processor is the first quantum chip of its kind with over 1,000 qubits, a feat that would have made big headlines just a few years ago. But earlier this year, a startup, Atom Computing, unveiled a 1,180-qubit quantum computer using a different approach. And although IBM says Condor demonstrates it can reliably produce high-quality qubits at scale, it'll likely be the largest single chip the company makes until sometime next decade.

Instead of growing the number of qubits crammed onto each chip, IBM will focus on getting the most out of the qubits it has. In this respect, the second chip announced, Heron, is the future.

Though Heron has fewer qubits than Condor, just 133, it's significantly faster and less error-prone. The company plans to combine several of these smaller chips into increasingly powerful systems, a bit like the multicore processors powering smartphones. The first of these, System Two, also announced this week, contains three linked Heron chips.

IBM also updated its quantum roadmap, a timeline of key engineering milestones, through 2033. Notably, the company is aiming to complete a fault-tolerant quantum computer by 2029. The machine won't be large enough to run complex quantum algorithms, like the one expected to one day break standard encryption. Still, it's a bold promise.

Practical quantum computers will be able to tackle problems that cant be solved using classical computers. But todays systems are far too small and error-ridden to realize that dream. To get there, engineers are working on a solution called error-correction.

A qubit is the fundamental unit of a quantum computer. In your laptop, the basic unit of information is a 1 or 0, represented by a transistor that's either on or off. In a quantum computer, the unit of information is 1, 0, or, thanks to quantum weirdness, some combination of the two. The physical component can be an atom, electron, or tiny superconducting loop of wire.
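A quick numerical illustration of that "combination of the two" (our own sketch, not from the article): a qubit's state can be written as a two-component vector of amplitudes, and measuring it yields 0 or 1 with probabilities given by the squared magnitudes of those amplitudes.

```python
import numpy as np

# A single qubit as a normalized state vector |psi> = alpha|0> + beta|1>.
# Here alpha = beta = 1/sqrt(2): an equal superposition of 0 and 1.
alpha = beta = 1 / np.sqrt(2)
state = np.array([alpha, beta], dtype=complex)

probs = np.abs(state) ** 2  # Born rule: P(0) = |alpha|^2, P(1) = |beta|^2
print(probs)                # [0.5 0.5]

rng = np.random.default_rng(seed=1)
print(rng.choice([0, 1], size=10, p=probs))  # simulated measurement outcomes
```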

Opting for the latter, IBM makes its quantum computers by cooling loops of wire, or transmons, to temperatures near absolute zero and placing them into quantum states. Here's the problem: qubits are incredibly fragile, easily falling out of these quantum states throughout a calculation. This introduces errors that make today's machines unreliable.

One way to solve this problem is to minimize errors. IBM's made progress here. Heron uses some new hardware to significantly speed up how quickly the system places pairs of qubits into quantum states, an operation known as a gate, limiting the number of errors that crop up and spread to neighboring qubits (researchers call this crosstalk).

"It's a beautiful device," Gambetta told Ars Technica. "It's five times better than the previous devices, the errors are way less, [and] crosstalk can't really be measured."

But you cant totally eliminate errors. In the future, redundancy will also be key.

By spreading information between a group of qubits, you can reduce the impact of any one error and also check for and correct errors in the group. Because it takes multiple physical qubits to form one of these error-corrected logical qubits, you need an awful lot of them to complete useful calculations. This is why scale matters.
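The simplest way to see why this redundancy suppresses errors is the classical three-bit repetition code. The sketch below is our own illustration, far simpler than the codes real quantum hardware uses: it encodes one logical bit into three physical bits and recovers it by majority vote, turning a physical error rate p into a logical error rate of roughly 3p².

```python
import random

def encode(bit: int) -> list[int]:
    """Encode one logical bit redundantly into three physical bits."""
    return [bit, bit, bit]

def add_noise(bits: list[int], p: float) -> list[int]:
    """Flip each physical bit independently with probability p."""
    return [b ^ (random.random() < p) for b in bits]

def decode(bits: list[int]) -> int:
    """Majority vote: a single flipped bit is out-voted by the other two."""
    return int(sum(bits) >= 2)

random.seed(0)
trials, p = 100_000, 0.05
errors = sum(decode(add_noise(encode(0), p)) for _ in range(trials))
# Physical error rate 5% -> logical error rate ~ 3p^2 = 0.75%.
print(errors / trials)
```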

Software can also help. IBM is already employing a technique called error mitigation, announced earlier this year, in which it simulates likely errors and subtracts them from calculations. They've also identified a method of error-correction that reduces the number of physical qubits in a logical qubit by nearly an order of magnitude. But all this will require advanced forms of connectivity between qubits, which could be the biggest challenge ahead.
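One widely used error-mitigation idea is zero-noise extrapolation: run the same circuit at deliberately amplified noise levels, then extrapolate the measured expectation value back to the zero-noise limit. The numbers in this sketch are invented stand-ins for real measurements; it illustrates only the extrapolation step, not IBM's production implementation.

```python
import numpy as np

# Invented expectation values for one circuit measured at noise
# amplification factors 1x, 2x, and 3x (e.g., produced by gate folding).
noise_scale = np.array([1.0, 2.0, 3.0])
measured = np.array([0.82, 0.67, 0.55])  # signal decays as noise grows

# Fit a low-order polynomial and evaluate it at zero noise.
coeffs = np.polyfit(noise_scale, measured, deg=2)
mitigated = np.polyval(coeffs, 0.0)
print(f"zero-noise estimate: {mitigated:.3f}")  # ~1.000 for these numbers
```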

"You're going to have to tie them together," Dario Gil, senior vice president and director of research at IBM, told Reuters. "You're going to have to do many of these things together to be practical about it. Because if not, it's just a paper exercise."

Something that makes IBM unique in the industry is that it publishes a roadmap looking a decade into the future.

This may seem risky, but to date, they've stuck to it. Alongside the Condor and Heron news, IBM also posted an updated version of its roadmap.

Next year, they'll release an upgraded version of Heron capable of 5,000 gate operations. After Heron comes Flamingo. They'll link seven of these Flamingo chips into a single system with over 1,000 qubits. They also plan to grow Flamingo's gate count by roughly 50 percent a year until it hits 15,000 in 2028. In parallel, the company will work on error-correction, beginning with memory, then moving on to communication and gates.

All this will culminate in a 200-qubit, fault-tolerant chip called Starling in 2029 and a leap in gate operations to 100 million. Starling will give way to the bigger Blue Jay in 2033.

Though it may be the most open about them, IBM isn't alone in its ambitions.

Google is pursuing the same type of quantum computer and has been focused on error-correction over scaling for a few years. Then there are other kinds of quantum computers entirely: some use charged ions as qubits, while others use photons, electrons, or, like Atom Computing, neutral atoms. Each approach has its tradeoffs.

"When it comes down to it, there's a simple set of metrics for you to compare the performance of the quantum processors," Jerry Chow, director of quantum systems at IBM, told The Verge. "It's scale: what number of qubits can you get to and build reliably? Quality: how long do those qubits live for you to perform operations and calculations on? And speed: how quickly can you actually run executions and problems through these quantum processors?"

Atom Computing favors neutral atoms because they're identical, eliminating the possibility of manufacturing flaws, can be controlled wirelessly, and operate at room temperature. Chow agrees there are interesting things happening in the neutral atom space, but speed is a drawback. "It comes down to that speed," he said. "Anytime you have these actual atomic items, either an ion or an atom, your clock rates end up hurting you."

The truth is the race isn't yet won, and won't be for a while yet. New advances or unforeseen challenges could rework the landscape. But Chow said the company's confidence in its approach is what allows it to look ahead 10 years.

"And to me it's more that there are going to be innovations within that are going to continue to compound over those 10 years, that might make it even more attractive as time goes on. And that's just the nature of technology," he said.

Image Credit: IBM

Originally posted here:
IBM Is Planning to Build Its First Fault-Tolerant Quantum Computer by 2029 - Singularity Hub

Frontiers in Quantum Computing: 3 Stocks Leading the Way – InvestorPlace

You can't miss out on these quantum computing picks

Quantum computing stocks should be on your radar. The vast potential of quantum technologies means we'll likely witness dramatic progress in AI, IoT, and clean energy technologies. These computers will give us the needed horsepower, but the tech is presently in a competitive research and development environment.

Regardless of the speculative nature of quantum computing stocks, we can already observe leaders. These companies are heading the pack in pioneering this new standard for the computing industry.

So, to get to know the three quantum computing stocks leading us forward, let's explore your best options.

Source: shutterstock.com/LCV

IBM (NYSE:IBM) warrants attention foremost.

Most recently, the company installed a 127-qubit quantum processor in its IBM Quantum System One machine at the University of Tokyo, Japan.

This significant development is not only one of the first quantum computers in East Asia; it also challenges other regions for market dominance. With the field typically led by Europe and North America, this sets the stage for Asia to emerge as a pivotal player, and it may carry critical competitive considerations for companies like IBM.

IBM's processor is expected to conduct high-level research in various fields, ranging from finance to medicine to modeling complex biological processes.

Besides this recent development that should give quantum bulls a reason to smile, IBM is also undervalued on several key metrics. It effectively balances strong cash generation with a dividend yield of 4.14% and a price/earnings-to-growth (PEG) ratio of 0.43.

Source: IgorGolovniov / Shutterstock.com

Alphabet's (NASDAQ:GOOG, NASDAQ:GOOGL) position in the quantum computing market is also formidable. The company made significant headway in February by reporting that it had reduced computational errors in its quantum bits. Reducing these errors is crucial to making quantum computers usable and a key barrier to commercialization.

Complementing Alphabet's goal of commercializing its quantum system this year are its impressive financials. Like IBM, its PEG ratio, at 1.26, indicates expected growth at a reasonable price. Furthermore, it has retained robust top and bottom lines, with revenue of $297.13 billion and net income of $66.73 billion.

Also, Wall Street's stance on Alphabet remains bullish. It carries a strong buy recommendation. Further, analysts predict an average 12-month price increase of 7.32%, with a high target of $180.

Source: Ascannio / Shutterstock.com

Microsoft (NASDAQ:MSFT) is building an ecosystem to support its quantum computing services with its Q# development suite. Also, it onboards developers early to test its code and tools.

Therefore, the development of MSFT's community is one of the key reasons to be bullish on the stock. Q# is striving to become the de facto standard, similar to the way certain programming languages once fought for dominance amongst the development community. Today, we are left with a handful of the most popular.

Further, MSFT is taking a calculated gamble on its development of quantum technology. It's investing heavily in research and developing novel ways to improve error correction and fault tolerance. This approach is riskier, but if it pays off, it could give MSFT one of the most stable quantum computing systems on the market upon release, if not the most stable, giving it a significant advantage over its peers.

On the date of publication, Matthew Farley did not have (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.

Matthew started writing coverage of the financial markets during the crypto boom of 2017 and was also a team member of several fintech startups. He then started writing about Australian and U.S. equities for various publications. His work has appeared in MarketBeat, FXStreet, Cryptoslate, Seeking Alpha, and the New Scientist magazine, among others.

More here:
Frontiers in Quantum Computing: 3 Stocks Leading the Way - InvestorPlace

5 minutes with: Dr. Juan Bernabe Moreno, IBM – Technology Magazine

How do you see AI and new technologies accelerating sustainability?

We have very tangible examples when we talk about sustainability. At least speaking for myself, we struggle in terms of understanding what sustainability is and how we can make it actionable. How can we track whether promises are kept? Can we measure what the Kenyan government is doing in terms of reforestation over time, for example?

The geospatial foundation model we have created [at IBM] is helping us quantify climate-mitigating actions like reforestation, but also helping us understand how particular measures, like putting up a fence, can help. It's very encouraging because, not only can you see masses of trees growing, you can also quantify how many gigatons of carbon you can capture over the years.

So you make it tangible. That's probably one of my favourite aspects of what technology can do for sustainability.

As a computer scientist, there are very rare moments where you see history happening. In quantum this year, we have managed to achieve one, which we call quantum utility. We have entered the quantum utility era.

Quantum utility is when you take a problem, in this case a small magnetisation problem, and we tasked one of our partners, the University of California, Berkeley, to do their best classically, and we took the same problem, mapped it to a quantum computer with our hardware today, and applied some error mitigation routines that we have created on top of our stack. These error mitigation routines are now available to everyone.

We were then in a position of showing better performance than the classical approach. So for the first time, we see, for real, quantum utility beating classical methods in this particular experiment.

When we talk about quantum, we always talk about fault tolerance - having the perfect system with no computing errors. What we are doing now is trying to find, with our partners, more and more examples of this quantum utility - much broader and bigger examples showing that the current quantum hardware is improving. Our operation routines can get us there.

First of all, how can we change our approach to building the hardware? Because we saw it classically, right? We started with bigger and bigger machines until we discovered that we needed to go modular.

What we are doing now is working on modularity for quantum processing - but modularity means that you need to establish connectivity between the units. So we first started looking at classical links, but in the future we will also see quantum communications happening between the units, which is quite challenging. There's a bit of research behind it; from the hardware perspective, that's probably one of my personal highlights.

Another highlight, hopefully, is that we can announce that we keep firmly implementing every milestone we set ourselves in our roadmap.

You will see many companies working with [IBM] and many partners presenting quantum utility experiments already. That's going to be very refreshing - it's going to create a lot of momentum when more and more people see that, in this particular case, quantum beats classical. So that's going to create a good vibe in the quantum community.

There is so much happening at the same time and at such speed.


See the original post here:
5 minutes with: Dr. Juan Bernabe Moreno, IBM - Technology Magazine

IBM says it will have hit a quantum computing ‘inflection point’ by 2029 – Cointelegraph

IBM announced the unveiling of its 1,121-qubit Condor quantum computing processor on Dec. 4. This is the company's largest by qubit count and, arguably, the world's most advanced gate-based, superconducting quantum system.

Alongside the new chip, IBM delivered an updated roadmap and a trove of information on the company's planned endeavors in the quantum computing space.

The 1,121-qubit processor represents the apex of IBM's previous roadmap. It's preceded by 2022's 433-qubit Osprey processor and by 2021's 127-qubit Eagle processor.

In quantum computing terms, qubit count isn't necessarily a measure of power or capability so much as it is of potential. While more qubits should theoretically lead to more capable systems eventually, the industry's current focus is on error correction and fault tolerance.

Currently, IBM considers its experiments with 100-qubit systems to be the status quo, with much of the current work focused on increasing the number of quantum gates processors can function with.

"For the first time," writes IBM fellow and vice president of quantum computing Jay Gambetta in a recent blog post, "we have hardware and software capable of executing quantum circuits with no known a priori answer at a scale of 100 qubits and 3,000 gates."

Gates, like qubits, are a potential measure of the usefulness of a quantum system. The more gates a processor can implement, the more complex the functions the system can perform. According to IBM, at the 3,000-gate scale, its 100-qubit quantum systems are now computational tools.

The next major inflection point, per the blog post, will occur in 2029, when IBM will execute 100 million gates over 200 qubits with a processor it's calling Starling.

"This is followed," writes Gambetta, "by Blue Jay, a system capable of executing 1 billion gates across 2,000 qubits by 2033."

Related: IBM brings utility-scale quantum computing to Japan as China and Europe struggle to compete

See the original post:
IBM says it will have hit a quantum computing 'inflection point' by 2029 - Cointelegraph

The Threat of Quantum Computing – Solutions Review

Solutions Review's Contributed Content Series is a collection of contributed articles written by thought leaders in enterprise software categories. Zibby Kwecka of Quorum Cyber examines the current and future states of quantum computing, and the inevitable threat of a quantum attack.

The threat of quantum computing is very real, today. As of July 2022, 25 percent of Bitcoin and 66 percent of Ether were vulnerable to quantum attacks (Deloitte, 2023). These can be secured with action; however, even if a small number of these currencies are stolen, the market disruption may significantly devalue assets. Quantum computers have the potential to solve certain complex mathematical problems significantly faster than classical computers. One of the most notable implications is their ability to break encryption algorithms that rely on the difficulty of factoring large numbers or solving discrete logarithm problems. There are theoretical methods to crack our current encryption that would be possible on a conventional computer, but they are wildly inefficient. Quantum computers will allow the cracking of keys thousands of times more efficiently, making it possible to break today's encryption in just a few cycles. Thankfully, for now, scale remains a problem for quantum computing.
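To see why efficient factoring breaks RSA-style encryption, consider the toy sketch below (our own illustration, with deliberately tiny numbers). Brute-force trial division stands in for Shor's algorithm: once the public modulus is factored, an attacker recomputes the private key and decrypts at will.

```python
# Toy RSA broken by factoring. Requires Python 3.9+ (pow(e, -1, m)).

p, q = 61, 53                      # secret primes (tiny, for illustration only)
n = p * q                          # public modulus
e = 17                             # public exponent
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent

message = 42
ciphertext = pow(message, e, n)    # anyone can encrypt with (n, e)

# Attacker's view: only (n, e) and the ciphertext. Trial division stands
# in for Shor's algorithm, which would factor realistically sized moduli.
def factor(n: int) -> tuple[int, int]:
    for f in range(2, int(n ** 0.5) + 1):
        if n % f == 0:
            return f, n // f
    raise ValueError("no factor found")

p2, q2 = factor(n)
d2 = pow(e, -1, (p2 - 1) * (q2 - 1))  # private key recovered from the factors
assert pow(ciphertext, d2, n) == message
print("plaintext recovered:", pow(ciphertext, d2, n))
```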

Once quantum computers become a commercially available and mature tool, it's expected that attackers will take advantage of them to break current encryption methods, creating a significant risk to the security of our sensitive data. Using this technology as a platform for an attack is a concern for organizations, and not just on the cryptography front. The threat of quantum computing becoming part of an actor's offensive toolbox is likely. Taking advantage of decryption techniques, certificate forgery, or its potential for rapid machine learning could vastly speed up network reconnaissance, eavesdropping, and identity forgery.

Just because quantum computing isn't here yet doesn't mean we shouldn't be aware of the risk. Data may already have been stolen, or harvested, for later yield. While it may not be currently feasible to decrypt your data, once it becomes a viable and affordable measure through quantum computing, harvested data and communication traffic could be decrypted. This may be assisted by projects from Microsoft and IBM aiming to offer cloud-based multi-quantum computing facilities on a consumption model.

The National Institute of Standards and Technology (NIST) has been calling for the development of encryption methods that would remain resistant to the advantages of quantum computing, with the first four quantum-resistant cryptographic algorithms announced back in 2022 (NIST, 2022). There is a future in which quantum computers vastly improve our digital security, but there's a risk of being in a very dangerous limbo between the threats posed and that future of greater security. Currently, there are several limitations preventing development at scale, which may take years to overcome.

The most likely quantum attack would involve breaking the cryptographic systems of communication methods we use today. This isn't just a future problem, however; it's happening already. The widely known "Harvest Now, Decrypt Later" operations store stolen information that will later be decrypted using advanced technology. This might be years away, but depending on the sensitivity of the information, it could still enable extortion against organizations or individuals. It's a compelling argument for encouraging businesses to purge old data that's no longer required.

Future cyber-attacks will involve hybrid approaches that combine classical and quantum computing techniques. Quantum computers are great at operating in parallel states, and thus it would be natural to apply them to fuzzing systems and finding vulnerabilities. The added fuzzing ability of quantum computers could drastically speed up attacks aiming to penetrate a system. Fuzzing tests programs using numerous randomized inputs and could be a perfect use for quantum machines.

Current RSA encryption relies on 2048-bit numbers. In 2019, quantum computers were only able to factor a 6-bit number. In 2022, that number only increased to 48 bits, under a highly specialized environment (Swayne, 2022). There is an expectation that within the next 10 years we could be at a point where current encryption methods are at risk; the current development is exponential (Deloitte, 2023). A recent mandate from the US Congress declares a 2035 deadline for quantum-resistant cryptography to be implemented (Executive Office of The President, 2022), but it could be sooner.

The exponential development of artificial intelligence (AI) underway may, at some stage, support scientists in solving some of the challenges currently faced. For a quantum computer to undertake a task, the problem statement must first be translated into a format a quantum computer can actually work with. This is a laborious task, and hence, apart from the high hardware cost of entry to quantum computing attacks, there is an even higher ongoing cost associated with translating targeted problem statements into something that can be tested. This is why cryptographic use cases are currently prevalent when quantum is discussed: they are repetitive, as we only use a handful of cryptographic algorithms to secure the digital world. However, AI will one day enable us to rapidly translate human-readable problem statements, and software to be tested, into code that can be processed by a quantum computer, and this is when the full capabilities of this technology will be reached.

There are several actions that organizations should consider.

To start using quantum machines to solve real-world problems, we feasibly need a machine capable of at least 1 million stable qubits (Microsoft, 2023). Currently, the qubits in existence suffer at scale for several reasons, one of which is quantum decoherence, which makes each qubit usable for only a short period of time. As far as research goes, we've only just reached over 100 qubits (Ball, 2021). Until these challenges are overcome, the use of quantum computing is limited.

The rest is here:
The Threat of Quantum Computing - Solutions Review

Harvard, QuEra, MIT, and the NIST/University of Maryland Usher in New Era of Quantum Computing by Performing … – GlobeNewswire

BOSTON, Dec. 06, 2023 (GLOBE NEWSWIRE) -- QuEra Computing, the leader in neutral-atom quantum computers, today announced a significant breakthrough published in the scientific journal Nature. In experiments led by Harvard University in close collaboration with QuEra Computing, MIT, and NIST/UMD, researchers successfully executed large-scale algorithms on an error-corrected quantum computer with 48 logical qubits and hundreds of entangling logical operations. This advancement, a significant leap in quantum computing, sets the stage for developing truly scalable and fault-tolerant quantum computers that could solve practical classically intractable problems. The complete paper can be accessed on Nature at https://www.nature.com/articles/s41586-023-06927-3.

"We at Moodys Analytics recognize the monumental significance of achieving 48 logical qubits in a fault-tolerant quantum computing environment and its potential to revolutionize data analytics and financial simulations, said Sergio Gago, Managing Director of Quantum and AI at Moodys Analytics, This brings us closer to a future where quantum computing is not just an experimental endeavor but a practical tool that can deliver real-world solutions for our clients. This pivotal moment could redefine how industries approach complex computational challenges."

A critical challenge preventing quantum computing from reaching its enormous potential is the noise that affects qubits, corrupting computations before they reach the desired results. Quantum error correction overcomes these limitations by creating "logical qubits," groups of physical qubits that are entangled to store information redundantly. This redundancy allows for identifying and correcting errors that may occur during quantum computations. By using logical qubits instead of individual physical qubits, quantum systems can achieve a level of fault tolerance, making them more robust and reliable for complex computations.

"This is a truly exciting time in our field as the fundamental ideas of quantum error correction and fault tolerance are starting to bear fruit," said Mikhail Lukin, the Joshua and Beth Friedman University Professor, co-director of the Harvard Quantum Initiative, and co-founder of QuEra Computing. "This work, leveraging the outstanding recent progress in the neutral-atom quantum computing community, is a testament to the incredible effort of exceptionally talented students and postdocs as well as our remarkable collaborators at QuEra, MIT, and NIST/UMD. Although we are clear-eyed about the challenges ahead, we expect that this new advance will greatly accelerate the progress towards large-scale, useful quantum computers, enabling the next phase of discovery and innovation."

Previous demonstrations of error correction have showcased one, two, or three logical qubits. This new work demonstrates quantum error correction in 48 logical qubits, enhancing computational stability and reliability while addressing the error problem. On the path to large-scale quantum computation, Harvard, QuEra, and their collaborators reported several critical achievements.

The breakthrough utilized an advanced neutral-atom quantum computer, combining hundreds of qubits, high two-qubit gate fidelities, arbitrary connectivity, fully programmable single-qubit rotations, and mid-circuit readout.

The system also included hardware-efficient control in reconfigurable neutral-atom arrays, employing direct, parallel control over an entire group of logical qubits. This parallel control dramatically reduces the control overhead and complexity of performing logical operations. While using as many as 280 physical qubits, researchers needed to program fewer than ten control signals to execute all of the required operations in the study. Other quantum modalities typically require hundreds of control signals for the same number of qubits. As quantum computers scale to many thousands of qubits, efficient control becomes critically important.

"The achievement of 48 logical qubits with high fault tolerance is a watershed moment in the quantum computing industry, said Matt Langione, Partner at the Boston Consulting Group. This breakthrough not only accelerates the timeline for practical quantum applications but also opens up new avenues for solving problems that were previously considered intractable by classical computing methods. It's a game-changer that significantly elevates the commercial viability of quantum computing. Businesses across sectors should take note, as the race to quantum advantage just got a major boost."

"Today marks a historic milestone for QuEra and the broader quantum computing community, said Alex Keesling, CEO, QuEra Computing, These achievements are the culmination of a multi-year effort, led by our Harvard and MIT academic collaborators together with QuEra scientists and engineers, to push the boundaries of what's possible in quantum computing. This isn't just a technological leap; it's a testament to the power of collaboration and investment in pioneering research. We're thrilled to set the stage for a new era of scalable, fault-tolerant quantum computing that can tackle some of the world's most complex problems. The future of quantum is here, and QuEra is proud to be at the forefront of this revolution."

"Our experience in manufacturing and operating quantum computers - such as our first-generation machine, available on a public cloud since 2022 - coupled with this groundbreaking research, puts us in a prime position to lead the quantum revolution," added Keesling.

The work was supported by the Defense Advanced Research Projects Agency through the Optimization with Noisy Intermediate-Scale Quantum devices (ONISQ) program, the National Science Foundation, the Center for Ultracold Atoms (an NSF Physics Frontiers Center), and the Army Research Office.

QuEra also announced a special event on Jan 9th at 11:30 AM ET, where QuEra will reveal its commercial roadmap for fault-tolerant quantum computers. Register for this online event at https://quera.link/roadmap

About QuEra: QuEra Computing is the leader in commercializing quantum computers using neutral atoms, which is widely recognized as a highly promising quantum modality. Based in Boston and built on pioneering research from nearby Harvard University and MIT, QuEra operates the world's largest publicly accessible quantum computer, available over a major public cloud and for on-premises delivery. QuEra is developing large-scale, fault-tolerant quantum computers to tackle classically intractable problems, becoming the partner of choice in the quantum field. Simply put, QuEra is the best way to quantum. For more information, visit us at quera.com and follow us on Twitter or LinkedIn.

Media Contact: Merrill Freund, press@quera.com, +1-415-577-8637

The rest is here:
Harvard, QuEra, MIT, and the NIST/University of Maryland Usher in New Era of Quantum Computing by Performing ... - GlobeNewswire

Taking Flight with Heron and Condor: The Latest Advancements in Quantum Computers – Securities.io

IBM has just announced the latest breakthrough in its mission to make commercialized and practical quantum computers a reality: a 1,000+ qubit processor dubbed "Condor" and an error-correction-focused processor dubbed "Heron."

Quantum computers represent a new approach to machine-based computation. Through the use of qubits capable of superposition and entanglement, quantum computers have the potential to perform faster and more complex calculations than traditional computers built on classical bits. Unlike traditional computing, where bits represent either 0 or 1, qubits in quantum computing can represent both states simultaneously. Importantly, this makes quantum computing complementary to classical computing rather than a replacement; it excels in tasks like molecular simulations and system optimizations, while classical computing is better suited for everyday tasks.

It is because of the types of tasks that quantum computing should excel at that the technology is so vaunted. A computer capable of performing complex calculations orders of magnitude faster than its traditional counterparts is worth developing, as its use cases have the potential to change the world and our understanding of it.

With its announcement, IBM has made significant strides in quantum computing by launching two advanced quantum processors: Heron and Condor.

The Heron processor, featured on the ibm_torino quantum system, represents a leap forward with its 133 fixed-frequency qubits and tunable couplers, delivering a 3-5x improvement in performance compared to the previous 127-qubit Eagle processors. This advancement virtually eliminates "cross-talk" (undesired interaction or interference between qubits) and lays the groundwork for future hardware development. Notably, IBM is already utilizing these chips in its modular-architecture Quantum System Two computing platform.

On the other hand, the Condor processor, a 1,121-qubit superconducting quantum processor, is an equally notable innovation. It increases qubit density by 50%, incorporates advancements in qubit fabrication, and integrates over a mile of high-density cryogenic wiring within a single dilution refrigerator (a tool used to achieve extremely low temperatures, typically close to absolute zero). Condor's performance is comparable to the company's earlier 433-qubit Osprey processor, marking a significant milestone in scaling and informing future hardware design in quantum computing.

These developments by IBM are pivotal in pushing the boundaries of quantum utility and advancing toward quantum-centric supercomputing.

As previously mentioned, quantum computers are so vaunted due to their potential to greatly advance our understanding of just about every field of science. The following are just a few examples.

Medicine: In medicine, quantum computing could revolutionize drug discovery by simulating the behavior of molecules at a quantum level. This allows for more accurate predictions of how potential drugs might interact with the human body, speeding up the development of new medications and reducing costs.

Meteorology: For meteorology, quantum computers could analyze vast amounts of weather data more efficiently than classical computers. This would lead to more accurate weather predictions and better understanding of climate change, helping to mitigate natural disasters and plan agricultural strategies.

Complex Problem Solving: Quantum computing could tackle problems that are currently unsolvable by classical computers, such as optimizing large systems for logistics and supply chains, or solving intricate mathematical problems. This has broad implications for various sectors, including transportation, energy, and finance.
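To see why such optimization problems strain classical machines, consider how fast the search space grows; the figures below are a generic illustration, not drawn from any particular logistics system:

```python
# Illustrative sketch: why large combinatorial problems overwhelm brute force.
# Each binary decision variable doubles the number of candidate solutions.
for n in (10, 50, 300):
    print(f"{n} decision variables -> {2**n:.2e} candidate assignments")

# At 300 variables the count (~2e+90) already dwarfs the number of atoms in
# the observable universe, which is why better algorithms, classical or
# quantum, matter so much for logistics and supply chains.
```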

It is also important to recognize that we cannot predict what we cannot yet imagine: there are bound to be scores of unexpected advances made possible by the capabilities this technology one day provides.

"Quantum computing is the future of computing. It will open up new possibilities for scientific discovery and technological advancement that we can't even imagine today." Arvind Krishna, Chairman and CEO of IBM, in an interview with CNBC

With quantum computers representing such a monumental technological achievement, it should come as no surprise that there have been, and remain, significant hurdles and limitations that must be overcome in time. For example, quantum computing currently faces challenges in error correction, scalability, and developing practical algorithms.
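To give a flavor of what error correction involves, here is a sketch of the textbook three-qubit bit-flip code (illustrative only; real processors such as Heron target far more sophisticated codes). A logical state is spread redundantly across three physical qubits, a deliberate bit-flip error is injected, and a majority vote recovers the original state:

```python
# Textbook three-qubit bit-flip code (a sketch, not IBM's actual scheme).
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector, partial_trace, state_fidelity

qc = QuantumCircuit(3)
qc.h(0)          # prepare the logical state (|0> + |1>)/sqrt(2) on qubit 0
qc.cx(0, 1)      # encode it redundantly across qubits 1 and 2
qc.cx(0, 2)
qc.x(0)          # inject a single bit-flip error
qc.cx(0, 1)      # decode: qubits 1 and 2 pick up the error syndrome
qc.cx(0, 2)
qc.ccx(1, 2, 0)  # majority vote: undo the flip if both syndrome bits fired

logical = partial_trace(Statevector.from_instruction(qc), [1, 2])
ideal = Statevector.from_label("+")    # the state we started from
print(state_fidelity(logical, ideal))  # 1.0: the logical qubit survived
```

The catch, and one reason scaling is hard, is that every physical qubit added for redundancy is itself noisy, so such codes only pay off once hardware error rates fall below a threshold.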

In time, there are bound to be other hurdles that pop up which were previously unexpected, given our rudimentary but growing understanding of quantum mechanics. The complexity and potential of quantum physics were emphasized in the following quote.

"If you think you understand quantum mechanics, you don't understand quantum mechanics." Richard Feynman, Nobel laureate in Physics

As it stands, these limitations mean quantum computers are not yet ready for widespread use. With recent advancements, optimistic timelines point to another decade before this is the case.

In past decades, quantum computing seemed to be in such a distant future that courses teaching it were few and far between. Now that a future in which they are actually in use is beginning to come into focus, the need to train the next generation of scientists and engineers who will be responsible for continuing this advancement is only increasing. As a result, many universities are now offering specialized courses and programs in quantum computing to prepare a skilled workforce for this emerging field.

While universities like these may be training the next generation of quantum computing specialists, the following companies are currently paving the road to this future.

IBM has long been a leader in the development of quantum computers. The company aims to democratize quantum computing development through initiatives like Qiskit Patterns. IBM has also expanded its roadmap for achieving large-scale, practical quantum computing, focusing on new modular architectures and networking that could enable quantum systems with hundreds of thousands of qubits, essential for practical quantum applications.

Microsoft's efforts in quantum computing are centered around cloud integration and collaboration. The company has introduced quantum machines with the highest quantum volumes in the industry to Azure Quantum, including partnerships with IonQ, Pasqal, Quantinuum, QCI, and Rigetti. This integration facilitates experimentation and is a step towards scaled quantum computing. Microsoft emphasizes the importance of a global ecosystem to realize the full potential of quantum computing and plans to deliver its quantum machine as a cloud service through Azure, ensuring secure and responsible use of this emerging technology.

Alphabet, through its Google Quantum AI lab, has made significant strides in quantum computing. In 2023, Google scientists announced a major milestone in reducing the rate of errors in quantum computing, a long-standing challenge in the field. Its research, published in the journal Nature, describes a system capable of significantly decreasing the error rate and implementing error-correcting codes that can detect and fix errors without compromising the information. Previously, in 2019, Google claimed to have achieved quantum supremacy with its Sycamore machine, performing a calculation in 200 seconds that would have taken a conventional supercomputer 10,000 years, demonstrating the potential of quantum computing in solving complex problems far beyond the capabilities of traditional computing.

Quantum computing represents a groundbreaking leap in the world of computing, offering the potential to revolutionize a plethora of fields. While IBM's recent advancements with the Heron and Condor quantum processors mark significant progress toward practical quantum computing, the technology continues to face major challenges in error correction, scalability, and algorithm development, highlighting the need for continued research and innovation.

While these challenges remain, quantum computing holds the promise of unlocking possibilities we can't even imagine today, ushering in a new era of scientific discovery and technological advancement. Its full potential is still unfolding, and its impact on various industries and society promises to be profound.

Here is the original post:
Taking Flight with Heron and Condor: The Latest Advancements in Quantum Computers - Securities.io

Riverlane Partners with Infleqtion and Nüvü Camēras to Help Quantum Computers ‘See’ Their Qubits – AZoOptics

A new project will bring together leading UK and Canadian companies to develop the imaging systems to measure qubit states. This is a vital capability for quantum computers to scale.

Quantum computers are based on building blocks called qubits (quantum bits), but they are not yet powerful enough to unlock any real-world applications. To achieve this, the number and quality of qubits must grow, together with the optical and electronic systems needed to perform operations with qubits and read out the results.

Steve Brierley, CEO and Founder at Riverlane, said: "We need to reach the scale where quantum computers can perform roughly a trillion reliable quantum operations, a threshold we call the 'TeraQuop'. Today's quantum computers are only capable of a few hundred error-free operations. This project pushes us closer to this TeraQuop goal, but we cannot do this alone, and this is why collaboration with leaders like Infleqtion and Nüvü Camēras is vital, enabling the continued, long-term growth of quantum computing."
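A quick back-of-envelope calculation shows the size of the gap Brierley describes (the numbers are illustrative readings of the quote, not Riverlane's published figures):

```python
# Rough arithmetic behind the 'TeraQuop' target (illustrative assumptions).
target_ops = 1e12   # a trillion reliable quantum operations
current_ops = 3e2   # "a few hundred" error-free operations today

print(f"error rate implied by the target: ~{1 / target_ops:.0e} per operation")
print(f"error rate implied today:         ~{1 / current_ops:.0e} per operation")
print(f"improvement required:             ~{target_ops / current_ops:.0e}x")
```

Close to ten orders of magnitude separate today's machines from the TeraQuop, which is why error correction and fast, accurate qubit readout dominate the engineering agenda.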

In the Scalable Qubit Array Detection for Rydberg Quantum Computers project, quantum computing companies Infleqtion and Riverlane will collaborate with imaging systems specialist Nüvü Camēras to develop systems to greatly improve the readout of the status of the qubits.

The partnership between Infleqtion, Nüvü Camēras and Riverlane will allow for collaborative development in this area of the quantum computing supply chain, helping Nüvü Camēras to develop cameras targeting the next generation of quantum computers, Riverlane to equip its quantum control systems with advanced readout capabilities, and Infleqtion to validate the necessary hardware control layer.

There are many qubit types. This project focuses on the neutral atom qubits that Infleqtion's quantum computing platform uses. Accurate knowledge of the state of these atoms is crucial for the quantum computer to perform its operations. This requires high detection sensitivity, accurate measurements, and low latency to enable real-time image processing and faster operations.
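As a loose illustration of what that readout involves (a simplified sketch, not the project's actual pipeline): fluorescence imaging yields a photon count for each atom site, and a brightness threshold classifies each qubit's state:

```python
# Simplified sketch of fluorescence-based qubit readout (not the real pipeline).
import numpy as np

rng = np.random.default_rng(seed=7)

# Simulated photon counts for a 4x4 array of atom sites: 'bright' sites
# scatter many photons, 'dark' sites emit only background-level light.
true_states = rng.integers(0, 2, size=(4, 4))     # hidden qubit states
counts = np.where(true_states == 1,
                  rng.poisson(200, size=(4, 4)),  # bright: ~200 photons
                  rng.poisson(20, size=(4, 4)))   # dark: ~20 photons

threshold = 100                                   # sits between the two peaks
measured = (counts > threshold).astype(int)       # classify every site at once
print((measured == true_states).all())            # True when peaks are well separated
```

In practice the hard part is doing this with single-photon sensitivity and low enough latency that the result can feed back into the computation in real time, which is exactly where specialized cameras come in.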

Marie-Eve Ducharme, President and Co-Founder at Nüvü Camēras, said: "We've been pioneering projects in the space sector for over a decade, but demand for our unique imaging capabilities is exploding in the quantum physics field. This project marks a new milestone for Nüvü Camēras and showcases the transformative potential of our technology in accelerating quantum computing advancements. We are grateful for the contribution of the National Research Council of Canada (NRC-IRAP) to enable this work."

Dr Timothy Ballance, President of Infleqtion UK, said: "Neutral atom quantum computing holds great promise for practical quantum computing through the scalability of atomic qubits compared to alternative methodologies. To truly unlock this scalability, we will need to work hand-in-hand with hardware providers and integrators across the quantum stack to ensure that the sub-systems are interoperable. We are thrilled to collaborate with Riverlane and Nüvü Camēras on this exciting project which will advance high-speed detection of large arrays of atomic qubits."

The project is funded jointly by Innovate UK and the NRC-IRAP through the Canada-UK Commercialising Quantum Technology Programme. Innovate UK is investing £4.2 million in 11 projects to strengthen collaborative research and development through Canada-UK partnerships.

Source:https://www.riverlane.com/

Read more here:
Riverlane Partners with Infleqtion and Nv Cameras to Help Quantum Computers 'See' Their Qubits - AZoOptics