What is Quantum Computing? Webopedia Definition


First proposed in the 1970s, quantum computing relies on quantum physics, taking advantage of certain quantum-mechanical properties of atoms or nuclei that allow them to work together as quantum bits, or qubits, which serve as the computer’s processor and memory. By interacting with each other while isolated from the external environment, qubits can perform certain calculations exponentially faster than conventional computers.

Qubits do not rely on the traditional binary nature of computing. While traditional computers encode information into bits using binary numbers, either a 0 or a 1, and can only perform calculations on one set of numbers at once, quantum computers encode information as a series of quantum-mechanical states, such as spin directions of electrons or polarization orientations of a photon. These states might represent a 1 or a 0, a combination of the two, or a value somewhere between 1 and 0: a superposition of many different numbers at once.

A quantum computer can do an arbitrary reversible classical computation on all the numbers simultaneously, which a binary system cannot do, and also has some ability to produce interference between various different numbers. By doing a computation on many different numbers at once, then interfering the results to get a single answer, a quantum computer has the potential to be much more powerful than a classical computer of the same size. In using only a single processing unit, a quantum computer can naturally perform myriad operations in parallel.
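To make the idea of a superposition slightly more concrete, here is a minimal Python sketch (an illustration added here, not part of the Webopedia definition) that represents a single qubit as two complex amplitudes, one for 0 and one for 1; the squared magnitudes give the probabilities of each measurement outcome and must sum to 1. The function name is purely illustrative.

```python
import math

# A qubit's state can be written as amplitude_0 * |0> + amplitude_1 * |1>.
# The squared magnitudes of the amplitudes give the probabilities of
# measuring 0 or 1, so they must add up to 1.

def is_normalized(amplitude_0, amplitude_1, tolerance=1e-9):
    total = abs(amplitude_0) ** 2 + abs(amplitude_1) ** 2
    return abs(total - 1.0) < tolerance

# A classical-looking state: definitely 0.
print(is_normalized(1, 0))        # True

# An equal superposition of 0 and 1, "between" the two classical values.
s = 1 / math.sqrt(2)
print(is_normalized(s, s))        # True
print(abs(s) ** 2, abs(s) ** 2)   # 0.5 chance of each outcome
```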

Quantum computing is not well suited for tasks such as word processing and email, but it is ideal for tasks such as cryptography, and for modeling and indexing very large databases.

Microsoft: Quantum Computing 101


Go here to see the original:

What is Quantum Computing? Webopedia Definition

Quantum computers – WIRED UK


Google, IBM and a handful of startups are racing to create the next generation of supercomputers. Quantum computers, if they ever get started, will help us solve problems, like modelling complex chemical processes, that our existing computers can’t even scratch the surface of.

But the quantum future isn’t going to come easily, and there’s no knowing what it’ll look like when it does arrive. At the moment, companies and researchers are using a handful of different approaches to try and build the most powerful computers the world has ever seen. Here’s everything you need to know about the coming quantum revolution.

Quantum computing takes advantage of the strange ability of subatomic particles to exist in more than one state at any time. Because of the way these tiniest of particles behave, operations can be done much more quickly, and with less energy, than on classical computers.

In classical computing, a bit is a single piece of information that can exist in two states: 1 or 0. Quantum computing uses quantum bits, or ‘qubits’, instead. These are quantum systems with two states. However, unlike a usual bit, they can store much more information than just 1 or 0, because they can exist in any superposition of these values.


“The difference between classical bits and qubits is that we can also prepare qubits in a quantum superposition of 0 and 1 and create nontrivial correlated states of a number of qubits, so-called ‘entangled states’,” says Alexey Fedorov, a physicist at the Moscow Institute of Physics and Technology.

A qubit can be thought of like an imaginary sphere. Whereas a classical bit can be in two states, at either of the two poles of the sphere, a qubit can be any point on the sphere. This means a computer using these bits can store a huge amount more information using less energy than a classical computer.
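The sphere picture can be made precise: any single-qubit state corresponds to a point on what physicists call the Bloch sphere. The sketch below is an illustration added here (not from the WIRED article); it converts a pair of amplitudes into the two angles of that point, assuming the state is normalised.

```python
import cmath
import math

def bloch_angles(amplitude_0, amplitude_1):
    """Return (theta, phi) of a normalised qubit state on the Bloch sphere.

    The state is written as cos(theta/2)|0> + e^(i*phi) * sin(theta/2)|1>,
    so theta = 0 is the "north pole" (a definite 0) and theta = pi is the
    "south pole" (a definite 1); everything in between is a superposition.
    """
    # Remove the unobservable global phase so amplitude_0 is real and >= 0.
    global_phase = cmath.phase(amplitude_0) if amplitude_0 != 0 else 0.0
    a0 = amplitude_0 * cmath.exp(-1j * global_phase)
    a1 = amplitude_1 * cmath.exp(-1j * global_phase)

    theta = 2 * math.acos(min(1.0, abs(a0)))
    phi = cmath.phase(a1) if abs(a1) > 1e-12 else 0.0
    return theta, phi

print(bloch_angles(1, 0))                        # (0.0, 0.0): the "0" pole
print(bloch_angles(0, 1))                        # (pi, 0.0): the "1" pole
print(bloch_angles(1 / 2 ** 0.5, 1 / 2 ** 0.5))  # (pi/2, 0.0): on the equator
```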

Until recently, it seemed like Google was leading the pack when it came to creating a quantum computer that could surpass the abilities of conventional computers. In a Nature article published in March 2017, the search giant set out ambitious plans to commercialise quantum technology in the next five years. Shortly after that, Google said it intended to achieve something it’s calling quantum supremacy with a 49-qubit computer by the end of 2017.

Now, quantum supremacy, which roughly refers to the point where a quantum computer can crunch sums that a conventional computer couldn’t hope to simulate, isn’t exactly a widely accepted term within the quantum community. Those sceptical of Google’s quantum project, or at least the way it talks about quantum computing, argue that supremacy is essentially an arbitrary goal set by Google to make it look like it’s making strides in quantum when really it’s just meeting self-imposed targets.

Whether it’s an arbitrary goal or not, Google was pipped to the supremacy post by IBM in November 2017, when the company announced it had built a 50-qubit quantum computer. Even that, however, was far from stable, as the system could only hold its quantum microstate for 90 microseconds, a record, but far from the times needed to make quantum computing practically viable. Just because IBM has built a 50-qubit system, however, doesn’t necessarily mean they’ve cracked supremacy, and it definitely doesn’t mean that they’ve created a quantum computer that is anywhere near ready for practical use.

Where IBM has gone further than Google, however, is making quantum computers commercially available. Since 2016, it has offered researchers the chance to run experiments on a five-qubit quantum computer via the cloud and at the end of 2017 started making its 20-qubit system available online too.

But quantum computing is by no means a two-horse race. Californian startup Rigetti is focusing on the stability of its own systems rather than just the number of qubits, and it could be the first to build a quantum computer that people can actually use. D-Wave, a company based in Vancouver, Canada, has already created what it is calling a 2,000-qubit system, although many researchers don’t consider the D-Wave systems to be true quantum computers. Intel, too, has skin in the game. In February 2018 the company announced that it had found a way of fabricating quantum chips from silicon, which would make it much easier to produce chips using existing manufacturing methods.

Quantum computers operate on completely different principles to existing computers, which makes them really well suited to solving particular mathematical problems, like finding the prime factors of very large numbers. Since prime factorisation is so important in cryptography, it’s likely that quantum computers would quickly be able to crack many of the systems that keep our online information secure. Because of these risks, researchers are already trying to develop technology that is resistant to quantum hacking, and on the flip side of that, it’s possible that quantum-based cryptographic systems would be much more secure than their conventional analogues.

Researchers are also excited about the prospect of using quantum computers to model complicated chemical reactions, a task that conventional supercomputers aren’t very good at. In July 2016, Google engineers used a quantum device to simulate a hydrogen molecule for the first time, and since then IBM has managed to model the behaviour of even more complex molecules. Eventually, researchers hope they’ll be able to use quantum simulations to design entirely new molecules for use in medicine. But the holy grail for quantum chemists is to be able to model the Haber-Bosch process, a way of artificially producing ammonia that is still relatively inefficient. Researchers are hoping that if they can use quantum mechanics to work out what’s going on inside that reaction, they could discover new ways to make the process much more efficient.

Continue reading here:

Quantum computers – WIRED UK

Quantum computing: A simple introduction – Explain that Stuff

by Chris Woodford. Last updated: November 1, 2017.

How can you get more and more out of less and less? The smaller computers get, the more powerful they seem to become: there’s more number-crunching ability in a 21st-century cellphone than you’d have found in a room-sized, military computer 50 years ago. Yet, despite such amazing advances, there are still plenty of complex problems that are beyond the reach of even the world’s most powerful computers, and there’s no guarantee we’ll ever be able to tackle them. One problem is that the basic switching and memory units of computers, known as transistors, are now approaching the point where they’ll soon be as small as individual atoms. If we want computers that are smaller and more powerful than today’s, we’ll soon need to do our computing in a radically different way. Entering the realm of atoms opens up powerful new possibilities in the shape of quantum computing, with processors that could work millions of times faster than the ones we use today. Sounds amazing, but the trouble is that quantum computing is hugely more complex than traditional computing and operates in the Alice in Wonderland world of quantum physics, where the “classical,” sensible, everyday laws of physics no longer apply. What is quantum computing and how does it work? Let’s take a closer look!

Photo: Quantum computing means storing and processing information using individual atoms, ions, electrons, or photons. On the plus side, this opens up the possibility of faster computers, but the drawback is the greater complexity of designing computers that can operate in the weird world of quantum physics. Photo courtesy of US Department of Energy.

You probably think of a computer as a neat little gadget that sits on your lap and lets you send emails, shop online, chat to your friends, or play games, but it’s much more, and much less, than that. It’s more, because it’s a completely general-purpose machine: you can make it do virtually anything you like. It’s less, because inside it’s little more than an extremely basic calculator, following a prearranged set of instructions called a program. Like the Wizard of Oz, the amazing things you see in front of you conceal some pretty mundane stuff under the covers.

Photo: This is what one transistor from a typical radio circuit board looks like. In computers, the transistors are much smaller than this and millions of them are packaged together onto microchips.

Conventional computers have two tricks that they do really well: they can store numbers in memory and they can process stored numbers with simple mathematical operations (like add and subtract). They can do more complex things by stringing together the simple operations into a series called an algorithm (multiplying can be done as a series of additions, for example). Both of a computer’s key tricks, storage and processing, are accomplished using switches called transistors, which are like microscopic versions of the switches you have on your wall for turning the lights on and off. A transistor can either be on or off, just as a light can either be lit or unlit. If it’s on, we can use a transistor to store a number one (1); if it’s off, it stores a number zero (0). Long strings of ones and zeros can be used to store any number, letter, or symbol using a code based on binary (so computers store an upper-case letter A as 1000001 and a lower-case one as 01100001). Each of the zeros or ones is called a binary digit (or bit) and, with a string of eight bits, you can store 256 different characters (such as A-Z, a-z, 0-9, and most common symbols). Computers calculate by using circuits called logic gates, which are made from a number of transistors connected together. Logic gates compare patterns of bits, stored in temporary memories called registers, and then turn them into new patterns of bits, and that’s the computer equivalent of what our human brains would call addition, subtraction, or multiplication. In physical terms, the algorithm that performs a particular calculation takes the form of an electronic circuit made from a number of logic gates, with the output from one gate feeding in as the input to the next.
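As a small worked example of the encoding and logic-gate ideas above (added here for illustration, not part of the original article), the Python snippet below prints the binary codes for ‘A’ and ‘a’ and builds a couple of gates and a half adder out of nothing but bit operations, loosely mirroring how gates are wired up from transistors.

```python
# The character code for 'A' is 65, which is 1000001 in binary;
# 'a' is 97, which is 1100001 (often padded to 8 bits as 01100001).
print(format(ord('A'), '07b'))   # 1000001
print(format(ord('a'), '08b'))   # 01100001

# Logic gates operate on individual bits. Here they are as tiny functions;
# in hardware each would be a handful of transistors.
def AND(a, b): return a & b
def XOR(a, b): return a ^ b
def NOT(a):    return 1 - a

# A half adder, built from two gates, adds two bits: (sum, carry).
def half_adder(a, b):
    return XOR(a, b), AND(a, b)

print(half_adder(1, 1))          # (0, 1): 1 + 1 = binary 10
```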

The trouble with conventional computers is that they depend on conventional transistors. This might not sound like a problem if you go by the amazing progress made in electronics over the last few decades. When the transistor was invented, back in 1947, the switch it replaced (which was called the vacuum tube) was about as big as one of your thumbs. Now, a state-of-the-art microprocessor (single-chip computer) packs hundreds of millions (and up to two billion) transistors onto a chip of silicon the size of your fingernail! Chips like these, which are called integrated circuits, are an incredible feat of miniaturization. Back in the 1960s, Intel co-founder Gordon Moore realized that the power of computers doubles roughly every 18 months, and it’s been doing so ever since. This apparently unshakeable trend is known as Moore’s Law.

Photo: This memory chip from a typical USB stick contains an integrated circuit that can store 512 megabytes of data. That’s roughly 500 million characters (536,870,912 to be exact), each of which needs eight binary digits, so we’re talking about 4 billion (4,000 million) transistors in all (4,294,967,296 if you’re being picky) packed into an area the size of a postage stamp!

It sounds amazing, and it is, but it misses the point. The more information you need to store, the more binary ones and zeros, and transistors, you need to do it. Since most conventional computers can only do one thing at a time, the more complex the problem you want them to solve, the more steps they’ll need to take and the longer they’ll need to do it. Some computing problems are so complex that they need more computing power and time than any modern machine could reasonably supply; computer scientists call those intractable problems.

As Moore’s Law advances, so the number of intractable problemsdiminishes: computers get more powerful and we can do more with them.The trouble is, transistors are just about as small as we can makethem: we’re getting to the point where the laws of physics seem likelyto put a stop to Moore’s Law. Unfortunately, there are still hugelydifficult computing problems we can’t tackle because even the mostpowerful computers find them intractable. That’s one of the reasonswhy people are now getting interested in quantum computing.

Quantum theory is the branch of physics that deals with the world of atoms and the smaller (subatomic) particles inside them. You might think atoms behave the same way as everything else in the world, in their own tiny little way, but that’s not true: on the atomic scale, the rules change and the “classical” laws of physics we take for granted in our everyday world no longer automatically apply. As Richard P. Feynman, one of the greatest physicists of the 20th century, once put it: “Things on a very small scale behave like nothing you have any direct experience about… or like anything that you have ever seen.” (Six Easy Pieces, p116.)

If you’ve studied light, you may already know a bit about quantumtheory. You might know that a beam of light sometimes behaves asthough it’s made up of particles (like a steady stream ofcannonballs), and sometimes as though it’s waves of energy ripplingthrough space (a bit like waves on the sea). That’s called wave-particle dualityand it’s one of the ideas that comes to us from quantum theory. It’s hard to grasp thatsomething can be two things at oncea particle and awavebecause it’s totally alien to our everyday experience: a car isnot simultaneously a bicycle and a bus. In quantum theory, however,that’s just the kind of crazy thing that can happen. The most striking example of this is the baffling riddle known as Schrdinger’s cat. Briefly, in the weird world ofquantum theory, we can imagine a situation where something like a catcould be alive and dead at the same time!

What does all this have to do with computers? Suppose we keep on pushing Moore’s Law, and keep on making transistors smaller until they get to the point where they obey not the ordinary laws of physics (like old-style transistors) but the more bizarre laws of quantum mechanics. The question is whether computers designed this way can do things our conventional computers can’t. If we can predict mathematically that they might be able to, can we actually make them work like that in practice?

People have been asking those questions for several decades. Among the first were IBM research physicists Rolf Landauer and Charles H. Bennett. Landauer opened the door for quantum computing in the 1960s when he proposed that information is a physical entity that could be manipulated according to the laws of physics. One important consequence of this is that computers waste energy manipulating the bits inside them (which is partly why computers use so much energy and get so hot, even though they appear to be doing not very much at all). In the 1970s, building on Landauer’s work, Bennett showed how a computer could circumvent this problem by working in a “reversible” way, implying that a quantum computer could carry out massively complex computations without using massive amounts of energy. In 1981, physicist Paul Benioff from Argonne National Laboratory tried to envisage a basic machine that would work in a similar way to an ordinary computer but according to the principles of quantum physics. The following year, Richard Feynman sketched out roughly how a machine using quantum principles could carry out basic computations. A few years later, Oxford University’s David Deutsch (one of the leading lights in quantum computing) outlined the theoretical basis of a quantum computer in more detail. How did these great scientists imagine that quantum computers might work?

The key features of an ordinary computer, such as bits, registers, logic gates, and algorithms, have analogous features in a quantum computer. Instead of bits, a quantum computer has quantum bits or qubits, which work in a particularly intriguing way. Where a bit can store either a zero or a 1, a qubit can store a zero, a one, both zero and one, or an infinite number of values in between, and be in multiple states (store multiple values) at the same time! If that sounds confusing, think back to light being a particle and a wave at the same time, Schrödinger’s cat being alive and dead, or a car being a bicycle and a bus. A gentler way to think of the numbers qubits store is through the physics concept of superposition (where two waves add to make a third one that contains both of the originals). If you blow on something like a flute, the pipe fills up with a standing wave: a wave made up of a fundamental frequency (the basic note you’re playing) and lots of overtones or harmonics (higher-frequency multiples of the fundamental). The wave inside the pipe contains all these waves simultaneously: they’re added together to make a combined wave that includes them all. Qubits use superposition to represent multiple states (multiple numeric values) simultaneously in a similar way.
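One way to see how a register of qubits “holds” many values at once, by analogy with the standing-wave picture above, is to write an n-qubit register as a list of 2^n amplitudes, one for every n-bit number. The sketch below is an added illustration (not from the article) that builds the equal superposition of all eight values of a 3-qubit register.

```python
import math

def uniform_register(num_qubits):
    """Amplitudes of an n-qubit register in an equal superposition.

    The register has one amplitude per n-bit number, so the list has
    2**n entries; every outcome is equally likely when measured.
    """
    size = 2 ** num_qubits
    amplitude = 1 / math.sqrt(size)
    return [amplitude] * size

state = uniform_register(3)
print(len(state))                             # 8 amplitudes, for the numbers 0..7
print(round(sum(a ** 2 for a in state), 10))  # probabilities sum to 1.0
```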

Just as a quantum computer can store multiple numbers at once, so it can process them simultaneously. Instead of working in serial (doing a series of things one at a time in a sequence), it can work in parallel (doing multiple things at the same time). Only when you try to find out what state it’s actually in at any given moment (by measuring it, in other words) does it “collapse” into one of its possible states, and that gives you the answer to your problem. Estimates suggest a quantum computer’s ability to work in parallel would make it millions of times faster than any conventional computer… if only we could build it! So how would we do that?
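The “collapse” on measurement can be mimicked classically by sampling one outcome with probability equal to each amplitude’s squared magnitude. The sketch below (illustrative only, with hypothetical names) does exactly that for any list of amplitudes, such as the equal superposition from the previous snippet.

```python
import random

def measure(amplitudes):
    """Pick one basis state with probability |amplitude|**2 (simulated collapse)."""
    probabilities = [abs(a) ** 2 for a in amplitudes]
    outcome = random.choices(range(len(amplitudes)), weights=probabilities)[0]
    # After measurement the register "collapses": only the observed
    # outcome survives, with amplitude 1.
    collapsed = [0.0] * len(amplitudes)
    collapsed[outcome] = 1.0
    return outcome, collapsed

# Equal superposition of the 3-bit numbers 0..7: each comes up ~1/8 of the time.
amplitudes = [1 / 8 ** 0.5] * 8
outcome, collapsed = measure(amplitudes)
print(outcome, collapsed)
```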

In reality, qubits would have to be stored by atoms, ions (atoms with too many or too few electrons), or even smaller things such as electrons and photons (energy packets), so a quantum computer would be almost like a table-top version of the kind of particle physics experiments they do at Fermilab or CERN! Now you wouldn’t be racing particles round giant loops and smashing them together, but you would need mechanisms for containing atoms, ions, or subatomic particles, for putting them into certain states (so you can store information), knocking them into other states (so you can make them process information), and figuring out what their states are after particular operations have been performed.

Photo: A single atom can be trapped in an optical cavity, the space between mirrors, and controlled by precise pulses from laser beams.

In practice, there are lots of possible ways of containing atoms and changing their states using laser beams, electromagnetic fields, radio waves, and an assortment of other techniques. One method is to make qubits using quantum dots, which are nanoscopically tiny particles of semiconductors inside which individual charge carriers, electrons and holes (missing electrons), can be controlled. Another method makes qubits from what are called ion traps: you add or take away electrons from an atom to make an ion, hold it steady in a kind of laser spotlight (so it’s locked in place like a nanoscopic rabbit dancing in a very bright headlight), and then flip it into different states with laser pulses. In another technique, the qubits are photons inside optical cavities (spaces between extremely tiny mirrors). Don’t worry if you don’t understand; not many people do! Since the entire field of quantum computing is still largely abstract and theoretical, the only thing we really need to know is that qubits are stored by atoms or other quantum-scale particles that can exist in different states and be switched between them.

Although people often assume that quantum computers must automatically be better than conventional ones, that’s by no means certain. So far, just about the only thing we know for certain that a quantum computer could do better than a normal one is factorisation: finding two unknown prime numbers that, when multiplied together, give a third, known number. In 1994, while working at Bell Laboratories, mathematician Peter Shor demonstrated an algorithm that a quantum computer could follow to find the “prime factors” of a large number, which would speed up the problem enormously. Shor’s algorithm really excited interest in quantum computing because virtually every modern computer (and every secure, online shopping and banking website) uses public-key encryption technology based on the virtual impossibility of finding prime factors quickly (it is, in other words, essentially an “intractable” computer problem). If quantum computers could indeed factor large numbers quickly, today’s online security could be rendered obsolete at a stroke.
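To see why factoring is treated as intractable on classical hardware, here is a rough illustration (this is plain trial division, not Shor’s algorithm): the number of divisions needed grows roughly with the square root of the number being factored, which for the hundreds-of-digits numbers used in real public-key cryptography is astronomically large.

```python
def trial_division(n):
    """Find a prime factor of n by checking divisors up to sqrt(n)."""
    divisions = 0
    d = 2
    while d * d <= n:
        divisions += 1
        if n % d == 0:
            return d, n // d, divisions
        d += 1
    return n, 1, divisions  # n itself is prime

# A toy "public key" modulus: the product of two small primes.
print(trial_division(101 * 103))   # (101, 103, 100): 100 divisions needed
# Doubling the number of digits roughly squares the work; real keys have
# hundreds of digits, far beyond any classical machine's reach.
```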

Does that mean quantum computers are better than conventional ones? Not exactly. Apart from Shor’s algorithm, and a search method called Grover’s algorithm, hardly any other algorithms have been discovered that would be better performed by quantum methods. Given enough time and computing power, conventional computers should still be able to solve any problem that quantum computers could solve, eventually. In other words, it remains to be proven that quantum computers are generally superior to conventional ones, especially given the difficulties of actually building them. Who knows how conventional computers might advance in the next 50 years, potentially making the idea of quantum computers irrelevant and even absurd.

Photo: Quantum dots are probably best known as colorful nanoscale crystals, but they can also be used as qubits in quantum computers. Photo courtesy of Argonne National Laboratory.

Three decades after they were first proposed, quantum computers remain largely theoretical. Even so, there’s been some encouraging progress toward realizing a quantum machine. There were two impressive breakthroughs in 2000. First, Isaac Chuang (now an MIT professor, but then working at IBM’s Almaden Research Center) used five fluorine atoms to make a crude, five-qubit quantum computer. The same year, researchers at Los Alamos National Laboratory figured out how to make a seven-qubit machine using a drop of liquid. Five years later, researchers at the University of Innsbruck added an extra qubit and produced the first quantum computer that could manipulate a qubyte (eight qubits).

These were tentative but important first steps. Over the next few years, researchers announced more ambitious experiments, adding progressively greater numbers of qubits. By 2011, a pioneering Canadian company called D-Wave Systems announced in Nature that it had produced a 128-qubit machine. Three years later, Google announced that it was hiring a team of academics (including University of California at Santa Barbara physicist John Martinis) to develop its own quantum computers based on D-Wave’s approach. In March 2015, the Google team announced they were “a step closer to quantum computation,” having developed a new way for qubits to detect and protect against errors. In 2016, MIT’s Isaac Chuang and scientists from the University of Innsbruck unveiled a five-qubit, ion-trap quantum computer that could calculate the factors of 15; one day, a scaled-up version of this machine might evolve into the long-promised, fully fledged encryption buster! There’s no doubt that these are hugely important advances. Even so, it’s very early days for the whole field, and most researchers agree that we’re unlikely to see practical quantum computers appearing for many years, perhaps even decades.

Read the original post:

Quantum computing: A simple introduction – Explain that Stuff


What is quantum computing? – Definition from WhatIs.com

Quantum computing is the area of study focused on developing computer technology based on the principles of quantum theory, which explains the nature and behavior of energy and matter on the quantum (atomic and subatomic) level. Development of a quantum computer, if practical, would mark a leap forward in computing capability far greater than that from the abacus to a modern day supercomputer, with performance gains in the billion-fold realm and beyond. The quantum computer, following the laws of quantum physics, would gain enormous processing power through the ability to be in multiple states, and to perform tasks using all possible permutations simultaneously. Current centers of research in quantum computing include MIT, IBM, Oxford University, and the Los Alamos National Laboratory.

The essential elements of quantum computing originated with Paul Benioff, working at Argonne National Labs, in 1981. He theorized a classical computer operating with some quantum mechanical principles. But it is generally accepted that David Deutsch of Oxford University provided the critical impetus for quantum computing research. In 1984, he was at a computation theory conference and began to wonder about the possibility of designing a computer that was based exclusively on quantum rules, then published his breakthrough paper a few months later. With this, the race began to exploit his ideas. However, before we delve into what he started, it is beneficial to have a look at the background of the quantum world.

Quantum theory’s development began in 1900 with a presentation by Max Planck to the German Physical Society, in which he introduced the idea that energy exists in individual units (which he called “quanta”), as does matter. Further developments by a number of scientists over the following thirty years led to the modern understanding of quantum theory.

Niels Bohr proposed the Copenhagen interpretation of quantum theory, which asserts that a particle is whatever it is measured to be (for example, a wave or a particle) but that it cannot be assumed to have specific properties, or even to exist, until it is measured. In short, Bohr was saying that objective reality does not exist. This translates to a principle called superposition that claims that while we do not know what the state of any object is, it is actually in all possible states simultaneously, as long as we don’t look to check.

To illustrate this theory, we can use the famous and somewhat cruel analogy of Schrodinger’s Cat. First, we have a living cat and place it in a thick lead box. At this stage, there is no question that the cat is alive. We then throw in a vial of cyanide and seal the box. We do not know if the cat is alive or if it has broken the cyanide capsule and died. Since we do not know, the cat is both dead and alive, according to quantum law – in a superposition of states. It is only when we break open the box and see what condition the cat is in that the superposition is lost, and the cat must be either alive or dead.

The second interpretation of quantum theory is the multiverse or many-worlds theory. It holds that as soon as a potential exists for any object to be in any state, the universe of that object transmutes into a series of parallel universes equal to the number of possible states in which the object can exist, with each universe containing a unique single possible state of that object. Furthermore, there is a mechanism for interaction between these universes that somehow permits all states to be accessible in some way and for all possible states to be affected in some manner. Stephen Hawking and the late Richard Feynman are among the scientists who have expressed a preference for the many-worlds theory.

Whichever argument one chooses, the principle that, in some way, one particle can exist in numerous states opens up profound implications for computing.

Classical computing relies, at its ultimate level, on principles expressed by Boolean algebra, usually operating with a 7-mode logic-gate principle, though it is possible to get by with only three modes (AND, NOT, and COPY). Data must be processed in an exclusive binary state at any point in time: either 0 (off/false) or 1 (on/true). These values are binary digits, or bits. The millions of transistors and capacitors at the heart of computers can only be in one state at any point. While the time that each transistor or capacitor needs to be in a 0 or 1 state before switching is now measurable in billionths of a second, there is still a limit to how quickly these devices can be made to switch states. As we progress to smaller and faster circuits, we begin to reach the physical limits of materials and the threshold for classical laws of physics to apply. Beyond this, the quantum world takes over, which opens a potential as great as the challenges that are presented.

A quantum computer, by contrast, can work with a two-mode logic gate: XOR and a mode we’ll call QO1 (the ability to change 0 into a superposition of 0 and 1, a logic gate which cannot exist in classical computing). In a quantum computer, a number of elemental particles such as electrons or photons can be used (in practice, success has also been achieved with ions), with either their charge or polarization acting as a representation of 0 and/or 1. Each of these particles is known as a quantum bit, or qubit; the nature and behavior of these particles form the basis of quantum computing. The two most relevant aspects of quantum physics are the principles of superposition and entanglement.
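The gate described above as “QO1”, turning a definite 0 into a superposition of 0 and 1, is conventionally called the Hadamard gate. Here is a minimal sketch (added for illustration, not part of the original article) of applying its 2x2 matrix to a single-qubit state vector.

```python
import math

# Hadamard gate: maps |0> to (|0> + |1>)/sqrt(2), an equal superposition.
H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

def apply_gate(gate, state):
    """Multiply a 2x2 gate matrix by a 2-entry state vector."""
    return [gate[0][0] * state[0] + gate[0][1] * state[1],
            gate[1][0] * state[0] + gate[1][1] * state[1]]

zero = [1, 0]                 # the definite-0 state
print(apply_gate(H, zero))    # [0.707..., 0.707...]: a superposition of 0 and 1

# Applying H twice undoes it, illustrating that quantum gates are reversible.
print(apply_gate(H, apply_gate(H, zero)))  # back to (approximately) [1, 0]
```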

Think of a qubit as an electron in a magnetic field. The electron’s spin may be either in alignment with the field, which is known as a spin-up state, or opposite to the field, which is known as a spin-down state. Changing the electron’s spin from one state to another is achieved by using a pulse of energy, such as from a laser; let’s say that we use 1 unit of laser energy. But what if we only use half a unit of laser energy and completely isolate the particle from all external influences? According to quantum law, the particle then enters a superposition of states, in which it behaves as if it were in both states simultaneously. Each qubit utilized could take a superposition of both 0 and 1. Thus, the number of computations that a quantum computer could undertake is 2^n, where n is the number of qubits used. A quantum computer comprising 500 qubits would have the potential to do 2^500 calculations in a single step. This is an awesome number: 2^500 is vastly more than the number of atoms there are in the known universe (this is true parallel processing; classical computers today, even so-called parallel processors, still only truly do one thing at a time: there are just two or more of them doing it). But how will these particles interact with each other? They would do so via quantum entanglement.
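Before moving on to entanglement, the 2^n figure quoted above is easy to check: describing an n-qubit register classically requires 2^n amplitudes, so the bookkeeping explodes very quickly. A quick, hedged illustration:

```python
# Number of amplitudes needed to describe an n-qubit register classically.
for n in (1, 2, 10, 50, 500):
    print(n, "qubits ->", 2 ** n, "amplitudes")

# Even at 50 qubits that is about 10**15 numbers; at 500 qubits, 2**500
# (roughly 3 x 10**150) dwarfs the estimated ~10**80 atoms in the
# observable universe, which is why such states cannot simply be
# written down and simulated classically.
```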

Entanglement: particles (such as photons, electrons, or qubits) that have interacted at some point retain a type of connection and can be entangled with each other in pairs, in a process known as correlation. Knowing the spin state of one entangled particle, up or down, allows one to know that the spin of its mate is in the opposite direction. Even more amazing is the knowledge that, due to the phenomenon of superposition, the measured particle has no single spin direction before being measured, but is simultaneously in both a spin-up and spin-down state. The spin state of the particle being measured is decided at the time of measurement and communicated to the correlated particle, which simultaneously assumes the opposite spin direction to that of the measured particle. This is a real phenomenon (Einstein called it “spooky action at a distance”), the mechanism of which cannot, as yet, be explained by any theory; it simply must be taken as given. Quantum entanglement allows qubits that are separated by incredible distances to interact with each other instantaneously (not limited to the speed of light). No matter how great the distance between the correlated particles, they will remain entangled as long as they are isolated.
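A minimal way to see entanglement in terms of amplitudes (an illustrative sketch added here, not from the original definition): start two qubits in 00, put the first into superposition, then apply a controlled-NOT. The result is the so-called Bell state, in which measuring one qubit fixes the other.

```python
import math
import random

s = 1 / math.sqrt(2)

# Two-qubit states are lists of four amplitudes, ordered |00>, |01>, |10>, |11>.
# A Hadamard on the first qubit followed by a CNOT (first qubit controls the
# second) turns |00> into the Bell state (|00> + |11>)/sqrt(2).
bell = [s, 0.0, 0.0, s]

def measure_pair(amplitudes):
    """Sample a joint outcome; here the two bits always come out correlated."""
    probabilities = [abs(a) ** 2 for a in amplitudes]
    index = random.choices(range(4), weights=probabilities)[0]
    return index >> 1, index & 1   # (first qubit, second qubit)

print([measure_pair(bell) for _ in range(5)])
# Only (0, 0) or (1, 1) ever appears: learning one qubit's result
# immediately tells you the other's, the hallmark of entanglement.
```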

Taken together, quantum superposition and entanglement create an enormously enhanced computing power. Where a 2-bit register in an ordinary computer can store only one of four binary configurations (00, 01, 10, or 11) at any given time, a 2-qubit register in a quantum computer can store all four numbers simultaneously, because each qubit represents two values. If more qubits are added, the increased capacity is expanded exponentially.

Perhaps even more intriguing than the sheer power of quantum computing is the ability that it offers to write programs in a completely new way. For example, a quantum computer could incorporate a programming sequence that would be along the lines of “take all the superpositions of all the prior computations” – something which is meaningless with a classical computer – which would permit extremely fast ways of solving certain mathematical problems, such as factorization of large numbers, one example of which we discuss below.

There have been two notable successes thus far with quantum programming. The first came in 1994 from Peter Shor (now at AT&T Labs), who developed a quantum algorithm that could efficiently factorize large numbers. It centers on a system that uses number theory to estimate the periodicity of a large number sequence. The other major breakthrough came from Lov Grover of Bell Labs in 1996, with a very fast algorithm that is proven to be the fastest possible for searching through unstructured databases. The algorithm is so efficient that it requires only about the square root of N searches on average (where N is the total number of elements) to find the desired result, as opposed to a search in classical computing, which on average needs N/2 searches.
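The speed-up claimed for Grover’s algorithm is easy to tabulate: an unstructured search over N items needs about N/2 lookups on average classically, versus on the order of the square root of N quantum queries. A small comparison, added here for illustration:

```python
import math

# Average classical lookups (N/2) versus the order-of-sqrt(N) queries
# Grover's algorithm needs for an unstructured search over N items.
for n in (1_000, 1_000_000, 1_000_000_000):
    print(f"N={n:>13,}  classical ~{n // 2:>13,}  Grover ~{round(math.sqrt(n)):>7,}")
```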

The above sounds promising, but there are tremendous obstacles still to be overcome. Chief among the problems with quantum computing are keeping qubits isolated from outside interference (decoherence), correcting the errors that creep in when that isolation fails, and reading out results without destroying the computation.

Even though there are many problems to overcome, the breakthroughs of the last 15 years, and especially of the last three, have made some form of practical quantum computing no longer seem unfeasible, but there is much debate as to whether this is less than a decade away or a hundred years into the future. However, the potential that this technology offers is attracting tremendous interest from both the government and the private sector. Military applications include the ability to break encryption keys via brute-force searches, while civilian applications range from DNA modeling to complex material science analysis. It is this potential that is rapidly breaking down the barriers to this technology, but whether all barriers can be broken, and when, is very much an open question.

Read the original post:

What is quantum computing? – Definition from WhatIs.com


Is Quantum Computing an Existential Threat to Blockchain …

Amid steep gains in value and wild headlines, it’s easy to forget cryptocurrencies and blockchain aren’t yet mainstream. Even so, fans of the technology believe blockchain has too much potential not to have a major sustained impact in the future.

But as is usually the case when pondering what’s ahead, nothing is certain.

When considering existential threats to blockchain and cryptocurrencies, people generally focus on increased regulation. And this makes sense. In the medium term, greater regulation may stand in the way of cryptocurrencies and wider mainstream adoption. However, there might be a bigger threat further out on the horizon.

Much of blockchain's allure arises from its security benefits. The tech allows a ledger of transactions to be distributed between a large network of computers. No single user can break into and change the ledger. This makes it both public and secure.

But combined with another emerging (and much hyped) technology, quantum computing, blockchain's seemingly immutable ledgers would be under threat.

Like blockchain, quantum computing has been making progress and headlines too.

The number of quantum computing companies and researchers continues to grow. And while there is a lot of focus on hardware, many are looking into the software as well.

Cryptography is a commonly debated topic because quantum computing poses a threat to traditional forms of computer security, most notably public key cryptography, which undergirds most online communications and most current blockchain technology.

But first, how does computer security work today?

Public key cryptography uses a pair of keys to encrypt information: a public key, which can be shared widely, and a private key known only to the key's owner. Anyone can encrypt a message using the intended receiver's public key, but only the receiver can decrypt the message using her private key. The more difficult it is to determine a private key from its corresponding public key, the more secure the system.

The best public key cryptography systems link public and private keys using the factors of a number that is the product of two incredibly large prime numbers. To determine the private key from the public key alone, one would have to figure out the factors of this product of primes. Because of the size of the primes in question, even a classical computer testing a trillion keys a second would take up to 785 million times longer than the roughly 14 billion years the universe has existed so far.
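
As a toy illustration of that key relationship (a minimal sketch with deliberately tiny primes and illustrative helper names; real keys use primes hundreds of digits long), the snippet below builds an RSA-style key pair and shows that an attacker who can factor the public modulus recovers the private key directly:

```python
from math import gcd

# Toy RSA-style key generation with deliberately tiny primes.
p, q = 61, 53                 # secret primes
n = p * q                     # public modulus (3233)
phi = (p - 1) * (q - 1)       # kept secret along with p and q
e = 17                        # public exponent, coprime to phi
d = pow(e, -1, phi)           # private exponent (modular inverse, Python 3.8+)

message = 42
ciphertext = pow(message, e, n)    # encrypt with the public key (e, n)
decrypted = pow(ciphertext, d, n)  # decrypt with the private key (d, n)
assert decrypted == message

def factor(n):
    """Brute-force factoring; only viable because n is tiny."""
    for candidate in range(2, int(n ** 0.5) + 1):
        if n % candidate == 0:
            return candidate, n // candidate

# Anyone who can factor n can rebuild the private key from the public one.
p2, q2 = factor(n)
d_recovered = pow(e, -1, (p2 - 1) * (q2 - 1))
assert d_recovered == d
```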

If processing power were to greatly increase, however, then it might become possible for an entity exercising such computing power to generate a private key from the corresponding public key. If actors could generate private keys from corresponding public keys, then even the strongest forms of traditional public key cryptography would be vulnerable.

This is where quantum computing comes in. Quantum computing relies on quantum physics and has more potential power than any traditional form of computing.

Quantum computing takes advantage of quantum bits or qubits that can exist in any superposition of values between 0 and 1 and can therefore process much more information than just 0 or 1, which is the limit of classical computing systems.

The capacity to compute using qubits renders quantum computers many orders of magnitude faster than classical computers. Google showed a D-Wave quantum annealing computer could be 100 million times faster than classical computers at certain specialized tasks. And Google and IBM are working on their own quantum computers.

Further, although there are but a handful of quantum computing algorithms, one of the most famous, Shor's algorithm, allows for the quick factoring of large numbers into their prime factors. Therefore, a working quantum computer could, in theory, break today's public key cryptography.
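
Shor's algorithm gets its speedup from a quantum subroutine that finds the period of modular exponentiation; the rest of the factoring reduction is classical. The sketch below is a toy illustration only: a brute-force period finder stands in for the quantum part, and the classical post-processing then splits N = 15.

```python
from math import gcd

def find_period(a, N):
    """Stand-in for Shor's quantum subroutine: smallest r > 0
    with a**r ≡ 1 (mod N). Brute force only works for tiny N."""
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

N, a = 15, 7                      # a must be coprime to N
assert gcd(a, N) == 1
r = find_period(a, N)             # r = 4 for a = 7, N = 15
if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
    f1 = gcd(pow(a, r // 2) - 1, N)
    f2 = gcd(pow(a, r // 2) + 1, N)
    print(f"{N} = {f1} x {f2}")   # prints 15 = 3 x 5
```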

Quantum computers capable of speedy number factoring are not here yet. However, if quantum computing continues to progress, it will get there eventually. And when it does, this advance will pose an existential threat to public key cryptography, and the blockchain technology that relies on it, including Bitcoin, will be vulnerable to hacking.

So, is blockchain security therefore impossible in a post-quantum world? Will the advent of quantum computing render blockchain technology obsolete?

Maybe, but not if we can develop a solution first.

The NSA announced in 2015 that it was moving to implement quantum-resistant cryptographic systems. Cryptographers are working on quantum-resistant cryptography, and there are already blockchain projects implementing quantum-resistant cryptography. The Quantum Resistant Ledger team, for example, is working on building such a blockchain right now.

What makes quantum-resistant, or post-quantum, cryptography quantum resistant? Its public and private keys are linked through mathematical problems that are much harder to reverse than traditional prime factorization.

The Quantum Resistant Ledger team is working to implement hash-based cryptography, a form of post-quantum cryptography. In hash-based cryptography, key pairs are built from hash functions rather than from prime factorization. The relationship between the public and private keys is therefore much harder to invert than in traditional public key cryptography and would be much less vulnerable to a quantum computer running Shor's algorithm.
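
As a rough illustration of the hash-based idea, here is a minimal Lamport one-time signature sketch (not the specific scheme the QRL project uses): the public key is just hashes of secret values, so its security rests on the one-wayness of the hash rather than on factoring.

```python
import hashlib, os

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

# Key generation: 256 pairs of random secrets; the public key is their hashes.
secret_key = [(os.urandom(32), os.urandom(32)) for _ in range(256)]
public_key = [(H(a), H(b)) for a, b in secret_key]

def sign(message: bytes):
    """Reveal one secret per message-digest bit (one-time use only)."""
    digest = H(message)
    bits = [(digest[i // 8] >> (i % 8)) & 1 for i in range(256)]
    return [secret_key[i][bit] for i, bit in enumerate(bits)]

def verify(message: bytes, signature) -> bool:
    digest = H(message)
    bits = [(digest[i // 8] >> (i % 8)) & 1 for i in range(256)]
    return all(H(sig) == public_key[i][bit]
               for i, (sig, bit) in enumerate(zip(signature, bits)))

sig = sign(b"send 1 coin to Alice")
print(verify(b"send 1 coin to Alice", sig))   # True
print(verify(b"send 100 coins to Bob", sig))  # False
```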

These post-quantum cryptographic schemes do not need to run on quantum computers. The Quantum Resistant Ledger is a blockchain project already working to implement post-quantum cryptography. It remains to be seen how successful the effort and others like it will prove when full-scale quantum computing becomes a practical reality.

To be clear, quantum computing threatens all computer security systems that rely on public key cryptography, not just blockchain. All security systems, including blockchain systems, need to consider post-quantum cryptography to maintain data security for their systems. But the easiest and most efficient route may be to replace traditional systems with blockchain systems that implement quantum-resistant cryptography.

Disclosure: The author owns assorted digital assets. The author is also a principal at Crypto Lotus LLC, a cryptocurrency hedge fund based out of the San Francisco Bay Area, and an advisor at Green Sands Equity, both of which have positions in various digital assets. All opinions in this post are the author's alone and not those of Singularity University, Crypto Lotus, or Green Sands Equity. This post is not an endorsement by Singularity University, Crypto Lotus, or Green Sands Equity of any asset, and you should be aware of the risk of loss before trading or holding any digital asset.

Image Credit: Morrowind /Shutterstock.com

Read more from the original source:

Is Quantum Computing an Existential Threat to Blockchain …

Quantum computing in the NISQ era and beyond

Quantum computing in the NISQ era and beyond, Preskill, Q2B 2017

Today's paper is based on the keynote address given by John Preskill at the December 2017 Quantum Computing for Business (Q2B) conference. It provides a great overview of the state of quantum computing today, and what we might reasonably expect to see over the coming years.

we are now entering a pivotal new era in quantum technology. For this talk, I needed a name to describe this impending new era, so I made up a word: NISQ. This stands for Noisy Intermediate Scale Quantum.

Intermediate scale refers to computers with between 50 and a few hundred qubits. The 50-qubit milestone is significant because that takes us beyond what we can simulate by brute force using the most powerful existing supercomputers. Noisy emphasises that we'll have imperfect control over those qubits. Because of the noise, we expect a limit of about 1,000 gates in a circuit, i.e., 1,000 fundamental two-qubit operations. Executing a single gate is about 1,000 times slower on an ion-trap quantum processor than on a superconducting circuit.

Eventually we expect to be able to protect quantum systems and scale up quantum computers using the principle of quantum error correction. Unfortunately, there is a significant overhead cost for doing quantum error correction, so reliable quantum computers using quantum error correction are not likely to be available very soon.

For example, using quantum error correction we would need physical systems with millions of qubits in order to run algorithms involving thousands of protected (fault-tolerant) qubits. For the next few years, our limit is on the order of a hundred physical qubits.

Crossing the quantum chasm, from hundreds of physical qubits to millions of physical qubits, is going to take some time, but we'll get there eventually. It is important to realize that we will need significant advances in basic science as well as in systems engineering to attain fully scalable fault-tolerant quantum computers.

What about those D-Wave machines available today then, which already have 2,000 qubits? It's complicated, but these are not circuit-based quantum computers; rather, they are quantum annealers. We'll talk more about those later.

If quantum error correction is our basis for thinking that quantum computers will be scalable to large devices solving hard problems, quantum complexity is our basis for thinking that quantum computing is powerful. We have at least three good reasons for thinking that quantum computers have capabilities surpassing classical computers:

It's a remarkable claim, and one of the most amazing ideas I've encountered in my scientific life, that there is a distinction between problems that are classically hard and problems that are quantumly hard. And it is a compelling challenge to understand better what problems are classically hard but quantumly easy.

We don't expect quantum computers to be able to solve the hard instances of NP-hard problems. But when will quantum computers be able to solve problems we care about faster than classical computers, and for what problems?

Quantum speedup refers to a quantum computer solving a problem faster than competing classical computers using the best available hardware and running the best algorithm which performs the same task.

A few years ago I spoke enthusiastically about quantum supremacy as an impending milestone for human civilization. I suggested this term as a way to characterize computational tasks performable by quantum devices, where one could argue persuasively that no existing (or easily foreseeable) classical device could perform the same task, disregarding whether the task is useful in any other respect. But from a commercial perspective, obviously we should pay attention to whether the task is useful!

Preskill then goes on to outline several areas where quantum computers hold promise for outperforming their classical cousins, including optimisation, deep learning, matrix inversion, recommendation systems, semidefinite programming, and quantum simulation. Let's take a brief look at each of them.

We don't expect quantum computers to be able to efficiently solve worst-case instances of NP-hard problems (such as combinatorial optimisation), but they might be better than classical computers at finding approximate solutions. That is, they might find better approximations, and/or they might find approximations faster.

For many problems there is a big gap between the approximation achieved by the best classical algorithm we currently know and the barrier of NP-hardness. So it would not be shocking to discover that quantum computers have an advantage over classical ones for the task of finding approximate solutions, an advantage some users might find quite valuable.

The emerging approach is a hybrid quantum-classical algorithm where a quantum processor prepares an n-qubit state, and a classical optimiser processes the measurement outcomes, instructing the quantum processor how to alter the n-qubit state for the next iteration. Iteration continues until the algorithm converges on a quantum state from which the approximate solution can be extracted.

If applied to classical approximation problems, this goes by the name Quantum Approximate Optimization Algorithm (QAOA); when applied to quantum problems it is called a Variational Quantum Eigensolver (VQE).
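
A minimal sketch of that hybrid loop, assuming nothing beyond NumPy (the one-qubit "ansatz" and the Hamiltonian below are illustrative stand-ins, not any particular vendor's API): a classical optimizer repeatedly adjusts the circuit parameter based on measured energies.

```python
import numpy as np

def ansatz(theta):
    """Illustrative one-qubit ansatz: |psi(theta)> = RY(theta)|0>."""
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

# Illustrative problem Hamiltonian (a Pauli-Z term plus a transverse term).
H = np.array([[1.0, 0.5],
              [0.5, -1.0]])

def energy(theta):
    """The 'quantum' step: prepare the state and estimate <psi|H|psi>.
    On real hardware this expectation value comes from repeated measurements."""
    psi = ansatz(theta)
    return float(psi @ H @ psi)

# The classical optimizer: a simple shrinking parameter sweep.
theta, step = 0.0, 0.5
for _ in range(60):
    theta = min([theta - step, theta, theta + step], key=energy)
    step *= 0.9                       # narrow the search as we converge

print(f"theta ≈ {theta:.3f}, energy ≈ {energy(theta):.4f}")
# The exact ground-state energy of H is -sqrt(1.25) ≈ -1.1180
```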

Quantum annealers (such as the D-Wave 2000Q machine) are noisy versions of something called adiabatic quantum computing, and we don't have a convincing theoretical argument indicating that they can outperform the best classical hardware. (We do for the noiseless version.) So far quantum annealers have mostly been applied to cases where the annealing is stochastic, which means it is relatively easy for a classical computer to simulate what the quantum annealer is doing.

What's coming soon are non-stochastic quantum annealers, which may have greater potential for achieving speedups over what the best classical algorithms can do.

In quantum deep learning (or just quantum machine learning) we can construct quantum analogs of e.g. a restricted Boltzmann machine, but with the spins represented by qubits rather than classical bits.

It may be that quantum deep learning networks have advantages over classical ones; for example, they might be easier to train for some purposes. But we don't really know; it's another opportunity to try it and see how well it works. One possible reason for being hopeful about the potential of quantum machine learning rests on the concept known as QRAM: quantum random access memory.

QRAM can represent a large amount of classical data very succinctly, encoding a vector with N components in just log N qubits. Even with QRAM, though, we have to take into account the costs of encoding the input into QRAM in the first place. Moreover, when reading the results we can recover only log N classical bits (not N) from a single-shot measurement of log N qubits.
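
The "N components in log N qubits" claim refers to amplitude encoding: a normalized vector of length N becomes the amplitudes of a log2(N)-qubit state. A small NumPy illustration of the encoding itself (ignoring the nontrivial question of how to load it into hardware), including the readout catch:

```python
import numpy as np

data = np.array([3.0, 1.0, 4.0, 1.0, 5.0, 9.0, 2.0, 6.0])   # N = 8 components
state = data / np.linalg.norm(data)                          # amplitudes of 3 qubits

n_qubits = int(np.log2(len(data)))
print(f"{len(data)} numbers encoded in {n_qubits} qubits")    # 8 numbers in 3 qubits

# The catch on readout: a single measurement yields only one 3-bit outcome,
# sampled with probability |amplitude|^2, not the full vector.
probabilities = state ** 2
outcome = int(np.random.choice(len(data), p=probabilities))
print(f"measurement returned basis state |{outcome:03b}>")
```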

Thus quantum deep learning has the most advantage in scenarios where both the input and output are quantum states. Quantum deep learning networks might be very well suited for quantum tasks, but for applications of deep learning that are widely pursued today it is unclear why quantum networks would have an advantage.

QRAM also helps with matrix inversion, where the HHL algorithm gives an exponential quantum speedup, running in time O(log N). Once again, though, we have to pay the encoding costs if applying it to classical data.
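
Stated as a formula (this is the complexity commonly quoted for HHL, under the usual assumptions about sparsity, conditioning, and state preparation, which are assumptions rather than claims from the talk):

$$ |x\rangle \;\propto\; A^{-1}|b\rangle, \qquad T_{\mathrm{HHL}} \;=\; \tilde{O}\!\left(\frac{s^{2}\,\kappa^{2}}{\epsilon}\,\log N\right), $$

where s is the sparsity of A, κ its condition number, and ε the target precision; a classical solver needs time at least linear in N just to read the matrix.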

We do have good reason to believe this quantum matrix inversion algorithm is powerful, because it solves what we call a BQP-complete problem. That is, any problem that can be solved efficiently with a quantum computer can be encoded as an instance of this matrix inversion problem.

Unfortunately, HHL is not likely to be feasible in the NISQ era; the algorithm is probably just too expensive to be executed successfully by a quantum computer which doesn't use error correction.

A quantum algorithm has been proposed which gives an exponential speedup over the best currently known classical algorithm for the task of making high-value recommendations.

The goal is to recommend a product to a customer that the customer will probably like, based on limited knowledge of the preferences of that customer and other customers.

Whereas the classical algorithm attempts to reconstruct the full recommendation matrix, the quantum one samples efficiently from a low-rank approximation to the preference matrix.
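
To make the low-rank idea concrete, here is a purely classical NumPy sketch with made-up ratings (the quantum algorithm's advantage lies in sampling from such an approximation without building it explicitly):

```python
import numpy as np

# Rows are users, columns are products; 0 means "not yet rated".
ratings = np.array([[5, 0, 0, 1],
                    [4, 5, 1, 0],
                    [1, 0, 5, 4],
                    [0, 1, 4, 5]], dtype=float)

# Build a rank-2 approximation via the singular value decomposition.
U, s, Vt = np.linalg.svd(ratings, full_matrices=False)
k = 2
approx = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Recommend, for user 0, the unrated product with the highest predicted score.
user = 0
unrated = np.where(ratings[user] == 0)[0]
best = unrated[np.argmax(approx[user, unrated])]
print(f"recommend product {best} to user {user} "
      f"(predicted score {approx[user, best]:.2f})")
```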

This is a significant quantum speedup for a real-world application of machine learning, encouraging the hope that other such speedups will be discovered. However, we don't currently have a convincing theoretical argument indicating that the task performed by quantum recommendation systems (returning a high-quality recommendation in polylog(mn) time) is impossible classically.

Alas, the algorithm is also probably too costly to be convincingly validated in the NISQ era.

Semidefinite programming is the task of optimising a linear function subject to matrix inequality constraints. Convex optimization problems of this type have widespread applications. A recently discovered quantum algorithm finds approximate solutions to the problem in time polylog(N), an exponential speedup.

The good news: a quantum solver for semidefinite programs might be within reach of NISQ technology!

Finally, quantum computers should be really good for running quantum simulations! We can get started in the NISQ era, but the most exciting discoveries are perhaps beyond it. The theory of classical chaos advanced rapidly once we could simulate chaotic dynamical systems. Quantum simulation may promote similar advances in our understanding of quantum chaos.

Valuable insights might already be gleaned using noisy devices with of order 100 qubits.

Quantum technology is rife with exhilarating opportunities, and surely many rousing surprises lie ahead. But the challenges we face are still formidable. All quantumists should appreciate that our field can fulfill its potential only through sustained, inspired effort over decades. If we pay that price, the ultimate rewards will more than vindicate our efforts.


See the rest here:

Quantum computing in the NISQ era and beyond

The Era of Quantum Computing Is Here. Outlook: Cloudy | WIRED

After decades of heavy slog with no promise of success, quantum computing is suddenly buzzing with almost feverish excitement and activity. Nearly two years ago, IBM made a quantum computer available to the world: the 5-quantum-bit (qubit) resource they now call (a little awkwardly) the IBM Q experience. That seemed more like a toy for researchers than a way of getting any serious number crunching done. But 70,000 users worldwide have registered for it, and the qubit count in this resource has now quadrupled. In the past few months, IBM and Intel have announced that they have made quantum computers with 50 and 49 qubits, respectively, and Google is thought to have one waiting in the wings. "There is a lot of energy in the community, and the recent progress is immense," said physicist Jens Eisert of the Free University of Berlin.

Original story reprinted with permission from Quanta Magazine, an editorially independent publication of the Simons Foundation whose mission is to enhance public understanding of science by covering research developments and trends in mathematics and the physical and life sciences.

There is now talk of impending quantum supremacy: the moment when a quantum computer can carry out a task beyond the means of today's best classical supercomputers. That might sound absurd when you compare the bare numbers: 50 qubits versus the billions of classical bits in your laptop. But the whole point of quantum computing is that a quantum bit counts for much, much more than a classical bit. Fifty qubits has long been considered the approximate number at which quantum computing becomes capable of calculations that would take an unfeasibly long time classically. Midway through 2017, researchers at Google announced that they hoped to have demonstrated quantum supremacy by the end of the year. (When pressed for an update, a spokesperson recently said that "we hope to announce results as soon as we can, but we're going through all the detailed work to ensure we have a solid result before we announce.")

It would be tempting to conclude from all this that the basic problems are solved in principle and the path to a future of ubiquitous quantum computing is now just a matter of engineering. But that would be a mistake. The fundamental physics of quantum computing is far from solved and can't be readily disentangled from its implementation.

Even if we soon pass the quantum supremacy milestone, the next year or two might be the real crunch time for whether quantum computers will revolutionize computing. There's still everything to play for and no guarantee of reaching the big goal.

IBMs quantum computing center at the Thomas J. Watson Research Center in Yorktown Heights, New York, holds quantum computers in large cryogenic tanks (far right) that are cooled to a fraction of a degree above absolute zero.

Connie Zhou for IBM

Both the benefits and the challenges of quantum computing are inherent in the physics that permits it. The basic story has been told many times, though not always with the nuance that quantum mechanics demands. Classical computers encode and manipulate information as strings of binary digits, 1 or 0. Quantum bits do the same, except that they may be placed in a so-called superposition of the states 1 and 0, which means that a measurement of the qubit's state could elicit the answer 1 or 0 with some well-defined probability.

To perform a computation with many such qubits, they must all be sustained in interdependent superpositions of states, a quantum-coherent state in which the qubits are said to be entangled. That way, a tweak to one qubit may influence all the others. This means that somehow computational operations on qubits count for more than they do for classical bits. The computational resources increase in simple proportion to the number of bits for a classical device, but adding an extra qubit potentially doubles the resources of a quantum computer. This is why the difference between a 5-qubit and a 50-qubit machine is so significant.

Note that I've not said, as it often is said, that a quantum computer has an advantage because the availability of superpositions hugely increases the number of states it can encode, relative to classical bits. Nor have I said that entanglement permits many calculations to be carried out in parallel. (Indeed, a strong degree of qubit entanglement isn't essential.) There's an element of truth in those descriptions, some of the time, but none captures the essence of quantum computing.

Inside one of IBMs cryostats wired for a 50-qubit quantum system.

Connie Zhou for IBM

It's hard to say qualitatively why quantum computing is so powerful precisely because it is hard to specify what quantum mechanics means at all. The equations of quantum theory certainly show that it will work: that, at least for some classes of computation such as factorization or database searches, there is tremendous speedup of the calculation. But how exactly?

Perhaps the safest way to describe quantum computing is to say that quantum mechanics somehow creates a resource for computation that is unavailable to classical devices. As quantum theorist Daniel Gottesman of the Perimeter Institute in Waterloo, Canada, put it, "If you have enough quantum mechanics available, in some sense, then you have speedup, and if not, you don't."

Some things are clear, though. To carry out a quantum computation, you need to keep all your qubits coherent. And this is very hard. Interactions of a system of quantum-coherent entities with their surrounding environment create channels through which the coherence rapidly leaks out in a process called decoherence. Researchers seeking to build quantum computers must stave off decoherence, which they can currently do only for a fraction of a second. That challenge gets ever greater as the number of qubits, and hence the potential to interact with the environment, increases. This is largely why, even though quantum computing was first proposed by Richard Feynman in 1982 and the theory was worked out in the early 1990s, it has taken until now to make devices that can actually perform a meaningful computation.

There's a second fundamental reason why quantum computing is so difficult. Like just about every other process in nature, it is noisy. Random fluctuations, from heat in the qubits, say, or from fundamentally quantum-mechanical processes, will occasionally flip or randomize the state of a qubit, potentially derailing a calculation. This is a hazard in classical computing too, but it's not hard to deal with: you just keep two or more backup copies of each bit so that a randomly flipped bit stands out as the odd one out.

Researchers working on quantum computers have created strategies for how to deal with the noise. But these strategies impose a huge debt of computational overhead: all your computing power goes to correcting errors and not to running your algorithms. "Current error rates significantly limit the lengths of computations that can be performed," said Andrew Childs, the codirector of the Joint Center for Quantum Information and Computer Science at the University of Maryland. "We'll have to do a lot better if we want to do something interesting."

Andrew Childs, a quantum theorist at the University of Maryland, cautions that error rates are a fundamental concern for quantum computers.

Photo by John T. Consoli/University of Maryland

A lot of research on the fundamentals of quantum computing has been devoted to error correction. Part of the difficulty stems from another of the key properties of quantum systems: Superpositions can only be sustained as long as you don't measure the qubit's value. If you make a measurement, the superposition collapses to a definite value: 1 or 0. So how can you find out if a qubit has an error if you don't know what state it is in?

One ingenious scheme involves looking indirectly, by coupling the qubit to another ancilla qubit that doesn't take part in the calculation but that can be probed without collapsing the state of the main qubit itself. It's complicated to implement, though. Such solutions mean that, to construct a genuine logical qubit on which computation with error correction can be performed, you need many physical qubits.

How many? Quantum theorist Alán Aspuru-Guzik of Harvard University estimates that around 10,000 of today's physical qubits would be needed to make a single logical qubit, a totally impractical number. If the qubits get much better, he said, this number could come down to a few thousand or even hundreds. Eisert is less pessimistic, saying that on the order of 800 physical qubits might already be enough, but even so he agrees that the overhead is heavy, and for the moment we need to find ways of coping with error-prone qubits.

An alternative to correcting errors is avoiding them or canceling out their influence: so-called error mitigation. Researchers at IBM, for example, are developing schemes for figuring out mathematically how much error is likely to have been incurred in a computation and then extrapolating the output of a computation to the zero noise limit.

Some researchers think that the problem of error correction will prove intractable and will prevent quantum computers from achieving the grand goals predicted for them. "The task of creating quantum error-correcting codes is harder than the task of demonstrating quantum supremacy," said mathematician Gil Kalai of the Hebrew University of Jerusalem in Israel. And he adds that devices without error correction are computationally very primitive, and "primitive-based supremacy is not possible." In other words, you'll never do better than classical computers while you've still got errors.

Others believe the problem will be cracked eventually. According to Jay Gambetta, a quantum information scientist at IBM's Thomas J. Watson Research Center, "Our recent experiments at IBM have demonstrated the basic elements of quantum error correction on small devices, paving the way towards larger-scale devices where qubits can reliably store quantum information for a long period of time in the presence of noise." Even so, he admits that a universal fault-tolerant quantum computer, which has to use logical qubits, is still a long way off. Such developments make Childs cautiously optimistic. "I'm sure we'll see improved experimental demonstrations of [error correction], but I think it will be quite a while before we see it used for a real computation," he said.

For the time being, quantum computers are going to be error-prone, and the question is how to live with that. At IBM, researchers are talking about approximate quantum computing as the way the field will look in the near term: finding ways of accommodating the noise.

This calls for algorithms that tolerate errors, getting the correct result despite them. It's a bit like working out the outcome of an election regardless of a few wrongly counted ballot papers. "A sufficiently large and high-fidelity quantum computation should have some advantage [over a classical computation] even if it is not fully fault-tolerant," said Gambetta.

Lucy Reading-Ikkanda/Quanta Magazine

One of the most immediate error-tolerant applications seems likely to be of more value to scientists than to the world at large: to simulate stuff at the atomic level. (This, in fact, was the motivation that led Feynman to propose quantum computing in the first place.) The equations of quantum mechanics prescribe a way to calculate the properties, such as stability and chemical reactivity, of a molecule such as a drug. But they can't be solved classically without making lots of simplifications.

In contrast, "the quantum behavior of electrons and atoms," said Childs, "is relatively close to the native behavior of a quantum computer." So one could then construct an exact computer model of such a molecule. "Many in the community, including me, believe that quantum chemistry and materials science will be one of the first useful applications of such devices," said Aspuru-Guzik, who has been at the forefront of efforts to push quantum computing in this direction.

Quantum simulations are proving their worth even on the very small quantum computers available so far. A team of researchers including Aspuru-Guzik has developed an algorithm that they call the variational quantum eigensolver (VQE), which can efficiently find the lowest-energy states of molecules even with noisy qubits. So far it can only handle very small molecules with few electrons, which classical computers can already simulate accurately. But the capabilities are getting better, as Gambetta and coworkers showed last September when they used a 6-qubit device at IBM to calculate the electronic structures of molecules, including lithium hydride and beryllium hydride. The work was "a significant leap forward for the quantum regime," according to physical chemist Markus Reiher of the Swiss Federal Institute of Technology in Zurich, Switzerland. "The use of the VQE for the simulation of small molecules is a great example of the possibility of near-term heuristic algorithms," said Gambetta.

But even for this application, Aspuru-Guzik confesses that logical qubits with error correction will probably be needed before quantum computers truly begin to surpass classical devices. "I would be really excited when error-corrected quantum computing begins to become a reality," he said.

"If we had more than 200 logical qubits, we could do things in quantum chemistry beyond standard approaches," Reiher adds. "And if we had about 5,000 such qubits, then the quantum computer would be transformative in this field."

Despite the challenges of reaching those goals, the fast growth of quantum computers from 5 to 50 qubits in barely more than a year has raised hopes. But we shouldn't get too fixated on these numbers, because they tell only part of the story. What matters is not just, or even mainly, how many qubits you have, but how good they are, and how efficient your algorithms are.

Any quantum computation has to be completed before decoherence kicks in and scrambles the qubits. Typically, the groups of qubits assembled so far have decoherence times of a few microseconds. The number of logic operations you can carry out during that fleeting moment depends on how quickly the quantum gates can be switched; if this time is too slow, it really doesn't matter how many qubits you have at your disposal. The number of gate operations needed for a calculation is called its depth: low-depth (shallow) algorithms are more feasible than high-depth ones, but the question is whether they can be used to perform useful calculations.

What's more, not all qubits are equally noisy. In theory it should be possible to make very low-noise qubits from so-called topological electronic states of certain materials, in which the shape of the electron states used for encoding binary information confers a kind of protection against random noise. Researchers at Microsoft, most prominently, are seeking such topological states in exotic quantum materials, but there's no guarantee that they'll be found or will be controllable.

Researchers at IBM have suggested that the power of a quantum computation on a given device be expressed as a number called the quantum volume, which bundles up all the relevant factors: number and connectivity of qubits, depth of algorithm, and other measures of the gate quality, such as noisiness. It's really this quantum volume that characterizes the power of a quantum computation, and Gambetta said that the best way forward right now is to develop quantum-computational hardware that increases the available quantum volume.

This is one reason why the much vaunted notion of quantum supremacy is more slippery than it seems. The image of a 50-qubit (or so) quantum computer outperforming a state-of-the-art supercomputer sounds alluring, but it leaves a lot of questions hanging. Outperforming for which problem? How do you know the quantum computer has got the right answer if you can't check it with a tried-and-tested classical device? And how can you be sure that the classical machine wouldn't do better if you could find the right algorithm?

So quantum supremacy is a concept to handle with care. Some researchers prefer now to talk about quantum advantage, which refers to the speedup that quantum devices offer without making definitive claims about what is best. An aversion to the word supremacy has also arisen because of the racial and political implications.

Whatever you choose to call it, a demonstration that quantum computers can do things beyond current classical means would be psychologically significant for the field. "Demonstrating an unambiguous quantum advantage will be an important milestone," said Eisert; it would prove that quantum computers really can extend what is technologically possible.

That might still be more of a symbolic gesture than a transformation in useful computing resources. But such things may matter, because if quantum computing is going to succeed, it won't be simply by the likes of IBM and Google suddenly offering their classy new machines for sale. Rather, it'll happen through an interactive and perhaps messy collaboration between developers and users, and the skill set will evolve in the latter only if they have sufficient faith that the effort is worth it. This is why both IBM and Google are keen to make their devices available as soon as they're ready. As well as a 16-qubit IBM Q experience offered to anyone who registers online, IBM now has a 20-qubit version for corporate clients, including JP Morgan Chase, Daimler, Honda, Samsung and the University of Oxford. Not only will that help clients discover what's in it for them; it should create a quantum-literate community of programmers who will devise resources and solve problems beyond what any individual company could muster.

"For quantum computing to take traction and blossom, we must enable the world to use and to learn it," said Gambetta. "This period is for the world of scientists and industry to focus on getting quantum-ready."


Read more here:

The Era of Quantum Computing Is Here. Outlook: Cloudy | WIRED

Quantum computers – WIRED UK

Wikimedia Commons

In a world where we are relying increasingly on computing, to share our information and store our most precious data, the idea of living without computers might baffle most people.

But if we continue to follow the trend that has been in place since computers were introduced, by 2040 we will not have the capability to power all of the machines around the globe, according to a recent report by the Semiconductor Industry Association.

To prevent this, the industry is focused on finding ways to make computing more energy efficient, but classical computers are limited by the minimum amount of energy it takes them to perform one operation.

This energy limit is named after IBM Research Lab’s Rolf Landauer, who in 1961 found that in any computer, each single bit operation must use an absolute minimum amount of energy. Landauer’s formula calculated the lowest limit of energy required for a computer operation, and in March this year researchers demonstrated it could be possible to make a chip that operates with this lowest energy.
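
For reference, Landauer's limit works out as follows (standard textbook values; k_B is Boltzmann's constant and T the temperature, taken here as roughly room temperature):

$$ E_{\min} = k_B T \ln 2 \approx 1.38\times10^{-23}\,\mathrm{J/K} \times 300\,\mathrm{K} \times 0.693 \approx 2.9\times10^{-21}\,\mathrm{J} $$

per bit operation, which is many orders of magnitude below the energy today's transistors actually dissipate per switching event.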

It was called a “breakthrough for energy-efficient computing” and could cut the amount of energy used in computers by a factor of one million. However, it will take a long time before we see the technology used in our laptops; and even when it is, the energy will still be above the Landauer limit.

This is why, in the long term, people are turning to radically different ways of computing, such as quantum computing, to find ways to cut energy use.

What is quantum computing?

Quantum computing takes advantage of the strange ability of subatomic particles to exist in more than one state at any time. Due to the way the tiniest of particles behave, operations can be done much more quickly and use less energy than classical computers.

In classical computing, a bit is a single piece of information that can exist in two states 1 or 0. Quantum computing uses quantum bits, or ‘qubits’ instead. These are quantum systems with two states. However, unlike a usual bit, they can store much more information than just 1 or 0, because they can exist in any superposition of these values.

“Traditionally qubits are treated as separated physical objects with two possible distinguishable states, 0 and 1,” Alexey Fedorov, physicist at the Moscow Institute of Physics and Technology told WIRED.

“The difference between classical bits and qubits is that we can also prepare qubits in a quantum superposition of 0 and 1 and create nontrivial correlated states of a number of qubits, so-called ‘entangled states’.”

D-Wave

A qubit can be thought of as an imaginary sphere. Whereas a classical bit can be in two states – at either of the two poles of the sphere – a qubit can be any point on the sphere. This means a computer using these bits can store a huge amount more information using less energy than a classical computer.
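
That sphere picture (often called the Bloch sphere) can be made concrete with two angles. A small, purely illustrative NumPy sketch showing how a point on the sphere maps to a qubit state and its measurement probabilities:

```python
import numpy as np

def qubit_state(theta, phi):
    """Point on the Bloch sphere -> state
    |psi> = cos(theta/2)|0> + e^{i phi} sin(theta/2)|1>."""
    return np.array([np.cos(theta / 2),
                     np.exp(1j * phi) * np.sin(theta / 2)])

# North pole = |0>, south pole = |1>, equator = equal superposition.
for name, theta in [("north pole", 0.0), ("equator", np.pi / 2), ("south pole", np.pi)]:
    psi = qubit_state(theta, phi=0.0)
    p0, p1 = abs(psi[0]) ** 2, abs(psi[1]) ** 2
    print(f"{name:10s}: P(0) = {p0:.2f}, P(1) = {p1:.2f}")
```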

Last year, a team of Google and Nasa scientists found a D-Wave quantum computer was 100 million times faster than a conventional computer. But moving quantum computing to an industrial scale is difficult.

IBM recently announced its Q division is developing quantum computers that can be sold commercially within the coming years. Commercial quantum computer systems "with ~50 qubits" will be created "in the next few years," IBM claims. Meanwhile, researchers at Google, in a Nature comment piece, say companies could start to make returns on elements of quantum computer technology within the next five years.

Computations occur when qubits interact with each other, therefore for a computer to function it needs to have many qubits. The main reason why quantum computers are so hard to manufacture is that scientists still have not found a simple way to control complex systems of qubits.

Now, scientists from Moscow Institute of Physics and Technology and Russian Quantum Centre are looking into an alternative way of quantum computing. Not content with single qubits, the researchers decided to tackle the problem of quantum computing another way.

“In our approach, we observed that physical nature allows us to employ quantum objects with several distinguishable states for quantum computation,” Fedorov, one of the authors of the study, told WIRED.

The team created quantum systems with several different energy "levels", which they have named qudits. The "d" stands for the number of different energy levels the qudit can take. The term "level" comes from the fact that typically each logic state of a qubit corresponds to the state with a certain value of energy – and these values of possible energies are called levels.

“In some sense, we can say that one qudit, quantum object with d possible states, may consist of several ‘virtual’ qubits, and operating qudit corresponds to manipulation with the ‘virtual’ qubits including their interaction,” continued Fedorov.

“From the viewpoint of abstract quantum information theory everything remains the same but in concrete physical implementation many-level system represent potentially useful resource.”

Quantum computers are already in use, in the sense that logic gates have been made using two qubits, but getting quantum computers to work on an industrial scale is the problem.

“The progress in that field is rather rapid but no one can promise when we come to wide use of quantum computation,” Fedorov told WIRED.

Elsewhere, in a step towards quantum computing, researchers have guided electrons through semiconductors using incredibly short pulses of light.

These extremely short, configurable pulses of light could lead to computers that operate 100,000 times faster than they do today. Researchers, including engineers at the University of Michigan, can now control peaks within laser pulses of just a few femtoseconds (one quadrillionth of a second) long. The result is a step towards “lightwave electronics” which could eventually lead to a breakthrough in quantum computing.

A bizarre discovery recently revealed that cold helium atoms in lab conditions on Earth abide by the same law of entropy that governs the behaviour of black holes.

The law, first developed by Professor Stephen Hawking and Jacob Bekenstein in the 1970s, describes how the entropy, or the amount of disorder, increases in a black hole when matter falls into it. It now seems this behaviour appears at both the huge scales of outer space and at the tiny scale of atoms, specifically those that make up superfluid helium.

“It’s called an entanglement area law,” explained Adrian Del Maestro, physicist at the University of Vermont. “It points to a deeper understanding of reality and could be a significant step toward a long-sought quantum theory of gravity and new advances in quantum computing.”

More:

Quantum computers – WIRED UK

What is quantum computing? – Definition from WhatIs.com

Quantum computing is the area of study focused on developing computer technology based on the principles of quantum theory, which explains the nature and behavior of energy and matter on the quantum (atomic and subatomic) level. Development of a quantum computer, if practical, would mark a leap forward in computing capability far greater than that from the abacus to a modern day supercomputer, with performance gains in the billion-fold realm and beyond. The quantum computer, following the laws of quantum physics, would gain enormous processing power through the ability to be in multiple states, and to perform tasks using all possible permutations simultaneously. Current centers of research in quantum computing include MIT, IBM, Oxford University, and the Los Alamos National Laboratory.

The essential elements of quantum computing originated with Paul Benioff, working at Argonne National Labs, in 1981. He theorized a classical computer operating with some quantum mechanical principles. But it is generally accepted that David Deutsch of Oxford University provided the critical impetus for quantum computing research. In 1984, he was at a computation theory conference and began to wonder about the possibility of designing a computer that was based exclusively on quantum rules, then published his breakthrough paper a few months later. With this, the race began to exploit his ideas. However, before we delve into what he started, it is beneficial to have a look at the background of the quantum world.

Quantum theory’s development began in 1900 with a presentation by Max Planck to the German Physical Society, in which he introduced the idea that energy exists in individual units (which he called “quanta”), as does matter. Further developments by a number of scientists over the following thirty years led to the modern understanding of quantum theory.

Niels Bohr proposed the Copenhagen interpretation of quantum theory, which asserts that a particle is whatever it is measured to be (for example, a wave or a particle) but that it cannot be assumed to have specific properties, or even to exist, until it is measured. In short, Bohr was saying that objective reality does not exist. This translates to a principle called superposition that claims that while we do not know what the state of any object is, it is actually in all possible states simultaneously, as long as we don’t look to check.

To illustrate this theory, we can use the famous and somewhat cruel analogy of Schrodinger’s Cat. First, we have a living cat and place it in a thick lead box. At this stage, there is no question that the cat is alive. We then throw in a vial of cyanide and seal the box. We do not know if the cat is alive or if it has broken the cyanide capsule and died. Since we do not know, the cat is both dead and alive, according to quantum law – in a superposition of states. It is only when we break open the box and see what condition the cat is in that the superposition is lost, and the cat must be either alive or dead.

The second interpretation of quantum theory is the multiverse or many-worlds theory. It holds that as soon as a potential exists for any object to be in any state, the universe of that object transmutes into a series of parallel universes equal to the number of possible states in which the object can exist, with each universe containing a unique single possible state of that object. Furthermore, there is a mechanism for interaction between these universes that somehow permits all states to be accessible in some way and for all possible states to be affected in some manner. Stephen Hawking and the late Richard Feynman are among the scientists who have expressed a preference for the many-worlds theory.

Whichever argument one chooses, the principle that, in some way, one particle can exist in numerous states opens up profound implications for computing.

Classical computing relies, at its ultimate level, on principles expressed by Boolean algebra, operating with a (usually) 7-mode logic gate principle, though it is possible to exist with only three modes (which are AND, NOT, and COPY). Data must be processed in an exclusive binary state at any point in time – that is, either 0 (off / false) or 1 (on / true). These values are binary digits, or bits. The millions of transistors and capacitors at the heart of computers can only be in one state at any point. While the time that each transistor or capacitor needs to be in its 0 or 1 state before switching is now measurable in billionths of a second, there is still a limit as to how quickly these devices can be made to switch state. As we progress to smaller and faster circuits, we begin to reach the physical limits of materials and the threshold for classical laws of physics to apply. Beyond this, the quantum world takes over, which opens a potential as great as the challenges that are presented.

The quantum computer, by contrast, can work with a two-mode logic gate: XOR and a mode we'll call QO1 (the ability to change 0 into a superposition of 0 and 1, a logic gate which cannot exist in classical computing). In a quantum computer, a number of elemental particles such as electrons or photons can be used (in practice, success has also been achieved with ions), with either their charge or polarization acting as a representation of 0 and/or 1. Each of these particles is known as a quantum bit, or qubit; the nature and behavior of these particles form the basis of quantum computing. The two most relevant aspects of quantum physics are the principles of superposition and entanglement.

Think of a qubit as an electron in a magnetic field. The electron's spin may be either in alignment with the field, which is known as a spin-up state, or opposite to the field, which is known as a spin-down state. Changing the electron's spin from one state to another is achieved by using a pulse of energy, such as from a laser – let's say that we use 1 unit of laser energy. But what if we only use half a unit of laser energy and completely isolate the particle from all external influences? According to quantum law, the particle then enters a superposition of states, in which it behaves as if it were in both states simultaneously. Each qubit utilized could take a superposition of both 0 and 1. Thus, the number of computations that a quantum computer could undertake is 2^n, where n is the number of qubits used. A quantum computer comprised of 500 qubits would have a potential to do 2^500 calculations in a single step. This is an awesome number – 2^500 is far more than the number of atoms there are in the known universe (this is true parallel processing – classical computers today, even so-called parallel processors, still only truly do one thing at a time: there are just two or more of them doing it). But how will these particles interact with each other? They would do so via quantum entanglement.

Entanglement: Particles (such as photons, electrons, or qubits) that have interacted at some point retain a type of connection and can be entangled with each other in pairs, in a process known as correlation. Knowing the spin state of one entangled particle – up or down – allows one to know that the spin of its mate is in the opposite direction. Even more amazing is the knowledge that, due to the phenomenon of superposition, the measured particle has no single spin direction before being measured, but is simultaneously in both a spin-up and spin-down state. The spin state of the particle being measured is decided at the time of measurement and communicated to the correlated particle, which simultaneously assumes the opposite spin direction to that of the measured particle. This is a real phenomenon (Einstein called it "spooky action at a distance"), the mechanism of which cannot, as yet, be explained by any theory – it simply must be taken as given. Quantum entanglement allows qubits that are separated by incredible distances to interact with each other instantaneously (not limited to the speed of light). No matter how great the distance between the correlated particles, they will remain entangled as long as they are isolated.

Taken together, quantum superposition and entanglement create an enormously enhanced computing power. Where a 2-bit register in an ordinary computer can store only one of four binary configurations (00, 01, 10, or 11) at any given time, a 2-qubit register in a quantum computer can store all four numbers simultaneously, because each qubit represents two values. If more qubits are added, the increased capacity is expanded exponentially.
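
A minimal NumPy sketch of that 2-qubit register (a state-vector simulation only, not hardware): applying a Hadamard gate to each qubit of |00> puts equal amplitude on all four configurations, and a Hadamard followed by a CNOT produces an entangled Bell-type state.

```python
import numpy as np

H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)          # Hadamard gate
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])               # controlled-NOT gate

state = np.array([1, 0, 0, 0], dtype=float)   # 2-qubit register in |00>

# Hadamard on both qubits: equal superposition of 00, 01, 10 and 11.
state = np.kron(H, H) @ state
print("after H x H:", np.round(state, 3))      # [0.5 0.5 0.5 0.5]

# Hadamard on the first qubit only, then CNOT: a Bell (entangled) state.
bell = CNOT @ np.kron(H, np.eye(2)) @ np.array([1, 0, 0, 0], dtype=float)
print("Bell state :", np.round(bell, 3))       # [0.707 0 0 0.707]
```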

Perhaps even more intriguing than the sheer power of quantum computing is the ability that it offers to write programs in a completely new way. For example, a quantum computer could incorporate a programming sequence that would be along the lines of “take all the superpositions of all the prior computations” – something which is meaningless with a classical computer – which would permit extremely fast ways of solving certain mathematical problems, such as factorization of large numbers, one example of which we discuss below.

There have been two notable successes thus far with quantum programming. The first occurred in 1994, when Peter Shor (now at AT&T Labs) developed a quantum algorithm that could efficiently factorize large numbers. It centers on a system that uses number theory to estimate the periodicity of a large number sequence. The other major breakthrough came from Lov Grover of Bell Labs in 1996, with a very fast algorithm that is proven to be the fastest possible for searching through unstructured databases. The algorithm is so efficient that it requires only, on average, roughly the square root of N searches (where N is the total number of elements) to find the desired result, as opposed to a search in classical computing, which on average needs N/2 searches.
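
The gap between N/2 and the square root of N grows quickly with database size. A back-of-the-envelope sketch (using the common (π/4)·√N estimate for the number of Grover iterations; exact constants vary with the implementation):

```python
import math

for N in (1_000, 1_000_000, 1_000_000_000):
    classical = N / 2                      # average lookups for unstructured search
    grover = (math.pi / 4) * math.sqrt(N)  # approximate Grover iterations
    print(f"N = {N:>13,}: classical ≈ {classical:>13,.0f}, "
          f"Grover ≈ {grover:>9,.0f}, speedup ≈ {classical / grover:,.0f}x")
```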

The above sounds promising, but there are tremendous obstacles still to be overcome. Some of the problems with quantum computing are as follows:

Even though there are many problems to overcome, the breakthroughs of the last 15 years, and especially of the last three, have made some form of practical quantum computing appear feasible, though there is much debate as to whether it is less than a decade away or a hundred years into the future. However, the potential that this technology offers is attracting tremendous interest from both government and the private sector. Military applications include the ability to break encryption keys via brute-force searches, while civilian applications range from DNA modeling to complex materials science analysis. It is this potential that is rapidly breaking down the barriers to the technology, but whether all the barriers can be broken, and when, is very much an open question.

Go here to read the rest:

What is quantum computing? – Definition from WhatIs.com

Quantum computing: A simple introduction – Explain that Stuff

by Chris Woodford. Last updated: November 1, 2017.

How can you get more and more out of less and less? The smaller computers get, the more powerful they seem to become: there's more number-crunching ability in a 21st-century cellphone than you'd have found in a room-sized, military computer 50 years ago. Yet, despite such amazing advances, there are still plenty of complex problems that are beyond the reach of even the world's most powerful computers, and there's no guarantee we'll ever be able to tackle them. One problem is that the basic switching and memory units of computers, known as transistors, are now approaching the point where they'll soon be as small as individual atoms. If we want computers that are smaller and more powerful than today's, we'll soon need to do our computing in a radically different way. Entering the realm of atoms opens up powerful new possibilities in the shape of quantum computing, with processors that could work millions of times faster than the ones we use today. Sounds amazing, but the trouble is that quantum computing is hugely more complex than traditional computing and operates in the Alice in Wonderland world of quantum physics, where the "classical," sensible, everyday laws of physics no longer apply. What is quantum computing and how does it work? Let's take a closer look!

Photo: Quantum computing means storing and processing information using individual atoms, ions, electrons, or photons. On the plus side, this opens up the possibility of faster computers, but the drawback is the greater complexity of designing computers that can operate in the weird world of quantum physics. Photo courtesy of US Department of Energy.

You probably think of a computer as a neat little gadget that sits on your lap and lets you send emails, shop online, chat to your friends, or play games, but it's much more, and much less, than that. It's more, because it's a completely general-purpose machine: you can make it do virtually anything you like. It's less, because inside it's little more than an extremely basic calculator, following a prearranged set of instructions called a program. Like the Wizard of Oz, the amazing things you see in front of you conceal some pretty mundane stuff under the covers.

Photo: This is what one transistor from a typical radio circuit board looks like. In computers, the transistors are much smaller than this and millions of them are packaged together onto microchips.

Conventional computers have two tricks that they do really well: they can store numbers in memory and they can process stored numbers with simple mathematical operations (like add and subtract). They can do more complex things by stringing together the simple operations into a series called an algorithm (multiplying can be done as a series of additions, for example). Both of a computer's key tricks, storage and processing, are accomplished using switches called transistors, which are like microscopic versions of the switches you have on your wall for turning on and off the lights. A transistor can either be on or off, just as a light can either be lit or unlit. If it's on, we can use a transistor to store a number one (1); if it's off, it stores a number zero (0). Long strings of ones and zeros can be used to store any number, letter, or symbol using a code based on binary (so computers store an upper-case letter A as 01000001 and a lower-case one as 01100001). Each of the zeros or ones is called a binary digit (or bit) and, with a string of eight bits, you can store 256 different characters (such as A-Z, a-z, 0-9, and most common symbols). Computers calculate by using circuits called logic gates, which are made from a number of transistors connected together. Logic gates compare patterns of bits, stored in temporary memories called registers, and then turn them into new patterns of bits, and that's the computer equivalent of what our human brains would call addition, subtraction, or multiplication. In physical terms, the algorithm that performs a particular calculation takes the form of an electronic circuit made from a number of logic gates, with the output from one gate feeding in as the input to the next.
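
A short Python sketch of those two tricks, storage as bit patterns and processing with logic gates (the half-adder below is a toy built from AND and XOR, purely for illustration):

```python
# Storage: characters become fixed-width binary patterns.
for ch in "Aa":
    print(ch, "->", format(ord(ch), "08b"))    # A -> 01000001, a -> 01100001

# Processing: a half-adder combines two logic gates to add two bits.
def half_adder(a: int, b: int):
    total = a ^ b      # XOR gate gives the sum bit
    carry = a & b      # AND gate gives the carry bit
    return total, carry

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} = carry {c}, sum {s}")
```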

The trouble with conventional computers is that they depend on conventional transistors. This might not sound like a problem if you go by the amazing progress made in electronics over the last few decades. When the transistor was invented, back in 1947, the switch it replaced (which was called the vacuum tube) was about as big as one of your thumbs. Now, a state-of-the-art microprocessor (single-chip computer) packs hundreds of millions (and up to two billion) transistors onto a chip of silicon the size of your fingernail! Chips like these, which are called integrated circuits, are an incredible feat of miniaturization. Back in the 1960s, Intel co-founder Gordon Moore realized that the power of computers doubles roughly every 18 months, and it's been doing so ever since. This apparently unshakeable trend is known as Moore's Law.

Photo: This memory chip from a typical USB stick contains an integrated circuit that can store 512 megabytes of data. That's roughly 500 million characters (536,870,912 to be exact), each of which needs eight binary digits, so we're talking about 4 billion (4,000 million) transistors in all (4,294,967,296 if you're being picky) packed into an area the size of a postage stamp!

It sounds amazing, and it is, but it misses the point. The more information you need to store, the more binary ones and zeros, and transistors, you need to do it. Since most conventional computers can only do one thing at a time, the more complex the problem you want them to solve, the more steps they'll need to take and the longer they'll need to do it. Some computing problems are so complex that they need more computing power and time than any modern machine could reasonably supply; computer scientists call those intractable problems.

As Moore's Law advances, so the number of intractable problems diminishes: computers get more powerful and we can do more with them. The trouble is, transistors are just about as small as we can make them: we're getting to the point where the laws of physics seem likely to put a stop to Moore's Law. Unfortunately, there are still hugely difficult computing problems we can't tackle because even the most powerful computers find them intractable. That's one of the reasons why people are now getting interested in quantum computing.

Quantum theory is the branch of physics that deals with the world of atoms and the smaller (subatomic) particles inside them. You might think atoms behave the same way as everything else in the world, in their own tiny little way, but that's not true: on the atomic scale, the rules change and the "classical" laws of physics we take for granted in our everyday world no longer automatically apply. As Richard P. Feynman, one of the greatest physicists of the 20th century, once put it: "Things on a very small scale behave like nothing you have any direct experience about… or like anything that you have ever seen." (Six Easy Pieces, p116.)

If you've studied light, you may already know a bit about quantum theory. You might know that a beam of light sometimes behaves as though it's made up of particles (like a steady stream of cannonballs), and sometimes as though it's waves of energy rippling through space (a bit like waves on the sea). That's called wave-particle duality, and it's one of the ideas that comes to us from quantum theory. It's hard to grasp that something can be two things at once, a particle and a wave, because it's totally alien to our everyday experience: a car is not simultaneously a bicycle and a bus. In quantum theory, however, that's just the kind of crazy thing that can happen. The most striking example of this is the baffling riddle known as Schrödinger's cat. Briefly, in the weird world of quantum theory, we can imagine a situation where something like a cat could be alive and dead at the same time!

What does all this have to do with computers? Suppose we keep on pushing Moore's Law, keep on making transistors smaller until they get to the point where they obey not the ordinary laws of physics (like old-style transistors) but the more bizarre laws of quantum mechanics. The question is whether computers designed this way can do things our conventional computers can't. If we can predict mathematically that they might be able to, can we actually make them work like that in practice?

People have been asking those questions for several decades. Among the first were IBM research physicists Rolf Landauer and Charles H. Bennett. Landauer opened the door for quantum computing in the 1960s when he proposed that information is a physical entity that could be manipulated according to the laws of physics. One important consequence of this is that computers waste energy manipulating the bits inside them (which is partly why computers use so much energy and get so hot, even though they appear to be doing not very much at all). In the 1970s, building on Landauer's work, Bennett showed how a computer could circumvent this problem by working in a "reversible" way, implying that a quantum computer could carry out massively complex computations without using massive amounts of energy. In 1981, physicist Paul Benioff from Argonne National Laboratory tried to envisage a basic machine that would work in a similar way to an ordinary computer but according to the principles of quantum physics. The following year, Richard Feynman sketched out roughly how a machine using quantum principles could carry out basic computations. A few years later, Oxford University's David Deutsch (one of the leading lights in quantum computing) outlined the theoretical basis of a quantum computer in more detail. How did these great scientists imagine that quantum computers might work?

The key features of an ordinary computer, bits, registers, logic gates, algorithms, and so on, have analogous features in a quantum computer. Instead of bits, a quantum computer has quantum bits or qubits, which work in a particularly intriguing way. Where a bit can store either a zero or a 1, a qubit can store a zero, a one, both zero and one, or an infinite number of values in between, and be in multiple states (store multiple values) at the same time! If that sounds confusing, think back to light being a particle and a wave at the same time, Schrödinger's cat being alive and dead, or a car being a bicycle and a bus. A gentler way to think of the numbers qubits store is through the physics concept of superposition (where two waves add to make a third one that contains both of the originals). If you blow on something like a flute, the pipe fills up with a standing wave: a wave made up of a fundamental frequency (the basic note you're playing) and lots of overtones or harmonics (higher-frequency multiples of the fundamental). The wave inside the pipe contains all these waves simultaneously: they're added together to make a combined wave that includes them all. Qubits use superposition to represent multiple states (multiple numeric values) simultaneously in a similar way.
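As a rough numerical sketch of that idea (using NumPy, with illustrative names rather than any real quantum library), n qubits are described by 2^n complex amplitudes, one for each classical bit pattern, so even a handful of qubits "holds" many values at once:

```python
import numpy as np

def make_superposition(n_qubits):
    """Return an equal superposition over all 2**n basis states (illustrative)."""
    dim = 2 ** n_qubits
    return np.ones(dim, dtype=complex) / np.sqrt(dim)  # normalized amplitudes

state = make_superposition(3)
print(len(state))                  # 8 amplitudes for just 3 qubits
print(np.sum(np.abs(state) ** 2))  # probabilities always sum to 1
```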

Just as a quantum computer can store multiple numbers at once, so it can process them simultaneously. Instead of working in serial (doing a series of things one at a time in a sequence), it can work in parallel (doing multiple things at the same time). Only when you try to find out what state it's actually in at any given moment (by measuring it, in other words) does it "collapse" into one of its possible states, and that gives you the answer to your problem. Estimates suggest a quantum computer's ability to work in parallel would make it millions of times faster than any conventional computer… if only we could build it! So how would we do that?
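The measurement step can be sketched the same way, under the same caveat that this is a toy simulation rather than real hardware: one classical outcome is sampled with probability given by the squared magnitude of its amplitude, which is the "collapse" described above.

```python
import numpy as np

def measure(state):
    """Collapse a state vector to a single classical outcome (toy simulation)."""
    probabilities = np.abs(state) ** 2
    rng = np.random.default_rng()
    return rng.choice(len(state), p=probabilities)  # index 0..2**n - 1, one bit pattern

state = np.ones(8, dtype=complex) / np.sqrt(8)  # equal superposition of 3 qubits
print(measure(state))                            # any of 0..7, each with probability 1/8
```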

In reality, qubits would have to be stored by atoms, ions (atoms with too many or too few electrons) or even smaller things such as electrons and photons (energy packets), so a quantum computer would be almost like a table-top version of the kind of particle physics experiments they do at Fermilab or CERN! Now you wouldn't be racing particles round giant loops and smashing them together, but you would need mechanisms for containing atoms, ions, or subatomic particles, for putting them into certain states (so you can store information), knocking them into other states (so you can make them process information), and figuring out what their states are after particular operations have been performed.

Photo: A single atom can be trapped in an optical cavity (the space between mirrors) and controlled by precise pulses from laser beams.

In practice, there are lots of possible ways of containing atoms and changing their states using laser beams, electromagnetic fields, radio waves, and an assortment of other techniques. One method is to make qubits using quantum dots, which are nanoscopically tiny particles of semiconductors inside which individual charge carriers, electrons and holes (missing electrons), can be controlled. Another method makes qubits from what are called ion traps: you add or take away electrons from an atom to make an ion, hold it steady in a kind of laser spotlight (so it's locked in place like a nanoscopic rabbit dancing in a very bright headlight), and then flip it into different states with laser pulses. In another technique, the qubits are photons inside optical cavities (spaces between extremely tiny mirrors). Don't worry if you don't understand; not many people do! Since the entire field of quantum computing is still largely abstract and theoretical, the only thing we really need to know is that qubits are stored by atoms or other quantum-scale particles that can exist in different states and be switched between them.

Although people often assume that quantum computers must automatically be better than conventional ones, that's by no means certain. So far, just about the only thing we know for certain that a quantum computer could do better than a normal one is factorisation: finding two unknown prime numbers that, when multiplied together, give a third, known number. In 1994, while working at Bell Laboratories, mathematician Peter Shor demonstrated an algorithm that a quantum computer could follow to find the "prime factors" of a large number, which would speed up the problem enormously. Shor's algorithm really excited interest in quantum computing because virtually every modern computer (and every secure, online shopping and banking website) uses public-key encryption technology based on the virtual impossibility of finding prime factors quickly (it is, in other words, essentially an "intractable" computer problem). If quantum computers could indeed factor large numbers quickly, today's online security could be rendered obsolete at a stroke.
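The quantum speed-up in Shor's algorithm comes from finding the "order" of a number modulo N exponentially faster than any known classical method; the classical bookkeeping around it is simple. The sketch below (brute force and purely illustrative, using the textbook toy values N = 15 and a = 7) shows how the prime factors drop out once that order is known:

```python
from math import gcd

def order(a, N):
    """Smallest r > 0 with a**r % N == 1, found by brute force (the quantum
    part of Shor's algorithm finds r exponentially faster than this)."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

N, a = 15, 7
r = order(a, N)                    # r = 4 for this choice of a and N
p = gcd(a ** (r // 2) - 1, N)      # 3
q = gcd(a ** (r // 2) + 1, N)      # 5
print(r, p, q)
```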

Does that mean quantum computers are better than conventional ones? Not exactly. Apart from Shor's algorithm, and a search method called Grover's algorithm, hardly any other algorithms have been discovered that would be better performed by quantum methods. Given enough time and computing power, conventional computers should still be able to solve any problem that quantum computers could solve, eventually. In other words, it remains to be proven that quantum computers are generally superior to conventional ones, especially given the difficulties of actually building them. Who knows how conventional computers might advance in the next 50 years, potentially making the idea of quantum computers irrelevant, and even absurd.

Photo: Quantum dots are probably best known as colorful nanoscale crystals, but they can also be used as qubits in quantum computers. Photo courtesy of Argonne National Laboratory.

Three decades after they were first proposed, quantum computers remain largely theoretical. Even so, there's been some encouraging progress toward realizing a quantum machine. There were two impressive breakthroughs in 2000. First, Isaac Chuang (now an MIT professor, but then working at IBM's Almaden Research Center) used five fluorine atoms to make a crude, five-qubit quantum computer. The same year, researchers at Los Alamos National Laboratory figured out how to make a seven-qubit machine using a drop of liquid. Five years later, researchers at the University of Innsbruck added an extra qubit and produced the first quantum computer that could manipulate a qubyte (eight qubits).

These were tentative but important first steps. Over the next few years, researchers announced more ambitious experiments, adding progressively greater numbers of qubits. By 2011, a pioneering Canadian company called D-Wave Systems announced in Nature that it had produced a 128-qubit machine. Three years later, Google announced that it was hiring a team of academics (including University of California at Santa Barbara physicist John Martinis) to develop its own quantum computers based on D-Wave's approach. In March 2015, the Google team announced they were "a step closer to quantum computation," having developed a new way for qubits to detect and protect against errors. In 2016, MIT's Isaac Chuang and scientists from the University of Innsbruck unveiled a five-qubit, ion-trap quantum computer that could calculate the factors of 15; one day, a scaled-up version of this machine might evolve into the long-promised, fully fledged encryption buster! There's no doubt that these are hugely important advances. Even so, it's very early days for the whole field, and most researchers agree that we're unlikely to see practical quantum computers appearing for many years, perhaps even decades.

Read the original post:

Quantum computing: A simple introduction – Explain that Stuff

The Era of Quantum Computing Is Here. Outlook: Cloudy …

A lot of research on the fundamentals of quantum computing has been devoted to error correction. Part of the difficulty stems from another of the key properties of quantum systems: Superpositions can only be sustained as long as you don't measure the qubit's value. If you make a measurement, the superposition collapses to a definite value: 1 or 0. So how can you find out if a qubit has an error if you don't know what state it is in?

One ingenious scheme involves looking indirectly, by coupling the qubit to another "ancilla" qubit that doesn't take part in the calculation but that can be probed without collapsing the state of the main qubit itself. It's complicated to implement, though. Such solutions mean that, to construct a genuine "logical qubit" on which computation with error correction can be performed, you need many physical qubits.
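A loose classical analogy, not the quantum circuit itself, may help: in a three-bit repetition code, parity checks between pairs of bits reveal which bit flipped without ever asking what the encoded value actually is, which is the same spirit as probing an ancilla rather than the data qubit directly.

```python
# Classical analogy only: a 3-bit repetition code with parity-check "syndromes".

def syndrome(bits):
    """Parity checks (b0 xor b1, b1 xor b2) locate a single flipped bit."""
    s1 = bits[0] ^ bits[1]
    s2 = bits[1] ^ bits[2]
    return {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}[(s1, s2)]

corrupted = [1, 0, 1]          # logical "1" encoded as [1, 1, 1], middle bit flipped by noise
flipped = syndrome(corrupted)  # the checks identify bit 1 without reading the logical value
if flipped is not None:
    corrupted[flipped] ^= 1    # correct the error
print(flipped, corrupted)      # 1 [1, 1, 1]
```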

How many? Quantum theorist Alán Aspuru-Guzik of Harvard University estimates that around 10,000 of today's physical qubits would be needed to make a single logical qubit, a totally impractical number. "If the qubits get much better," he said, "this number could come down to a few thousand or even hundreds." Eisert is less pessimistic, saying that on the order of 800 physical qubits might already be enough, but even so he agrees that the overhead is heavy, and for the moment we need to find ways of coping with error-prone qubits.

An alternative to correcting errors is avoiding them or canceling out their influence: so-called error mitigation. Researchers at IBM, for example, are developing schemes for figuring out mathematically how much error is likely to have been incurred in a computation and then extrapolating the output of a computation to the zero noise limit.
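A minimal sketch of that extrapolation idea, with made-up numbers purely for illustration: measure the same quantity at several artificially scaled noise levels, fit a simple curve, and read off the value at zero noise.

```python
import numpy as np

noise_levels = np.array([1.0, 1.5, 2.0, 3.0])   # relative noise scale factors (illustrative)
measured = np.array([0.82, 0.74, 0.66, 0.51])    # noisy expectation values (illustrative)

coeffs = np.polyfit(noise_levels, measured, deg=1)  # simple linear fit: value vs noise
zero_noise_estimate = np.polyval(coeffs, 0.0)       # extrapolate to the zero-noise limit
print(round(zero_noise_estimate, 3))
```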

Some researchers think that the problem of error correction will prove intractable and will prevent quantum computers from achieving the grand goals predicted for them. "The task of creating quantum error-correcting codes is harder than the task of demonstrating quantum supremacy," said mathematician Gil Kalai of the Hebrew University of Jerusalem in Israel. And he adds that devices without error correction are "computationally very primitive, and primitive-based supremacy is not possible." In other words, you'll never do better than classical computers while you've still got errors.

Others believe the problem will be cracked eventually. According to Jay Gambetta, a quantum information scientist at IBM's Thomas J. Watson Research Center, "Our recent experiments at IBM have demonstrated the basic elements of quantum error correction on small devices, paving the way towards larger-scale devices where qubits can reliably store quantum information for a long period of time in the presence of noise." Even so, he admits that a universal fault-tolerant quantum computer, which has to use logical qubits, is still a long way off. Such developments make Childs cautiously optimistic. "I'm sure we'll see improved experimental demonstrations of [error correction], but I think it will be quite a while before we see it used for a real computation," he said.

For the time being, quantum computers are going to be error-prone, and the question is how to live with that. At IBM, researchers are talking about approximate quantum computing as the way the field will look in the near term: finding ways of accommodating the noise.

This calls for algorithms that tolerate errors, getting the correct result despite them. It's a bit like working out the outcome of an election regardless of a few wrongly counted ballot papers. "A sufficiently large and high-fidelity quantum computation should have some advantage [over a classical computation] even if it is not fully fault-tolerant," said Gambetta.

One of the most immediate error-tolerant applications seems likely to be of more value to scientists than to the world at large: to simulate stuff at the atomic level. (This, in fact, was the motivation that led Feynman to propose quantum computing in the first place.) The equations of quantum mechanics prescribe a way to calculate the properties, such as stability and chemical reactivity, of a molecule such as a drug. But they can't be solved classically without making lots of simplifications.

In contrast, the quantum behavior of electrons and atoms, said Childs, "is relatively close to the native behavior of a quantum computer." So one could then construct an exact computer model of such a molecule. "Many in the community, including me, believe that quantum chemistry and materials science will be one of the first useful applications of such devices," said Aspuru-Guzik, who has been at the forefront of efforts to push quantum computing in this direction.

Quantum simulations are proving their worth even on the very small quantum computers available so far. A team of researchers including Aspuru-Guzik has developed an algorithm that they call the variational quantum eigensolver (VQE), which can efficiently find the lowest-energy states of molecules even with noisy qubits. So far it can only handle very small molecules with few electrons, which classical computers can already simulate accurately. But the capabilities are getting better, as Gambetta and coworkers showed last September when they used a 6-qubit device at IBM to calculate the electronic structures of molecules, including lithium hydride and beryllium hydride. The work was "a significant leap forward for the quantum regime," according to physical chemist Markus Reiher of the Swiss Federal Institute of Technology in Zurich, Switzerland. "The use of the VQE for the simulation of small molecules is a great example of the possibility of near-term heuristic algorithms," said Gambetta.
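Stripped to its bare bones, VQE is a feedback loop: a classical optimizer proposes circuit parameters, the quantum device measures the energy, and the loop hunts for the minimum. The sketch below substitutes a toy cost function for the real quantum measurement, so it only illustrates the shape of that loop, not an actual molecular calculation:

```python
import numpy as np
from scipy.optimize import minimize

def measured_energy(params):
    """Stand-in for running the parameterized circuit and measuring the energy.
    This toy landscape has its minimum (0.2) at theta = phi = 0."""
    theta, phi = params
    return 1.0 - 0.8 * np.cos(theta) * np.cos(phi)

# Gradient-free classical optimizer adjusts the circuit parameters.
result = minimize(measured_energy, x0=[0.5, -0.3], method="COBYLA")
print(result.x, result.fun)   # parameters near (0, 0), energy near 0.2
```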

But even for this application, Aspuru-Guzik confesses that logical qubits with error correction will probably be needed before quantum computers truly begin to surpass classical devices. "I would be really excited when error-corrected quantum computing begins to become a reality," he said.

"If we had more than 200 logical qubits, we could do things in quantum chemistry beyond standard approaches," Reiher adds. "And if we had about 5,000 such qubits, then the quantum computer would be transformative in this field."

Despite the challenges of reaching those goals, the fast growth of quantum computers from 5 to 50 qubits in barely more than a year has raised hopes. But we shouldn't get too fixated on these numbers, because they tell only part of the story. What matters is not just, or even mainly, how many qubits you have, but how good they are, and how efficient your algorithms are.

Any quantum computation has to be completed before decoherence kicks in and scrambles the qubits. Typically, the groups of qubits assembled so far have decoherence times of a few microseconds. The number of logic operations you can carry out during that fleeting moment depends on how quickly the quantum gates can be switched; if this time is too slow, it really doesn't matter how many qubits you have at your disposal. The number of gate operations needed for a calculation is called its depth: low-depth (shallow) algorithms are more feasible than high-depth ones, but the question is whether they can be used to perform useful calculations.
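A back-of-the-envelope version of that depth budget, using typical orders of magnitude rather than figures from any specific device:

```python
# Rough depth budget: how many sequential gates fit inside the decoherence window.
decoherence_time_s = 50e-6   # a few tens of microseconds (typical order of magnitude)
gate_time_s = 50e-9          # tens of nanoseconds per gate (typical order of magnitude)

max_depth = int(decoherence_time_s / gate_time_s)
print(max_depth)             # ~1,000 gate operations before decoherence scrambles the qubits
```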

What's more, not all qubits are equally noisy. In theory it should be possible to make very low-noise qubits from so-called topological electronic states of certain materials, in which the shape of the electron states used for encoding binary information confers a kind of protection against random noise. Researchers at Microsoft, most prominently, are seeking such topological states in exotic quantum materials, but there's no guarantee that they'll be found or will be controllable.

Researchers at IBM have suggested that the power of a quantum computation on a given device be expressed as a number called the "quantum volume," which bundles up all the relevant factors: number and connectivity of qubits, depth of algorithm, and other measures of the gate quality, such as noisiness. It's really this quantum volume that characterizes the power of a quantum computation, and Gambetta said that the best way forward right now is to develop quantum-computational hardware that increases the available quantum volume.
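IBM's actual quantum volume is defined by a specific benchmarking protocol (roughly, the largest "square" random circuit, equal width and depth, that the device runs successfully). The helper below only mimics that width-and-depth idea and is not the official formula; it is a sketch of why adding qubits alone does not raise the score if the achievable depth lags behind.

```python
from math import log2

def rough_quantum_volume(n_qubits, achievable_depth):
    """Largest n such that an n-qubit, n-layer circuit fits the depth budget (sketch only)."""
    n = min(n_qubits, achievable_depth)
    return 2 ** n

print(rough_quantum_volume(n_qubits=20, achievable_depth=6))   # 64: depth, not qubit count, limits it
print(log2(rough_quantum_volume(20, 6)))                        # 6.0
```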

This is one reason why the much-vaunted notion of quantum supremacy is more slippery than it seems. The image of a 50-qubit (or so) quantum computer outperforming a state-of-the-art supercomputer sounds alluring, but it leaves a lot of questions hanging. Outperforming for which problem? How do you know the quantum computer has got the right answer if you can't check it with a tried-and-tested classical device? And how can you be sure that the classical machine wouldn't do better if you could find the right algorithm?

So quantum supremacy is a concept to handle with care. Some researchers prefer now to talk about "quantum advantage," which refers to the speedup that quantum devices offer without making definitive claims about what is best. An aversion to the word "supremacy" has also arisen because of the racial and political implications.

Whatever you choose to call it, a demonstration that quantum computers can do things beyond current classical means would be psychologically significant for the field. "Demonstrating an unambiguous quantum advantage will be an important milestone," said Eisert; it would prove that quantum computers really can extend what is technologically possible.

That might still be more of a symbolic gesture than a transformation in useful computing resources. But such things may matter, because if quantum computing is going to succeed, it won't be simply by the likes of IBM and Google suddenly offering their classy new machines for sale. Rather, it'll happen through an interactive and perhaps messy collaboration between developers and users, and the skill set will evolve in the latter only if they have sufficient faith that the effort is worth it. This is why both IBM and Google are keen to make their devices available as soon as they're ready. As well as a 16-qubit IBM Q experience offered to anyone who registers online, IBM now has a 20-qubit version for corporate clients, including JP Morgan Chase, Daimler, Honda, Samsung and the University of Oxford. Not only will that help clients discover what's in it for them; it should create a quantum-literate community of programmers who will devise resources and solve problems beyond what any individual company could muster.

"For quantum computing to take traction and blossom, we must enable the world to use and to learn it," said Gambetta. "This period is for the world of scientists and industry to focus on getting quantum-ready."

See original here:

The Era of Quantum Computing Is Here. Outlook: Cloudy …

Quantum Computing | Centre for Quantum Computation and …

Anisotropic invariance and the distribution of quantum correlations Physical Review Letters 118, 010401 (2017)

Discrete fluctuations in memory erasure without energy cost Physical Review Letters 118, 060602 (2017)

Experimental test of photonic entanglement in accelerated reference frames Nature Communications 8, 15304 (2017)

Tuning quantum measurements to control chaos Scientific Reports 7, 44684 (2017)

A single-atom quantum memory in silicon Quantum Science and Technology 2, 15009 (2017)

Ultrafine entanglement witnessing Physical Review Letters 118, 110502 (2017)

Atomically engineered electron spin lifetimes of 30 seconds in silicon Science Advances 3, e1602811 (2017)

Environmentally mediated coherent control of a spin qubit in diamond Physical Review Letters 118, 167204 (2017)

Quantum imaging of current flow in graphene Science Advances 3, e1602429 (2017)

Achieving quantum supremacy with sparse and noisy commuting quantum computations Quantum 1, 8 (2017)

Measurement-based linear optics Physical Review Letters 118, 110503 (2017)

Experimental demonstration of nonbilocal quantum correlations Science Advances 3, e1602743 (2017)

Read more:

Quantum Computing | Centre for Quantum Computation and …

Silicon Quantum Computing launched to commercialise UNSW … – ZDNet

A new company dubbed Silicon Quantum Computing (SQC) has been launched to take advantage of and commercialise the work done by the University of New South Wales (UNSW) in the quantum space.

SQC will work out of new laboratories within the Centre for Quantum Computation and Communication Technology (CQC2T) at UNSW, and is slated to hire 40 staff members — made up in part by 25 post-doctoral researchers and 12 PhD students.

The board for SQC will consist of professor Michelle Simmons, who has been the driving force behind CQC2T; Telstra chief scientist Hugh Bradlow; Commonwealth Bank of Australia (CBA) CIO David Whiteing; and Secretary of the federal Department of Industry, Innovation and Science Glenys Beauchamp, with corporate lawyer Stephen Menzies to serve as its interim chair.

Announced on Wednesday as a new shareholder, but not taking a board seat, was the NSW government, which funded the company to the tune of AU$8.7 million from its Quantum Computing Fund.

The state government funding follows CBA investing AU$14 million, Telstra injecting AU$10 million, the federal government allocating AU$25 million over four years, and UNSW putting $25 million towards CQC2T.

SQC is targeting having a 10-qubit machine commercialised by 2022.

Menzies told ZDNet that the creation of the company would shorten the time to market by three years, and allow for a patent portfolio to be built. He said the company is seeking three more investors to fund it at similar levels to Telstra and CBA, and is currently on the hunt for a CEO.

“We will fund hardware, but from that we will develop a patent pool which we hope will be without peer in the world,” Menzies said during the launch.

“In the first five years, we are very focused, the business plan is focused on the patents associated with an engineered 10-qubit device. But beyond that, we see that we have a stage on which we develop across Australia and across Australian institutions, a broad quantum industry.”

Minister for Industry, Innovation and Science Arthur Sinodinos said quantum computing was important to the country’s future.

“Whatever sector of innovation, we want to be really good in, we need to be world beaters,” he said on Wednesday.

“We want to be able to create a competitive advantage, command a premium, and you do that by doing something new, something that others find it hard to replicate, or it takes them time to replicate and by the time they have replicated it, you’ve moved on to something else.”

Previously, Simmons said she believes the work completed by CQC2T to develop silicon-based qubits will win out in the race to a 30-qubit system.

“We do believe that silicon is the one that has longevity; it’s a manufacturable material, and it has some of the highest-quality qubits that are out there,” Simmons said in June.

“That’s why it’s very exciting for Australia. We actually believe this can go all the way, and we believe we can build it in Australia.”

Telstra chief scientist Bradlow reiterated on Wednesday that Telstra sees itself offering quantum computing as a service.

“I can assure you they are not going to walk in on day one and know how to use these things,” he said previously.

“We want to be able to offer it as-a-service to them … they will need a lot of hand holding, and they are not going to run the equipment themselves, it’s complicated.”

For its part, CBA is preparing for a quantum future by using a quantum computing simulator from QxBranch.

“The difference between the emulator of a quantum computer and the real hardware is that we run the simulator on classical computers, so we don’t get the benefit of the speed up that you get from quantum, but we can simulate its behaviour and some of the broad characteristics of what the eventual hardware will do,” QxBranch CEO Michael Brett told ZDNet in April.

“What we provide is the ability for people to explore and validate the applications of quantum computing so that as soon as the hardware is ready, they’ll be able to apply those applications and get the benefit immediately of the unique advantages of quantum computing.”

Here is the original post:

Silicon Quantum Computing launched to commercialise UNSW … – ZDNet

Hype and cash are muddying public understanding of quantum … – The Conversation AU

An ion-trap used for quantum computing research in the Quantum Control Laboratory at the University of Sydney.

It's no surprise that quantum computing has become a media obsession. A functional and useful quantum computer would represent one of the century's most profound technical achievements.

For researchers like me, the excitement is welcome, but some claims appearing in popular outlets can be baffling.

A recent infusion of cash and attention from the tech giants has woken the interest of analysts, who are now eager to proclaim a breakthrough moment in the development of this extraordinary technology.

Quantum computing is described as "just around the corner", simply awaiting the engineering prowess and entrepreneurial spirit of the tech sector to realise its full potential.

What's the truth? Are we really just a few years away from having quantum computers that can break all online security systems? Now that the technology giants are engaged, do we sit back and wait for them to deliver? Is it now all just engineering?

Quantum computers are machines that use the rules of quantum physics in other words, the physics of very small things to encode and process information in new ways.

They exploit the unusual physics we find on these tiny scales, physics that defies our daily experience, in order to solve problems that are exceptionally challenging for classical computers. Don't just think of quantum computers as faster versions of today's computers; think of them as computers that function in a totally new way. The two are as different as an abacus and a PC.

They can (in principle) solve hard, high-impact questions in fields such as codebreaking, search, chemistry and physics.

Chief among these is factoring: finding the two prime numbers, divisible only by one and themselves, which when multiplied together equal a target number. For instance, the prime factors of 15 are 3 and 5.

As simple as it looks, when the number to be factored becomes large, say 1,000 digits long, the problem is effectively impossible for a classical computer. The fact that this problem is so hard for any conventional computer is how we secure most internet communications, such as through public-key encryption.
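A small illustration of why that is: the simplest classical attack, trial division, has to test on the order of the square root of N candidates, a count that grows exponentially with the number of digits in N. The toy numbers below factor instantly; a 1,000-digit number would not.

```python
def trial_division(N):
    """Return the smallest prime factor of N by brute force."""
    d = 2
    while d * d <= N:
        if N % d == 0:
            return d
        d += 1
    return N  # N itself is prime

print(trial_division(15))           # 3, found immediately
print(trial_division(2**31 - 1))    # 2147483647 is prime: tens of thousands of divisions already
```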

Some quantum computers are known to perform factoring exponentially faster than any classical supercomputer. But competing with a supercomputer will still require a pretty sizeable quantum computer.

Quantum computing began as a unique discipline in the late 1990s when the US government, aware of the newly discovered potential of these machines for codebreaking, began investing in university research.

The field drew together teams from all over the world, including Australia, where we now have two Centres of Excellence in quantum technology (the author is part of the Centre of Excellence for Engineered Quantum Systems).

But the academic focus is now shifting, in part, to industry.

IBM has long had a basic research program in the field. It was recently joined by Google, who invested in a University of California team, and Microsoft, which has partnered with academics globally, including the University of Sydney.

Seemingly smelling blood in the water, Silicon Valley venture capitalists also recently began investing in new startups working to build quantum computers.

The media has mistakenly seen the entry of commercial players as the genesis of recent technological acceleration, rather than a response to these advances.

So now we find a variety of competing claims about the state of the art in the field, where the field is going, and who will get to the end goal, a large-scale quantum computer, first.

Conventional computer microprocessors can have more than one billion fundamental logic elements, known as transistors. In quantum systems, the fundamental quantum logic units are known as qubits, and for now, they mostly number in the range of a dozen.

Such devices are exceptionally exciting to researchers and represent huge progress, but they are little more than toys from a practical perspective. They are not near what's required for factoring or any other application; they're too small and suffer too many errors, despite what the frantic headlines may promise.

For instance, it's not even easy to answer the question of which system has the best qubits right now.

Consider the two dominant technologies. Teams using trapped ions have qubits that are resistant to errors, but relatively slow. Teams using superconducting qubits (including IBM and Google) have relatively error-prone qubits that are much faster, and may be easier to replicate in the near term.

Which is better? There's no straightforward answer. A quantum computer with many qubits that suffer from lots of errors is not necessarily more useful than a very small machine with very stable qubits.

Because quantum computers can also take different forms (general purpose versus tailored to one application), we cant even reach agreement on which system currently has the greatest set of capabilities.

Similarly, there's now seemingly endless competition over simplified metrics such as the number of qubits. Five, 16, soon 49! The question of whether a quantum computer is useful is defined by much more than this.

There's been a media focus lately on achieving "quantum supremacy". This is the point where a quantum computer outperforms its best classical counterpart, and reaching this would absolutely mark an important conceptual advance in quantum computing.

But don't confuse quantum supremacy with utility.

Some quantum computer researchers are seeking to devise slightly arcane problems that might allow quantum supremacy to be reached with, say, 50-100 qubits, numbers reachable within the next several years.

Achieving quantum supremacy does not mean either that those machines will be useful, or that the path to large-scale machines will become clear.

Moreover, we still need to figure out how to deal with errors. Classical computers rarely suffer hardware faults: the "blue screen of death" generally comes from software bugs, rather than hardware failures. The likelihood of hardware failure is usually less than something like one in a billion-quadrillion, or 10^-24 in scientific notation.

The best quantum computer hardware, on the other hand, typically achieves only about one in 10,000, or 10^-4. That's 20 orders of magnitude worse.

We're seeing a slow creep up in the number of qubits in the most advanced systems, and clever scientists are thinking about problems that might be usefully addressed with small quantum computers containing just a few hundred qubits.

But we still face many fundamental questions about how to build, operate or even validate the performance of the large-scale systems we sometimes hear are just around the corner.

As an example, if we built a fully error-corrected quantum computer at the scale of the millions of qubits required for useful factoring, as far as we can tell, it would represent a totally new state of matter. Thats pretty fundamental.

At this stage, there's no clear path to the millions of error-corrected qubits we believe are required to build a useful factoring machine. Current global efforts (in which this author is a participant) are seeking to build just one error-corrected qubit to be delivered about five years from now.

At the end of the day, none of the teams mentioned above are likely to build a useful quantum computer in 2017 or 2018. But that shouldn't cause concern when there are so many exciting questions to answer along the way.

More:

Hype and cash are muddying public understanding of quantum … – The Conversation AU

IEEE Approves Standards Project for Quantum Computing … – insideHPC

William Hurley is chair of IEEE Quantum Computing Working Group

Today IEEE announced the approval of the IEEE P7130 Standard for Quantum Computing Definitions project. The new standards project aims to make Quantum Computing more accessible to a larger group of contributors, including developers of software and hardware, materials scientists, mathematicians, physicists, engineers, climate scientists, biologists and geneticists.

"While Quantum Computing is poised for significant growth and advancement, the emergent industry is currently fragmented and lacks a common communications framework," said Whurley (William Hurley), chair, IEEE Quantum Computing Working Group. "IEEE P7130 marks an important milestone in the development of Quantum Computing by building consensus on a nomenclature that will bring the benefits of standardization, reduce confusion, and foster a more broadly accepted understanding for all stakeholders involved in advancing technology and solutions in the space."

The purpose of this project is to provide a general nomenclature for Quantum Computing that may be used to standardize communication with related hardware and software projects. This standard addresses quantum computing-specific terminology and establishes definitions necessary to facilitate communication.

"Confusions exist on what quantum computing or a quantum computer means," added Professor Hidetoshi Nishimori of the Tokyo Institute of Technology and IEEE P7130 working group participant. "This partly originates in the existence of a few different models of quantum computing. It is urgently necessary to define each key word."

Read the original post:

IEEE Approves Standards Project for Quantum Computing … – insideHPC

UNSW joins with government and business to keep quantum computing technology in Australia – The Australian Financial Review

UNSW quantum pioneer Michelle Simmons (right) with the chair of the new Silicon Quantum Computing company, Stephen Menzies.

Governments, business and universities have joined forces to keep UNSW’s world leading quantum computing technology in Australia, launching a new $83 million company which aims to produce a working prototype computer within five years.

The company, Silicon Quantum Computing Pty Ltd, will have a key goal of retaining IP in Australia and boosting new industries based around quantum computing and other quantum spin-offs.

Establishing the company has been a long-term goal of UNSW physics professor Michelle Simmons who leads the university’s research and development in the race to build the world’s first practical quantum computer.

Professor Simmons said she approached the federal government to urge public investment in quantum computing because of the many approaches she was getting from large multinationals and overseas venture capital for access to the discoveries her team had made.

“We had lots of different groupings come to us saying they would work with our research teams, but they would have got all the benefits,” she said.

“Everything we did would have gone to them. People were trying to pick us off.

“Personally I just felt complete responsibility for just not dropping the ball, making sure that this great thing that we had was not just siphoned off for free.”

Professor Simmons said it was an "eye-opener" for her that not only the IT industry was beating a path to her door, but companies from "across the board", illustrating her belief that quantum computing will have a revolutionary impact in many industries including finance, resource extraction, health, pharmaceuticals, logistics and data.

Quantum computers are expected to solve some types of problems millions of times faster than conventional computers.

The new company will hold the quantum computing related patents from the Centre of Excellence for Quantum Computation and Communication Technology, led by Professor Simmons, which also includes researchers from the University of Melbourne and other universities.

Its aim will be to ensure that the full range of industries developed from quantum computing, including hardware, software, and big quantum server farms, is developed in Australia.

Silicon Quantum Computing’s chair, lawyer Stephen Menzies, said the company would not offer exclusive rights on its technology but would only offer licences for specific purposes for a limited time.

“Too much Australian research innovation is lost [overseas],” he said.

Mr Menzies said it was a commercial venture, and its shareholders, the federal and NSW governments, Telstra, the Commonwealth Bank of Australia and UNSW, would profit from the increasing value of the company's patents.

The company’s $83 million capital comes from UNSW ($25 million), the federal government ($25 million), the Commonwealth Bank ($14 million), Telstra ($10 million) plus a new investment of $8.7 million from the NSW government the first to be made from its $26 million quantum computing fund announced last month.

It will fund a major expansion of the quantum computing research effort at UNSW. Up to 40 new staff will be hired including 25 researchers and 12 PhD students, and new equipment to speed the development of a 10 qubit prototype computer by 2022.

Continued here:

UNSW joins with government and business to keep quantum computing technology in Australia – The Australian Financial Review