The reality of quantum computing could be just three years …

Quantum computing has moved out of the realm of theoretical physics and into the real world, but its potential and promise are still years away.

Onstage at TechCrunch Disrupt SF, a powerhouse in the world of quantum research and a young upstart in the field presented visions for the future of the industry that illustrated both how far the industry has come and how far the technology has to go.

For both Dario Gil, the chief operating officer of IBM Research and the company's vice president of artificial intelligence and quantum computing, and Chad Rigetti, a former IBM researcher who founded Rigetti Computing and serves as its chief executive, the moment when a quantum computer will be able to perform operations better than a classical computer is only three years away.

"[It's] generating a solution that is better, faster or cheaper than you can do otherwise," said Rigetti. "Quantum computing has moved out of a field of research into now an engineering discipline and an engineering enterprise."

Considering the more than 30 years that IBM has been researching the technology and the millions (or billions) that have been poured into developing it, even seeing an end of the road is a victory for researchers and technologists.

Achieving this goal, for all of the brainpower and research hours that have gone into it, is hardly academic.

The Chinese government is building a $10 billion National Laboratory for Quantum Information in Anhui province in eastern China, slated to open in 2020. Meanwhile, U.S. public research funding for quantum computing runs at around $200 million per year.

One reason governments in particular are so interested in the technology is its potential to completely remake the cybersecurity landscape. Some technologists argue that quantum computers will have the potential to crack any type of encryption, opening up all of the networks in the world to potential hacking.

Of course, quantum computing is about much more than security. It will enable new ways of doing things we can't even imagine, because we have never had this much pure compute power. Think about artificial intelligence and machine learning, or drug development; any compute-intensive operation could benefit from the exponential increase in computing power that quantum computing promises to bring.

Security may be the Holy Grail for governments, but both Rigetti and Gil say that the industrial chemicals business will be the first market to see the technology's potentially radical transformation.

To understand quantum computing, it helps to understand the physics behind it.

As Gil explained onstage (and on our site), quantum computing depends on the principles of superposition, entanglement and interference.

Superposition is the notion that physicists can observe multiple potential states of a particle. "If you flip a coin, it is in one of two states," said Gil, meaning that there's a single outcome that can be observed. But if someone were to spin a coin, they'd see a number of potential outcomes.

Once you've got one particle that's being observed, you can add another and pair them thanks to a phenomenon called quantum entanglement. "If you have two coins where each one can be in superpositions, then measurements can be taken of the difference between both."

Finally, there's interference, where the two particles can be manipulated by an outside force to change them and create different outcomes.
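To make the first two of those principles concrete, here is a minimal state-vector sketch in Python with numpy (the tooling is our assumption, not anything demonstrated onstage): a Hadamard gate puts one qubit into an equal superposition, and a CNOT gate entangles it with a second so that their readouts always agree.

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)                       # |0>, the "heads" state
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard gate

# Superposition: the "spinning coin" -- equal amplitude on |0> and |1>.
plus = H @ ket0
print(np.abs(plus) ** 2)                                     # [0.5 0.5]

# Entanglement: a CNOT pairs two qubits into the Bell state (|00> + |11>)/sqrt(2);
# the two "coins" are only ever read out as 00 or 11, never 01 or 10.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
bell = CNOT @ np.kron(plus, ket0)
print(np.abs(bell) ** 2)                                     # [0.5 0. 0. 0.5]
```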

"In classical systems you have these bits of zeros and ones and the logical operations of the ands and the ors and the nots," said Gil. "The classical computer is able to process the logical operations of bits expressed in zeros and ones."

"In an algorithm you put the computer in a superposition state," Gil continued. "You can take the amplitudes and states and interfere them, and the algorithm is the thing that interferes. I can have many, many states representing different pieces of information, and then I can interfere with it to get these data."
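Interference, the third principle, is what makes that quote more than poetry: amplitudes can cancel. In the same hedged numpy style as the sketch above, a second Hadamard gate exactly undoes the first, because the two computational paths leading to the 1 outcome interfere destructively:

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

superposed = H @ ket0            # equal amplitudes on |0> and |1>
interfered = H @ superposed      # the two paths to |1> cancel each other out
print(np.abs(interfered) ** 2)   # [1. 0.] -- the readout is 0 with certainty
```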

These operations are incredibly hard to sustain. In the early days of research into quantum computing, superconducting devices could hold a quantum state for only one nanosecond before a qubit decohered into a traditional bit of data. Those coherence times have since increased to between 50 and 100 microseconds, which enabled IBM and Rigetti to open up their platforms for researchers and others to conduct experiments (more on that later).

As one can imagine, dealing with quantum particles is a delicate business, so the computing operations have to be carefully controlled. At the base of the machine is what basically amounts to a huge freezer that maintains a temperature in the device of 15 millikelvin, near absolute zero and roughly 180 times colder than interstellar space.

"These qubits are very delicate," said Gil. "Anything from the outside world can couple to it and destroy its state, and one way to protect it is to cool it."

Wiring for the quantum computer is made of superconducting coaxial cables. The inputs to the computer are microwave pulses that manipulate the particles, creating a signal that is then interpreted by the computer's operators.

Those operators used to require a degree in quantum physics. But both IBM and Rigetti have been working on developing tools that can enable a relative newbie to use the tech.

Even as companies like IBM and Rigetti bring the cost of quantum computing down from tens of millions of dollars to roughly $1 million to $2 million, these tools likely will never become commodity hardware that a consumer buys to use as a personal computer.

Rather, as with most other computing these days, quantum computing power will be provided as a service to users.

Indeed, Rigetti announced onstage a new hybrid computing platform designed to help the industry reach quantum advantage (the tipping point at which quantum computing is commercially viable) and to let industries explore the technology and acclimate to the ways their typical operations could be disrupted by it.

"A user logs on to their own device and uses our software development kit to write a quantum application," said Rigetti. "That program is sent to a compiler and kicks off an optimization kit that runs on a quantum and a classical computer. This is the architecture that's needed to achieve quantum advantage."
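As a rough illustration of that workflow, here is a hedged sketch using pyQuil, Rigetti's open-source SDK. The names follow the pyQuil 2.x-era API and a simulated "2q-qvm" backend; treat it as illustrative, not as the exact pipeline Rigetti described onstage.

```python
# A small quantum program written on a laptop, compiled and run on a
# (here simulated) quantum backend -- the shape of the hybrid workflow.
from pyquil import Program, get_qc
from pyquil.gates import H, CNOT

program = Program(H(0), CNOT(0, 1))   # prepare a two-qubit Bell pair
qc = get_qc("2q-qvm")                 # a 2-qubit quantum virtual machine
results = qc.run_and_measure(program, trials=10)  # compile, run, measure
print(results)                        # per-qubit readouts across 10 trials
```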

Both IBM and Rigetti, along with a slew of other competitors, are preparing users to access quantum computing over the cloud.

IBM's cloud-accessible machines have performed millions of quantum operations requested by users in over 100 countries around the world.

"In a cloud-first era I'm not sure the economic forces will be there that will drive us to develop the miniaturized environment in the laptop," Rigetti said. But the ramifications of the technology's commercialization will be felt by everyone, everywhere.

"Quantum computing is going to change the world, and it's all going to come in our lifetime, whether that's two years or five years," he said. "Quantum computing is going to redefine every industry and touch every market. Every major company will be involved in some capacity in that space."

What Is Quantum Computing? The Complete WIRED Guide | WIRED

Big things happen when computers get smaller. Or faster. And quantum computing is about chasing perhaps the biggest performance boost in the history of technology. The basic idea is to smash some barriers that limit the speed of existing computers by harnessing the counterintuitive physics of subatomic scales.

If the tech industry pulls off that, ahem, quantum leap, you won't be getting a quantum computer for your pocket. Don't start saving for an iPhone Q. We could, however, see significant improvements in many areas of science and technology, such as longer-lasting batteries for electric cars or advances in chemistry that reshape industries or enable new medical treatments. Quantum computers won't be able to do everything faster than conventional computers, but on some tricky problems they have advantages that would enable astounding progress.

It's not productive (or polite) to ask people working on quantum computing when exactly those dreamy applications will become real. The only thing for sure is that they are still many years away. Prototype quantum computing hardware is still embryonic. But powerful (and, for tech companies, profit-increasing) computers powered by quantum physics have recently started to feel less hypothetical.

The cooling and support structure for one of IBM's quantum computing chips (the tiny black square at the bottom of the image). Photo: Amy Lombard

That's because Google, IBM, and others have decided it's time to invest heavily in the technology, which, in turn, has helped quantum computing earn a bullet point on the corporate strategy PowerPoint slides of big companies in areas such as finance, like JPMorgan, and aerospace, like Airbus. In 2017, venture investors plowed $241 million into startups working on quantum computing hardware or software worldwide, according to CB Insights. That's triple the amount in the previous year.

Like the befuddling math underpinning quantum computing, some of the expectations building around this still-impractical technology can make you lightheaded. If you squint out the window of a flight into SFO right now, you can see a haze of quantum hype drifting over Silicon Valley. But the enormous potential of quantum computing is undeniable, and the hardware needed to harness it is advancing fast. If there were ever a perfect time to bend your brain around quantum computing, it's now. Say "Schrödinger's superposition" three times fast, and we can dive in.

The prehistory of quantum computing begins early in the 20th century, when physicists began to sense they had lost their grip on reality.

First, accepted explanations of the subatomic world turned out to be incomplete. Electrons and other particles didn't just neatly carom around like Newtonian billiard balls, for example. Sometimes they acted like waves instead. Quantum mechanics emerged to explain such quirks, but introduced troubling questions of its own. To take just one brow-wrinkling example, this new math implied that physical properties of the subatomic world, like the position of an electron, didn't really exist until they were observed.

A brief timeline of the field:

1980: Physicist Paul Benioff suggests quantum mechanics could be used for computation.

1981: Nobel-winning physicist Richard Feynman, at Caltech, coins the term quantum computer.

1985: Physicist David Deutsch, at Oxford, maps out how a quantum computer would operate, a blueprint that underpins the nascent industry of today.

1994: Mathematician Peter Shor, at Bell Labs, writes an algorithm that could tap a quantum computer's power to break widely used forms of encryption.

2007: D-Wave, a Canadian startup, announces a quantum computing chip it says can solve Sudoku puzzles, triggering years of debate over whether the company's technology really works.

2013: Google teams up with NASA to fund a lab to try out D-Wave's hardware.

2014: Google hires the professor behind some of the best quantum computer hardware yet to lead its new quantum hardware lab.

2016: IBM puts some of its prototype quantum processors on the internet for anyone to experiment with, saying programmers need to get ready to write quantum code.

2017: Startup Rigetti opens its own quantum computer fabrication facility to build prototype hardware and compete with Google and IBM.

If you find that baffling, you're in good company. A year before winning a Nobel for his contributions to quantum theory, Caltech's Richard Feynman remarked that "nobody understands quantum mechanics." The way we experience the world just isn't compatible. But some people grasped it well enough to redefine our understanding of the universe. And in the 1980s a few of them, including Feynman, began to wonder if quantum phenomena like subatomic particles' "don't look and I don't exist" trick could be used to process information. The basic theory or blueprint for quantum computers that took shape in the 80s and 90s still guides Google and others working on the technology.

Before we belly flop into the murky shallows of quantum computing 0.101, we should refresh our understanding of regular old computers. As you know, smartwatches, iPhones, and the world's fastest supercomputer all basically do the same thing: they perform calculations by encoding information as digital bits, aka 0s and 1s. A computer might flip the voltage in a circuit on and off to represent 1s and 0s, for example.

Quantum computers do calculations using bits, too. After all, we want them to plug into our existing data and computers. But quantum bits, or qubits, have unique and powerful properties that allow a group of them to do much more than an equivalent number of conventional bits.

Qubits can be built in various ways, but they all represent digital 0s and 1s using the quantum properties of something that can be controlled electronically. Popular examples (at least among a very select slice of humanity) include superconducting circuits, or individual atoms levitated inside electromagnetic fields. The magic power of quantum computing is that this arrangement lets qubits do more than just flip between 0 and 1. Treat them right and they can flip into a mysterious extra mode called a superposition.

The looped cables connect the chip at the bottom of the structure to its control system. Photo: Amy Lombard

You may have heard that a qubit in superposition is both 0 and 1 at the same time. That's not quite true and also not quite false; there's just no equivalent in Homo sapiens' humdrum classical reality. If you have a yearning to truly grok it, you must make a mathematical odyssey WIRED cannot equip you for. But in the simplified and dare we say perfect world of this explainer, the important thing to know is that the math of a superposition describes the probability of discovering either a 0 or 1 when a qubit is read out, an operation that crashes it out of a quantum superposition into classical reality. A quantum computer can use a collection of qubits in superpositions to play with different possible paths through a calculation. If done correctly, the pointers to incorrect paths cancel out, leaving the correct answer when the qubits are read out as 0s and 1s.

A short glossary of the key terms:

Qubit: A device that uses quantum mechanical effects to represent 0s and 1s of digital data, similar to the bits in a conventional computer.

Superposition: The trick that makes quantum computers tick, and makes qubits more powerful than ordinary bits. A superposition is an intuition-defying mathematical combination of both 0 and 1. Quantum algorithms can use a group of qubits in a superposition to shortcut through calculations.

Entanglement: A quantum effect so unintuitive that Einstein dubbed it "spooky action at a distance." When two qubits in a superposition are entangled, certain operations on one have instant effects on the other, a process that helps quantum algorithms be more powerful than conventional ones.

Quantum speedup: The holy grail of quantum computing, a measure of how much faster a quantum computer could crack a problem than a conventional computer could. Quantum computers aren't well-suited to all kinds of problems, but for some they offer an exponential speedup, meaning their advantage over a conventional computer grows explosively with the size of the input problem.

For some problems that are very time consuming for conventional computers, this allows a quantum computer to find a solution in far fewer steps than a conventional computer would need. Grover's algorithm, a famous quantum search algorithm, could find you in a phone book with 100 million names with just 10,000 operations. If a classical search algorithm just spooled through all the listings to find you, it would require 50 million operations on average. For Grover's and some other quantum algorithms, the bigger the initial problem, or phone book, the further behind a conventional computer is left in the digital dust.
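To see where those numbers come from, here is a small state-vector simulation of Grover's algorithm in plain numpy (our choice of tooling, not WIRED's), on a more tractable "phone book" of 1,024 entries; the square-root scaling is the same one that gives 10,000 operations for 100 million names.

```python
import numpy as np

n_items = 2 ** 10                              # a 1,024-entry "phone book"
marked = 123                                   # the index we are searching for

state = np.full(n_items, 1 / np.sqrt(n_items)) # uniform superposition over all entries

# Grover's algorithm needs about (pi/4) * sqrt(N) oracle calls.
iterations = int(np.pi / 4 * np.sqrt(n_items)) # 25 for N = 1,024
for _ in range(iterations):
    state[marked] *= -1                        # oracle: flip the marked amplitude's sign
    state = 2 * state.mean() - state           # diffusion: invert all amplitudes about the mean

print(iterations, state[marked] ** 2)          # 25 steps; success probability ~0.999
```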

The reason we don't have useful quantum computers today is that qubits are extremely finicky. The quantum effects they must control are very delicate, and stray heat or noise can flip 0s and 1s, or wipe out a crucial superposition. Qubits have to be carefully shielded, and operated at very cold temperatures, sometimes only fractions of a degree above absolute zero. Most plans for quantum computing depend on using a sizable chunk of a quantum processor's power to correct its own errors, caused by misfiring qubits.

Recent excitement about quantum computing stems from progress in making qubits less flaky. That's giving researchers the confidence to start bundling the devices into larger groups. Startup Rigetti Computing recently announced plans for a processor with 128 qubits made with aluminum circuits that are supercooled to make them superconducting. Google and IBM have announced their own chips with 72 and 50 qubits, respectively. That's still far fewer than would be needed to do useful work with a quantum computer (it would probably require at least thousands), but as recently as 2016 those companies' best chips had qubits only in the single digits. After tantalizing computer scientists for 30 years, practical quantum computing may not exactly be close, but it has begun to feel a lot closer.

Some large companies and governments have started treating quantum computing research like a race; perhaps fittingly, it's one where both the distance to the finish line and the prize for getting there are unknown.

Google, IBM, Intel, and Microsoft have all expanded their teams working on the technology, with a growing swarm of startups such as Rigetti in hot pursuit. China and the European Union have each launched new programs measured in the billions of dollars to stimulate quantum R&D. And in the US, the Trump White House has created a new committee to coordinate government work on quantum information science. Several bills were introduced to Congress in 2018 proposing new funding for quantum research, totaling upwards of $1.3 billion. It's not quite clear what the first killer apps of quantum computing will be, or when they will appear. But there's a sense that whoever is first to make these machines useful will gain big economic and national security advantages.

Copper structures conduct heat well and connect the apparatus to its cooling system. Photo: Amy Lombard

Back in the world of right now, though, quantum processors are too simple to do practical work. Google is working to stage a demonstration known as quantum supremacy, in which a quantum processor would solve a carefully designed math problem beyond the reach of existing supercomputers. But that would be a historic scientific milestone, not proof that quantum computing is ready to do real work.

As quantum computer prototypes get larger, the first practical use for them will probably be chemistry simulations. Computer models of molecules and atoms are vital to the hunt for new drugs or materials. Yet conventional computers can't accurately simulate the behavior of atoms and electrons during chemical reactions. Why? Because that behavior is driven by quantum mechanics, the full complexity of which is too great for conventional machines. Daimler and Volkswagen have both started investigating quantum computing as a way to improve battery chemistry for electric vehicles. Microsoft says other uses could include designing new catalysts to make industrial processes less energy intensive, or even to pull carbon dioxide out of the atmosphere to mitigate climate change.

Quantum computers would also be a natural fit for code-breaking. We've known since the 90s that they could zip through the math underpinning the encryption that secures online banking, flirting, and shopping. Quantum processors would need to be much more advanced to do this, but governments and companies are taking the threat seriously. The National Institute of Standards and Technology is in the process of evaluating new encryption systems that could be rolled out to quantum-proof the internet.
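The threat comes from a specific reduction: factoring an RSA-style modulus reduces to finding the period of modular exponentiation, and that period search is exactly the step Shor's algorithm speeds up. Here is a classical toy of the reduction in Python, with deliberately tiny, illustrative numbers; on a quantum computer, the brute-force period loop is what gets replaced.

```python
from math import gcd

N = 15          # a toy RSA-style modulus to factor
a = 7           # a base coprime to N (chosen by hand here)

# Find the period r of a^x mod N. Classically this loop is the bottleneck;
# Shor's algorithm finds r exponentially faster on a quantum computer.
r = 1
while pow(a, r, N) != 1:
    r += 1

assert r % 2 == 0                  # the trick below needs an even period
p = gcd(pow(a, r // 2) - 1, N)     # gcd(7^2 - 1, 15) = 3
q = gcd(pow(a, r // 2) + 1, N)     # gcd(7^2 + 1, 15) = 5
print(r, p, q)                     # period 4; factors 3 and 5
```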

When cooled to operating temperature, the whole assembly is hidden inside this white insulated casing. Photo: Amy Lombard

Tech companies such as Google are also betting that quantum computers can make artificial intelligence more powerful. That's further in the future and less well mapped out than chemistry or code-breaking applications, but researchers argue they can figure out the details down the line as they play around with larger and larger quantum processors. One hope is that quantum computers could help machine-learning algorithms pick up complex tasks using many fewer than the millions of examples typically used to train AI systems today.

Despite all the superposition-like uncertainty about when the quantum computing era will really begin, big tech companies argue that programmers need to get ready now. Google, IBM, and Microsoft have all released open source tools to help coders familiarize themselves with writing programs for quantum hardware. IBM has even begun to offer online access to some of its quantum processors, so anyone can experiment with them. Long term, the big computing companies see themselves making money by charging corporations to access data centers packed with supercooled quantum processors.
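For a taste of those tools, here is a hedged sketch using IBM's Qiskit (assuming it is installed, for example via pip install qiskit). Building and printing a circuit requires no quantum hardware or simulator at all.

```python
from qiskit import QuantumCircuit

qc = QuantumCircuit(2, 2)     # two qubits, two classical readout bits
qc.h(0)                       # put qubit 0 into a superposition
qc.cx(0, 1)                   # entangle qubit 1 with qubit 0 (a Bell pair)
qc.measure([0, 1], [0, 1])    # read both qubits out into classical bits
print(qc)                     # ASCII drawing of the circuit
```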

What's in it for the rest of us? Despite some definite drawbacks, the age of conventional computers has helped make life safer, richer, and more convenient; many of us are never more than five seconds away from a kitten video. The era of quantum computers should have similarly broad-reaching, beneficial, and impossible-to-predict consequences. Bring on the qubits.

The Quantum Computing Factory That's Taking on Google and IBM: Peek inside the ultra-clean workshop of Rigetti Computing, a startup packed with PhDs wearing what look like space suits and gleaming steampunk-style machines studded with bolts. In a facility across the San Francisco Bay from Silicon Valley, Rigetti is building its own quantum processors, using technology similar to that used by IBM and Google.

Why JP Morgan and Daimler Are Testing Quantum Computers That Aren't Useful Yet: Wall Street has plenty of quants, math wizards who hunt profits using equations. Now JP Morgan has quantum quants, a small team collaborating with IBM to figure out how to use the power of quantum algorithms to more accurately model financial risk. Useful quantum computers are still years away, but the bank and other big corporations say that the potential payoffs are so large that they need to seriously investigate quantum computing today.

The Era of Quantum Computing is Here. Outlook: Cloudy. Companies working on quantum computer hardware like to say that the field has transitioned from the exploration and uncertainty of science into the more predictable realm of engineering. Yet while hardware has improved markedly in recent years, and investment is surging, there are still open scientific questions about the physics underlying quantum computing.

Quantum Computing Will Create Jobs. But Which Ones? You can't create a new industry without people to staff the jobs it creates. A Congressional bill called the National Quantum Initiative seeks to have the US government invest in training the next generation of quantum computer technicians, designers, and entrepreneurs.

Job One For Quantum Computers: Boost Artificial Intelligence. Artificial intelligence and quantum computing are two of Silicon Valley's favorite buzzwords. If they can be successfully combined, machines will get a lot smarter.

Loopholes and the Anti-Realism of the Quantum World: Even people who can follow the math of quantum mechanics find its implications for reality perplexing. This book excerpt explains why quantum physics undermines our understanding of reality with nary an equation in sight.

Quantum Computing is the Next Big Security Risk: In 1994, mathematician Peter Shor wrote an algorithm that would allow a quantum computer to pierce the encryption that today underpins online shopping and other digital services. As quantum computers get closer to reality, congressman Will Hurd (R-Texas) argues the US needs to lead a global effort to deploy new forms of quantum-resistant encryption.

This guide was last updated on August 24, 2018.


Quantum Computing | D-Wave Systems

Quantum Computation

Rather than store information using bits represented by 0s or 1s as conventional digital computers do, quantum computers use quantum bits, or qubits, to encode information as 0s, 1s, or both at the same time. This superposition of states, along with the other quantum mechanical phenomena of entanglement and tunneling, enables quantum computers to manipulate enormous combinations of states at once.

In nature, physical systems tend to evolve toward their lowest energy state: objects slide down hills, hot things cool down, and so on. This behavior also applies to quantum systems. To imagine this, think of a traveler looking for the best solution by finding the lowest valley in the energy landscape that represents the problem.

Classical algorithms seek the lowest valley by placing the traveler at some point in the landscape and allowing that traveler to move based on local variations. While it is generally most efficient to move downhill and avoid climbing hills that are too high, such classical algorithms are prone to leading the traveler into nearby valleys that may not be the global minimum. Numerous trials are typically required, with many travelers beginning their journeys from different points.

In contrast, quantum annealing begins with the traveler simultaneously occupying many coordinates thanks to the quantum phenomenon of superposition. The probability of being at any given coordinate smoothly evolves as annealing progresses, with the probability increasing around the coordinates of deep valleys. Quantum tunneling allows the traveler to pass through hills, rather than be forced to climb them, reducing the chance of becoming trapped in valleys that are not the global minimum. Quantum entanglement further improves the outcome by allowing the traveler to discover correlations between the coordinates that lead to deep valleys.
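The classical cousin of this process, simulated annealing, makes the landscape picture concrete: a single "traveler" random-walks the landscape and is gradually discouraged from climbing hills as a notional temperature is lowered. The sketch below is plain Python with a made-up one-dimensional energy landscape; quantum annealing effectively replaces the lone walker with a superposition that can also tunnel.

```python
import math, random

def energy(x):
    # a bumpy valley with many local minima; the "landscape" of the problem
    return 0.1 * x * x + 3 * math.cos(2 * x)

x = random.uniform(-10, 10)       # drop the traveler somewhere at random
temperature = 10.0
while temperature > 1e-3:
    candidate = x + random.uniform(-1, 1)        # propose a nearby move
    delta = energy(candidate) - energy(x)
    # always accept downhill moves; accept uphill moves with a probability
    # that shrinks as the temperature is lowered
    if delta < 0 or random.random() < math.exp(-delta / temperature):
        x = candidate
    temperature *= 0.999

print(x, energy(x))   # usually lands near the global minimum around x = +/- 1.5
```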

The D-Wave system has a web API with client libraries available for C/C++, Python, and MATLAB. This allows users to access the computer easily as a cloud resource over a network.

To program the system, a user maps a problem onto a search for the lowest point in a vast landscape, corresponding to the best possible outcome. The quantum processing unit considers all the possibilities simultaneously to determine the lowest energy required to form those relationships. The solutions are values that correspond to the optimal configurations of qubits found, or the lowest points in the energy landscape. These values are returned to the user program over the network.
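As a hedged illustration of that programming model, here is a tiny QUBO (quadratic unconstrained binary optimization) problem solved with dimod, from D-Wave's open-source Ocean SDK; this is a newer toolkit than the web API described above, and ExactSolver enumerates every state classically, so no quantum hardware is needed to try it.

```python
import dimod

# QUBO energy: E = -x0 - x1 + 2*x0*x1
# Each variable "wants" to be 1, but turning both on is penalized,
# so the lowest-energy states are {x0=1, x1=0} and {x0=0, x1=1}.
Q = {(0, 0): -1, (1, 1): -1, (0, 1): 2}

sampleset = dimod.ExactSolver().sample_qubo(Q)
print(sampleset.first)   # a lowest-energy assignment, energy -1.0
```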

Because a quantum computer is probabilistic rather than deterministic, the computer returns many very good answers in a short amount of time: thousands of samples in one second. This provides not only the best solution found but also other very good alternatives from which to choose.

D-Wave systems are intended to complement classical computers. There are many examples of problems where a quantum computer can complement an HPC (high-performance computing) system. While the quantum computer is well suited to discrete optimization, for example, the HPC system is better at large-scale numerical simulations.

Download this whitepaper to learn more about programming a D-Wave quantum computer.

D-Wave's flagship product, the 2,000-qubit D-Wave 2000Q quantum computer, is the company's most advanced quantum annealing machine. It is based on a novel type of superconducting processor that uses quantum mechanics to massively accelerate computation. It is best suited to tackling complex optimization problems that exist across many domains.

Download the Technology Overview

How Quantum Computers Work

A quantum computer is a computer design that uses the principles of quantum physics to increase computational power beyond what is attainable by a traditional computer. Quantum computers have been built on a small scale, and work continues to upgrade them to more practical models.

Computers function by storing data in a binary number format, which results in a series of 1s and 0s retained in electronic components such as transistors. Each component of computer memory is called a bit and can be manipulated through the steps of Boolean logic so that the bits change, based upon the algorithms applied by the computer program, between the 1 and 0 modes (sometimes referred to as "on" and "off").

A quantum computer, on the other hand, would store information as either a 1, a 0, or a quantum superposition of the two states. Such a "quantum bit" allows for far greater flexibility than the binary system.

Specifically, a quantum computer would be able to perform certain calculations on a far greater scale than traditional computers … a capability with serious implications for cryptography and encryption. Some fear that a successful and practical quantum computer would devastate the world's financial system by ripping through its computer security encryption, which is based on factoring large numbers that literally cannot be cracked by traditional computers within the lifespan of the universe. A quantum computer, on the other hand, could factor the numbers in a reasonable period of time.

To understand how this speeds things up, consider this example. If the qubit is in a superposition of the 1 state and the 0 state, and it performed a calculation with another qubit in the same superposition, then one calculation actually obtains 4 results: a 1/1 result, a 1/0 result, a 0/1 result, and a 0/0 result. This is a result of the mathematics applied to a quantum system while it remains coherent in a superposition of states, which lasts until it collapses into one state. The ability of a quantum computer to perform multiple computations simultaneously (or in parallel, in computer terms) is called quantum parallelism.
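A short numpy illustration of exactly this counting (the tooling is our addition, not the article's): two qubits, each in an equal superposition, form a single joint state in which all four outcomes are equally likely.

```python
import numpy as np

plus = np.array([1, 1]) / np.sqrt(2)   # one qubit in an equal superposition
pair = np.kron(plus, plus)             # joint state of two such qubits
print(np.abs(pair) ** 2)               # [0.25 0.25 0.25 0.25] -- outcomes 00, 01, 10, 11
```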

The exact physical mechanism at work within the quantum computer is somewhat theoretically complex and intuitively disturbing. Generally, it is explained in terms of the many-worlds interpretation of quantum physics, wherein the computer performs calculations not only in our universe but also in other universes simultaneously, while the various qubits are in a state of quantum coherence. (While this sounds far-fetched, the many-worlds interpretation has been shown to make predictions that match experimental results; other interpretations account for the same mathematics without invoking parallel universes.)

Quantum computing tends to trace its roots back to a 1959 speech by Richard P. Feynman in which he spoke about the effects of miniaturization, including the idea of exploiting quantum effects to create more powerful computers. (This speech is also generally considered the starting point of nanotechnology.)

Of course, before the quantum effects of computing could be realized, scientists and engineers had to more fully develop the technology of traditional computers. This is why, for many years, there was little direct progress, or even interest, in the idea of making Feynman's suggestions into reality.

In 1985, the idea of “quantum logic gates” was put forth by University of Oxford’s David Deutsch, as a means of harnessing the quantum realm inside a computer. In fact, Deutsch’s paper on the subject showed that any physical process could be modeled by a quantum computer.

Nearly a decade later, in 1994, AT&T's Peter Shor devised an algorithm that could use only 6 qubits to perform some basic factorizations … with more qubits needed, of course, as the numbers requiring factorization grew more complex.

A handful of quantum computers have been built. The first, a 2-qubit quantum computer in 1998, could perform trivial calculations before losing coherence after a few nanoseconds. In 2000, teams successfully built both a 4-qubit and a 7-qubit quantum computer. Research on the subject is still very active, although some physicists and engineers express concerns over the difficulties involved in scaling these experiments up to full-scale computing systems. Still, the success of these initial steps shows that the fundamental theory is sound.

The quantum computer's main drawback is the same as its strength: quantum decoherence. The qubit calculations are performed while the quantum wave function is in a superposition of states, which is what allows it to perform calculations using both the 1 and 0 states simultaneously.

However, when a measurement of any type is made on a quantum system, coherence breaks down and the wave function collapses into a single state. Therefore, the computer has to somehow continue making these calculations without any measurement being made until the proper time, when it can then drop out of the quantum state and have a measurement taken to read its result, which then gets passed on to the rest of the system.

The physical requirements of manipulating a system on this scale are considerable, touching on the realms of superconductors, nanotechnology, and quantum electronics, as well as others. Each of these is itself a sophisticated field that is still being fully developed, so trying to merge them all into a functional quantum computer is a task that I don't particularly envy anyone … except for the person who finally succeeds.

What are quantum computers and how do they work? WIRED …

Google, IBM and a handful of startups are racing to create the next generation of supercomputers. Quantum computers, if they ever get off the ground, will help us solve problems, like modelling complex chemical processes, that our existing computers can't even scratch the surface of.

But the quantum future isn’t going to come easily, and there’s no knowing what it’ll look like when it does arrive. At the moment, companies and researchers are using a handful of different approaches to try and build the most powerful computers the world has ever seen. Here’s everything you need to know about the coming quantum revolution.

Quantum computing takes advantage of the strange ability of subatomic particles to exist in more than one state at any time. Due to the way the tiniest of particles behave, operations can be done much more quickly and with less energy than in classical computers.

In classical computing, a bit is a single piece of information that can exist in two states: 1 or 0. Quantum computing uses quantum bits, or 'qubits', instead. These are quantum systems with two states. However, unlike a usual bit, they can store much more information than just 1 or 0, because they can exist in any superposition of these values.

“The difference between classical bits and qubits is that we can also prepare qubits in a quantum superposition of 0 and 1 and create nontrivial correlated states of a number of qubits, so-called ‘entangled states’,” says Alexey Fedorov, a physicist at the Moscow Institute of Physics and Technology.

A qubit can be thought of like an imaginary sphere. Whereas a classical bit can be in two states, at either of the two poles of the sphere, a qubit can be any point on the sphere. This means a computer using these bits can store a huge amount more information using less energy than a classical computer.
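That sphere is the Bloch sphere, and the picture can be written down directly: any single-qubit state corresponds to a point (theta, phi) on the sphere via |psi> = cos(theta/2)|0> + e^(i*phi) * sin(theta/2)|1>. A small numpy sketch, our illustration rather than anything from the article:

```python
import numpy as np

def qubit(theta, phi):
    # map Bloch-sphere coordinates to a single-qubit state vector
    return np.array([np.cos(theta / 2),
                     np.exp(1j * phi) * np.sin(theta / 2)])

print(qubit(0, 0))                         # north pole: the classical bit 0 -> [1, 0]
print(qubit(np.pi, 0))                     # south pole: the classical bit 1 -> [0, 1]
print(np.abs(qubit(np.pi / 2, 0)) ** 2)    # a point on the equator: a 50/50 superposition
```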

Until recently, it seemed like Google was leading the pack when it came to creating a quantum computer that could surpass the abilities of conventional computers. In a Nature article published in March 2017, the search giant set out ambitious plans to commercialise quantum technology in the next five years. Shortly after that, Google said it intended to achieve something it's calling 'quantum supremacy' with a 49-qubit computer by the end of 2017.

Now, quantum supremacy, which roughly refers to the point where a quantum computer can crunch sums that a conventional computer couldn't hope to simulate, isn't exactly a widely accepted term within the quantum community. Those sceptical of Google's quantum project, or at least the way it talks about quantum computing, argue that supremacy is essentially an arbitrary goal set by Google to make it look like it's making strides in quantum when really it's just meeting self-imposed targets.

Whether it's an arbitrary goal or not, Google was pipped to the supremacy post by IBM in November 2017, when the company announced it had built a 50-qubit quantum computer. Even that, however, was far from stable, as the system could only hold its quantum microstate for 90 microseconds, a record, but far from the times needed to make quantum computing practically viable. Just because IBM has built a 50-qubit system, however, doesn't necessarily mean they've cracked supremacy, and it definitely doesn't mean that they've created a quantum computer that is anywhere near ready for practical use.

Where IBM has gone further than Google, however, is in making quantum computers commercially available. Since 2016, it has offered researchers the chance to run experiments on a five-qubit quantum computer via the cloud, and at the end of 2017 it started making its 20-qubit system available online too.

But quantum computing is by no means a two-horse race. Californian startup Rigetti is focusing on the stability of its own systems rather than just the number of qubits, and it could be the first to build a quantum computer that people can actually use. D-Wave, a company based in Vancouver, Canada, has already created what it is calling a 2,000-qubit system, although many researchers don't consider the D-Wave systems to be true quantum computers. Intel, too, has skin in the game. In February 2018 the company announced that it had found a way of fabricating quantum chips from silicon, which would make it much easier to produce chips using existing manufacturing methods.

Quantum computers operate on completely different principles to existing computers, which makes them really well suited to solving particular mathematical problems, like factoring very large numbers into primes. Since prime factorization is so important in cryptography, it's likely that quantum computers would quickly be able to crack many of the systems that keep our online information secure. Because of these risks, researchers are already trying to develop technology that is resistant to quantum hacking; on the flipside, it's possible that quantum-based cryptographic systems would be much more secure than their conventional analogues.

Researchers are also excited about the prospect of using quantum computers to model complicated chemical reactions, a task at which conventional supercomputers aren't very good. In July 2016, Google engineers used a quantum device to simulate a hydrogen molecule for the first time, and since then IBM has managed to model the behaviour of even more complex molecules. Eventually, researchers hope they'll be able to use quantum simulations to design entirely new molecules for use in medicine. But the holy grail for quantum chemists is to be able to model the Haber-Bosch process, a way of artificially producing ammonia that is still relatively inefficient. Researchers are hoping that if they can use quantum mechanics to work out what's going on inside that reaction, they could discover new ways to make the process much more efficient.

Quantum Computing | D-Wave Systems

Quantum Computation

Rather than store information using bits represented by 0s or 1s as conventional digital computers do, quantum computers use quantum bits, or qubits, to encode information as 0s, 1s, or both at the same time. This superposition of states, along with the other quantum mechanical phenomena of entanglement and tunneling, enables quantum computers to manipulate enormous combinations of states at once.

In nature, physical systems tend to evolve toward their lowest energy state: objects slide down hills, hot things cool down, and so on. This behavior also applies to quantum systems. To imagine this, think of a traveler looking for the best solution by finding the lowest valley in the energy landscape that represents the problem.

Classical algorithms seek the lowest valley by placing the traveler at some point in the landscape and allowing that traveler to move based on local variations. While it is generally most efficient to move downhill and avoid climbing hills that are too high, such classical algorithms are prone to leading the traveler into nearby valleys that may not be the global minimum. Numerous trials are typically required, with many travelers beginning their journeys from different points.

In contrast, quantum annealing begins with the traveler simultaneously occupying many coordinates thanks to the quantum phenomenon of superposition. The probability of being at any given coordinate smoothly evolves as annealing progresses, with the probability increasing around the coordinates of deep valleys. Quantum tunneling allows the traveler to pass through hills, rather than be forced to climb them, reducing the chance of becoming trapped in valleys that are not the global minimum. Quantum entanglement further improves the outcome by allowing the traveler to discover correlations between the coordinates that lead to deep valleys.
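As a rough classical analogue of the contrast above (a sketch of my own, not D-Wave's algorithm), simulated annealing replaces quantum tunnelling with random jumps that shrink over time; even this classical stand-in can escape local valleys that trap a purely greedy downhill search:

```python
import math
import random

random.seed(1)

def landscape(x):
    # Energy landscape with several valleys; the deepest lies near x = 3.7.
    return math.sin(3 * x) + 0.1 * (x - 4) ** 2

def greedy(x, step=0.01, iters=5000):
    # Classical traveler: only ever moves downhill, so it stops in the
    # nearest valley, which may not be the global minimum.
    for _ in range(iters):
        for nxt in (x - step, x + step):
            if landscape(nxt) < landscape(x):
                x = nxt
    return x

def anneal(x, iters=20000):
    # Annealing traveler: random jumps whose size ("temperature") shrinks
    # over time; uphill moves are sometimes accepted, mimicking tunnelling.
    for i in range(iters):
        t = 2.0 * (1 - i / iters) + 1e-3
        nxt = x + random.gauss(0, t)
        delta = landscape(nxt) - landscape(x)
        if delta < 0 or random.random() < math.exp(-delta / t):
            x = nxt
    return x

print(greedy(0.0))  # stuck in a nearby shallow valley
print(anneal(0.0))  # usually ends near the deepest valley, around 3.7
```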

The D-Wave system has a web API with client libraries available for C/C++, Python, and MATLAB. This allows users to access the computer easily as a cloud resource over a network.

To program the system, a user maps a problem into a search for the lowest point in a vast landscape, corresponding to the best possible outcome. The quantum processing unit considers all the possibilities simultaneously to determine the lowest energy required to form those relationships. The solutions are values that correspond to the optimal configurations of qubits found, or the lowest points in the energy landscape. These values are returned to the user program over the network.

Because a quantum computer is probabilistic rather than deterministic, the computer returns many very good answers in a short amount of time: thousands of samples in one second. This provides not only the best solution found but also other very good alternatives from which to choose.
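A minimal sketch of this programming model (my own illustration, not D-Wave's client code): the problem is expressed as a QUBO, minimizing a quadratic energy over binary variables, and the machine hands back many low-energy configurations. Here brute-force enumeration stands in for the quantum processing unit; on real hardware the same matrix would be submitted through the web API mentioned above.

```python
from itertools import product

# Hypothetical toy QUBO: favour x0 and x1 individually, penalise choosing both.
Q = {(0, 0): -1.0, (1, 1): -1.0, (0, 1): 2.0}

def energy(x):
    # QUBO energy of a binary configuration x.
    return sum(c * x[i] * x[j] for (i, j), c in Q.items())

# Rank every configuration by energy, lowest (best) first -- a stand-in
# for the sorted samples a D-Wave system returns.
for x in sorted(product([0, 1], repeat=2), key=energy):
    print(x, energy(x))
```

The two tied configurations, (0, 1) and (1, 0), come back first at energy -1.0, which mirrors the point above: the user gets the best answers found plus the next-best alternatives.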

D-Wave systems are intended to be used to complement classical computers. There are many examples of problems where a quantum computer can complement an HPC (high-performance computing) system. While the quantum computer is well suited to discrete optimization, for example, the HPC system is better at large-scale numerical simulations.


D-Wave's flagship product, the 2,000-qubit D-Wave 2000Q quantum computer, is described by the company as the most advanced quantum computer in the world. It is based on a novel type of superconducting processor that uses quantum mechanics to massively accelerate computation, and it is best suited to tackling complex optimization problems that arise across many domains.


Read more:

Quantum Computing | D-Wave Systems

Quantum computing: A simple introduction – Explain that Stuff

by Chris Woodford. Last updated: March 9, 2018.

How can you get more and more out of less and less? The smaller computers get, the more powerful they seem to become: there's more number-crunching ability in a 21st-century cellphone than you'd have found in a room-sized, military computer 50 years ago. Yet, despite such amazing advances, there are still plenty of complex problems that are beyond the reach of even the world's most powerful computers, and there's no guarantee we'll ever be able to tackle them. One problem is that the basic switching and memory units of computers, known as transistors, are now approaching the point where they'll soon be as small as individual atoms. If we want computers that are smaller and more powerful than today's, we'll soon need to do our computing in a radically different way. Entering the realm of atoms opens up powerful new possibilities in the shape of quantum computing, with processors that could work millions of times faster than the ones we use today. Sounds amazing, but the trouble is that quantum computing is hugely more complex than traditional computing and operates in the Alice in Wonderland world of quantum physics, where the "classical," sensible, everyday laws of physics no longer apply. What is quantum computing and how does it work? Let's take a closer look!

Photo: Quantum computing means storing and processing information using individual atoms, ions, electrons, or photons. On the plus side, this opens up the possibility of faster computers, but the drawback is the greater complexity of designing computers that can operate in the weird world of quantum physics.

You probably think of a computer as a neat little gadget that sits on your lap and lets you send emails, shop online, chat to your friends, or play games, but it's much more (and much less) than that. It's more, because it's a completely general-purpose machine: you can make it do virtually anything you like. It's less, because inside it's little more than an extremely basic calculator, following a prearranged set of instructions called a program. Like the Wizard of Oz, the amazing things you see in front of you conceal some pretty mundane stuff under the covers.

Photo: This is what one transistor from a typical radio circuit board looks like. In computers, the transistors are much smaller than this and millions of them are packaged together onto microchips.

Conventional computers have two tricks that they do really well: they can store numbers in memory and they can process stored numbers with simple mathematical operations (like add and subtract). They can do more complex things by stringing together the simple operations into a series called an algorithm (multiplying can be done as a series of additions, for example). Both of a computer's key tricks, storage and processing, are accomplished using switches called transistors, which are like microscopic versions of the switches you have on your wall for turning on and off the lights. A transistor can either be on or off, just as a light can either be lit or unlit. If it's on, we can use a transistor to store a number one (1); if it's off, it stores a number zero (0). Long strings of ones and zeros can be used to store any number, letter, or symbol using a code based on binary (so computers store an upper-case letter A as 01000001 and a lower-case one as 01100001). Each of the zeros or ones is called a binary digit (or bit) and, with a string of eight bits, you can store 256 different characters (such as A-Z, a-z, 0-9, and most common symbols). Computers calculate by using circuits called logic gates, which are made from a number of transistors connected together. Logic gates compare patterns of bits, stored in temporary memories called registers, and then turn them into new patterns of bits, and that's the computer equivalent of what our human brains would call addition, subtraction, or multiplication. In physical terms, the algorithm that performs a particular calculation takes the form of an electronic circuit made from a number of logic gates, with the output from one gate feeding in as the input to the next.
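Both tricks fit in a few lines of Python (an illustration of mine, not from the article): characters are stored as binary patterns, and addition can be built out of exactly the logic gates described above.

```python
# Storing letters as bits: 'A' and 'a' as 8-bit binary codes.
print(format(ord('A'), '08b'))  # 01000001
print(format(ord('a'), '08b'))  # 01100001

# A half-adder built from two logic gates: XOR produces the sum bit,
# AND produces the carry bit; chaining such adders gives full addition.
def half_adder(a, b):
    return a ^ b, a & b  # (sum, carry)

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(a, "+", b, "-> sum", s, "carry", c)
```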

The trouble with conventional computers is that they depend on conventional transistors. This might not sound like a problem if you go by the amazing progress made in electronics over the last few decades. When the transistor was invented, back in 1947, the switch it replaced (which was called the vacuum tube) was about as big as one of your thumbs. Now, a state-of-the-art microprocessor (single-chip computer) packs hundreds of millions (and up to two billion) transistors onto a chip of silicon the size of your fingernail! Chips like these, which are called integrated circuits, are an incredible feat of miniaturization. Back in the 1960s, Intel co-founder Gordon Moore realized that the power of computers doubles roughly every 18 months, and it's been doing so ever since. This apparently unshakeable trend is known as Moore's Law.

Photo: This memory chip from a typical USB stick contains an integrated circuit that can store 512 megabytes of data. That's roughly 500 million characters (536,870,912 to be exact), each of which needs eight binary digits, so we're talking about 4 billion (4,000 million) transistors in all (4,294,967,296 if you're being picky) packed into an area the size of a postage stamp!

It sounds amazing, and it is, but it misses the point. The more information you need to store, the more binary ones and zeros (and transistors) you need to do it. Since most conventional computers can only do one thing at a time, the more complex the problem you want them to solve, the more steps they'll need to take and the longer they'll need to do it. Some computing problems are so complex that they need more computing power and time than any modern machine could reasonably supply; computer scientists call those intractable problems.

As Moore’s Law advances, so the number of intractable problemsdiminishes: computers get more powerful and we can do more with them.The trouble is, transistors are just about as small as we can makethem: we’re getting to the point where the laws of physics seem likelyto put a stop to Moore’s Law. Unfortunately, there are still hugelydifficult computing problems we can’t tackle because even the mostpowerful computers find them intractable. That’s one of the reasonswhy people are now getting interested in quantum computing.

Things on a very small scale behave like nothing you have any direct experience about… or like anything that you have ever seen.

Richard Feynman

Quantum theory is the branch of physics that deals with the world of atoms and the smaller (subatomic) particles inside them. You might think atoms behave the same way as everything else in the world, in their own tiny little way, but that's not true: on the atomic scale, the rules change and the "classical" laws of physics we take for granted in our everyday world no longer automatically apply. As Richard P. Feynman, one of the greatest physicists of the 20th century, once put it: "Things on a very small scale behave like nothing you have any direct experience about… or like anything that you have ever seen." (Six Easy Pieces, p116.)

If you’ve studied light, you may already know a bit about quantumtheory. You might know that a beam of light sometimes behaves asthough it’s made up of particles (like a steady stream ofcannonballs), and sometimes as though it’s waves of energy ripplingthrough space (a bit like waves on the sea). That’s called wave-particle dualityand it’s one of the ideas that comes to us from quantum theory. It’s hard to grasp thatsomething can be two things at oncea particle and awavebecause it’s totally alien to our everyday experience: a car isnot simultaneously a bicycle and a bus. In quantum theory, however,that’s just the kind of crazy thing that can happen. The most striking example of this is the baffling riddle known as Schrdinger’s cat. Briefly, in the weird world ofquantum theory, we can imagine a situation where something like a catcould be alive and dead at the same time!

What does all this have to do with computers? Suppose we keep on pushing Moore's Law, making transistors smaller and smaller until they get to the point where they obey not the ordinary laws of physics (like old-style transistors) but the more bizarre laws of quantum mechanics. The question is whether computers designed this way can do things our conventional computers can't. If we can predict mathematically that they might be able to, can we actually make them work like that in practice?

People have been asking those questions for several decades. Among the first were IBM research physicists Rolf Landauer and Charles H. Bennett. Landauer opened the door for quantum computing in the 1960s when he proposed that information is a physical entity that could be manipulated according to the laws of physics. One important consequence of this is that computers waste energy manipulating the bits inside them (which is partly why computers use so much energy and get so hot, even though they appear to be doing not very much at all). In the 1970s, building on Landauer's work, Bennett showed how a computer could circumvent this problem by working in a "reversible" way, implying that a quantum computer could carry out massively complex computations without using massive amounts of energy. In 1981, physicist Paul Benioff from Argonne National Laboratory tried to envisage a basic machine that would work in a similar way to an ordinary computer but according to the principles of quantum physics. The following year, Richard Feynman sketched out roughly how a machine using quantum principles could carry out basic computations. A few years later, Oxford University's David Deutsch (one of the leading lights in quantum computing) outlined the theoretical basis of a quantum computer in more detail. How did these great scientists imagine that quantum computers might work?

The key features of an ordinary computer (bits, registers, logic gates, algorithms, and so on) have analogous features in a quantum computer. Instead of bits, a quantum computer has quantum bits or qubits, which work in a particularly intriguing way. Where a bit can store either a zero or a one, a qubit can store a zero, a one, both zero and one, or an infinite number of values in between, and be in multiple states (store multiple values) at the same time! If that sounds confusing, think back to light being a particle and a wave at the same time, Schrödinger's cat being alive and dead, or a car being a bicycle and a bus. A gentler way to think of the numbers qubits store is through the physics concept of superposition (where two waves add to make a third one that contains both of the originals). If you blow on something like a flute, the pipe fills up with a standing wave: a wave made up of a fundamental frequency (the basic note you're playing) and lots of overtones or harmonics (higher-frequency multiples of the fundamental). The wave inside the pipe contains all these waves simultaneously: they're added together to make a combined wave that includes them all. Qubits use superposition to represent multiple states (multiple numeric values) simultaneously in a similar way.

Just as a quantum computer can store multiple numbers at once, so it can process them simultaneously. Instead of working in serial (doing a series of things one at a time in a sequence), it can work in parallel (doing multiple things at the same time). Only when you try to find out what state it's actually in at any given moment (by measuring it, in other words) does it "collapse" into one of its possible states, and that gives you the answer to your problem. Estimates suggest a quantum computer's ability to work in parallel would make it millions of times faster than any conventional computer… if only we could build it! So how would we do that?
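In the standard mathematical picture, the superposition and measurement collapse described in the last two paragraphs amount to a two-component complex vector and a weighted random draw. Here is a minimal sketch of my own, assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)

# An equal superposition of 0 and 1: amplitudes 1/sqrt(2) each, so the
# squared magnitudes (the measurement probabilities) are 0.5 and 0.5.
state = np.array([1, 1], dtype=complex) / np.sqrt(2)
probs = np.abs(state) ** 2

# "Measuring" collapses the state: each shot yields a definite 0 or 1.
samples = rng.choice([0, 1], size=1000, p=probs)
print(probs)                 # [0.5 0.5]
print(np.bincount(samples))  # roughly 500 of each
```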

In reality, qubits would have to be stored by atoms, ions (atoms with too many or too few electrons), or even smaller things such as electrons and photons (energy packets), so a quantum computer would be almost like a table-top version of the kind of particle physics experiments they do at Fermilab or CERN. Now you wouldn't be racing particles round giant loops and smashing them together, but you would need mechanisms for containing atoms, ions, or subatomic particles, for putting them into certain states (so you can store information), knocking them into other states (so you can make them process information), and figuring out what their states are after particular operations have been performed.

Photo: A single atom can be trapped in an optical cavity (the space between mirrors) and controlled by precise pulses from laser beams.

In practice, there are lots of possible ways of containing atoms and changing their states using laser beams, electromagnetic fields, radio waves, and an assortment of other techniques. One method is to make qubits using quantum dots, which are nanoscopically tiny particles of semiconductors inside which individual charge carriers, electrons and holes (missing electrons), can be controlled. Another method makes qubits from what are called ion traps: you add or take away electrons from an atom to make an ion, hold it steady in a kind of laser spotlight (so it's locked in place like a nanoscopic rabbit dancing in a very bright headlight), and then flip it into different states with laser pulses. In another technique, the qubits are photons inside optical cavities (spaces between extremely tiny mirrors). Don't worry if you don't understand; not many people do. Since the entire field of quantum computing is still largely abstract and theoretical, the only thing we really need to know is that qubits are stored by atoms or other quantum-scale particles that can exist in different states and be switched between them.

Although people often assume that quantum computers must automatically be better than conventional ones, that's by no means certain. So far, just about the only thing we know for certain that a quantum computer could do better than a normal one is factorisation: finding two unknown prime numbers that, when multiplied together, give a third, known number. In 1994, while working at Bell Laboratories, mathematician Peter Shor demonstrated an algorithm that a quantum computer could follow to find the "prime factors" of a large number, which would speed up the problem enormously. Shor's algorithm really excited interest in quantum computing because virtually every modern computer (and every secure, online shopping and banking website) uses public-key encryption technology based on the virtual impossibility of finding prime factors quickly (it is, in other words, essentially an "intractable" computer problem). If quantum computers could indeed factor large numbers quickly, today's online security could be rendered obsolete at a stroke. But what goes around comes around, and some researchers believe quantum technology will lead to much stronger forms of encryption. (In 2017, Chinese researchers demonstrated for the first time how quantum encryption could be used to make a very secure video call from Beijing to Vienna.)
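The asymmetry behind Shor's result can be felt classically (a toy of mine, not Shor's algorithm): multiplying two primes is instant, while recovering them by trial division takes on the order of the square root of their product in steps, which becomes hopeless at cryptographic sizes.

```python
def factor(n):
    # Trial division: try every candidate up to sqrt(n). This is the
    # "intractable" direction for numbers hundreds of digits long.
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1  # n itself is prime

p, q = 104729, 1299709  # two primes, chosen for illustration
n = p * q               # multiplying: easy and instant
print(factor(n))        # factoring: (104729, 1299709), after ~100,000 steps
```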

Does that mean quantum computers are better than conventional ones? Not exactly. Apart from Shor's algorithm, and a search method called Grover's algorithm, hardly any other algorithms have been discovered that would be better performed by quantum methods. Given enough time and computing power, conventional computers should still be able to solve any problem that quantum computers could solve, eventually. In other words, it remains to be proven that quantum computers are generally superior to conventional ones, especially given the difficulties of actually building them. Who knows how conventional computers might advance in the next 50 years, potentially making the idea of quantum computers irrelevant, and even absurd.

Photo: Quantum dots are probably best known as colorful nanoscale crystals, but they can also be used as qubits in quantum computers. Photo courtesy of Argonne National Laboratory.

Three decades after they were first proposed, quantum computers remain largely theoretical. Even so, there's been some encouraging progress toward realizing a quantum machine. There were two impressive breakthroughs in 2000. First, Isaac Chuang (now an MIT professor, but then working at IBM's Almaden Research Center) used five fluorine atoms to make a crude, five-qubit quantum computer. The same year, researchers at Los Alamos National Laboratory figured out how to make a seven-qubit machine using a drop of liquid. Five years later, researchers at the University of Innsbruck added an extra qubit and produced the first quantum computer that could manipulate a qubyte (eight qubits).

These were tentative but important first steps. Over the next few years, researchers announced more ambitious experiments, adding progressively greater numbers of qubits. By 2011, a pioneering Canadian company called D-Wave Systems announced in Nature that it had produced a 128-qubit machine; the announcement proved highly controversial, and there was a lot of debate over whether the company's machines had really demonstrated quantum behavior. Three years later, Google announced that it was hiring a team of academics (including University of California at Santa Barbara physicist John Martinis) to develop its own quantum computers based on D-Wave's approach. In March 2015, the Google team announced they were "a step closer to quantum computation," having developed a new way for qubits to detect and protect against errors. In 2016, MIT's Isaac Chuang and scientists from the University of Innsbruck unveiled a five-qubit, ion-trap quantum computer that could calculate the factors of 15; one day, a scaled-up version of this machine might evolve into the long-promised, fully fledged encryption buster.

There’s no doubt that these are hugely important advances.and the signs are growing steadily more encouraging that quantumtechnology will eventually deliver a computing revolution.In December 2017, Microsoft unveiled a completequantum development kit, including a new computer language, Q#, developed specifically forquantum applications. In early 2018,D-wave announced plans to start rolling out quantum power to acloud computing platform.A few weeks later, Google announced Bristlecone, a quantum processorbased on a 72-qubit array, that might, one day, form the cornerstone of a quantum computer that could tackle real-world problems.All very exciting! Even so, it’s early days for the whole field, and mostresearchers agree that we’re unlikely to see practical quantumcomputers appearing for some yearsand more likely several decades.

Read the original here:

Quantum computing: A simple introduction – Explain that Stuff


Although people often assume that quantum computers must automatically bebetter than conventional ones, that’s by no means certain. So far,just about the only thing we know for certain that a quantum computer could do better than anormal one is factorisation: finding two unknown prime numbers that,when multiplied together, give a third, known number. In 1994,while working at Bell Laboratories, mathematician Peter Shor demonstrated an algorithm that a quantum computercould follow to find the “prime factors” of a large number, whichwould speed up the problem enormously. Shor’s algorithm reallyexcited interest in quantum computing because virtually every moderncomputer (and every secure, online shopping and banking website) usespublic-key encryption technology based on the virtual impossibility of finding prime factors quickly (it is, in other words, essentiallyan “intractable” computer problem). If quantum computers couldindeed factor large numbers quickly, today’s online security could berendered obsolete at a stroke. But what goes around comes around,and some researchers believe quantum technology will lead tomuch stronger forms of encryption.(In 2017, Chinese researchers demonstrated for the first timehow quantum encryption could be used to make a very secure video callfrom Beijing to Vienna.)

Does that mean quantum computers are better than conventional ones? Notexactly. Apart from Shor’s algorithm, and a search method called Grover’s algorithm, hardly any other algorithms have been discovered that wouldbe better performed by quantum methods. Given enough time andcomputing power, conventional computers should still be able to solveany problem that quantum computers could solve, eventually. Inother words, it remains to be proven that quantum computers aregenerally superior to conventional ones, especially given the difficulties ofactually building them. Who knows how conventional computers might advancein the next 50 years, potentially making the idea of quantum computers irrelevantand even absurd.

Photo: Quantum dots are probably best known as colorful nanoscale crystals, but they can also be used as qubits in quantum computers). Photo courtesy of Argonne National Laboratory.

Three decades after they were first proposed, quantum computers remainlargely theoretical. Even so, there’s been some encouraging progresstoward realizing a quantum machine. There were two impressivebreakthroughs in 2000. First, Isaac Chuang (now an MIT professor, but then working at IBM’sAlmaden Research Center) used five fluorine atoms to make a crude,five-qubit quantum computer. The same year, researchers at LosAlamos National Laboratory figured out how to make a seven-qubitmachine using a drop of liquid. Five years later, researchers at theUniversity of Innsbruck added an extra qubit and produced the firstquantum computer that could manipulate a qubyte (eight qubits).

These were tentative but important first steps.Over the next few years, researchers announced more ambitious experiments, addingprogressively greater numbers of qubits. By 2011, a pioneering Canadiancompany called D-Wave Systems announced in Nature that it had produced a 128-qubitmachine; the announcement proved highly controversialand there was a lot of debate over whether the company’s machines had really demonstrated quantum behavior.Three years later, Google announced that it was hiring a team of academics (including University of Californiaat Santa Barbara physicist John Martinis) to develop its own quantum computers based on D-Wave’s approach.In March 2015, the Google team announced they were “a step closer to quantum computation,” having developeda new way for qubits to detect and protect against errors.In 2016, MIT’s Isaac Chuang and scientists from the University of Innsbruckunveiled a five-qubit, ion-trap quantum computer that could calculate the factors of 15; one day, a scaled-up version of this machine mightevolve into the long-promised, fully fledged encryption buster.

There’s no doubt that these are hugely important advances.and the signs are growing steadily more encouraging that quantumtechnology will eventually deliver a computing revolution.In December 2017, Microsoft unveiled a completequantum development kit, including a new computer language, Q#, developed specifically forquantum applications. In early 2018,D-wave announced plans to start rolling out quantum power to acloud computing platform.A few weeks later, Google announced Bristlecone, a quantum processorbased on a 72-qubit array, that might, one day, form the cornerstone of a quantum computer that could tackle real-world problems.All very exciting! Even so, it’s early days for the whole field, and mostresearchers agree that we’re unlikely to see practical quantumcomputers appearing for some yearsand more likely several decades.

Read more:

Quantum computing: A simple introduction – Explain that Stuff

What are quantum computers and how do they work? WIRED …

Google, IBM and a handful of startups are racing to create the next generation of supercomputers. Quantum computers, if they ever get off the ground, will help us solve problems, like modelling complex chemical processes, that our existing computers can't even scratch the surface of.

But the quantum future isn’t going to come easily, and there’s no knowing what it’ll look like when it does arrive. At the moment, companies and researchers are using a handful of different approaches to try and build the most powerful computers the world has ever seen. Here’s everything you need to know about the coming quantum revolution.

Quantum computing takes advantage of the strange ability of subatomic particles to exist in more than one state at any time. Because of the way the tiniest of particles behave, operations can be carried out much more quickly, and with less energy, than on classical computers.

In classical computing, a bit is a single piece of information that can exist in two states: 1 or 0. Quantum computing uses quantum bits, or 'qubits', instead. These are quantum systems with two states. However, unlike a usual bit, they can store much more information than just 1 or 0, because they can exist in any superposition of these values.

“The difference between classical bits and qubits is that we can also prepare qubits in a quantum superposition of 0 and 1 and create nontrivial correlated states of a number of qubits, so-called ‘entangled states’,” says Alexey Fedorov, a physicist at the Moscow Institute of Physics and Technology.

A qubit can be thought of like an imaginary sphere. Whereas a classical bit can be in only two states, at either of the two poles of the sphere, a qubit can be at any point on the sphere. This means a computer using these bits can store far more information using less energy than a classical computer.
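To make the sphere picture concrete, here is a rough sketch in Python (the function name and angle conventions are ours, purely for illustration): any point on the sphere is described by two angles, and those angles fix how much of the qubit is "0" and how much is "1".

import numpy as np

def qubit_state(theta, phi):
    # theta is the polar angle: 0 points at the "0" pole, pi at the "1" pole.
    # phi is the angle around the sphere's axis.
    alpha = np.cos(theta / 2)                    # how much "0" the qubit holds
    beta = np.exp(1j * phi) * np.sin(theta / 2)  # how much "1" the qubit holds
    return np.array([alpha, beta])

print(qubit_state(0, 0))          # the pole that behaves like a classical 0
print(qubit_state(np.pi, 0))      # the pole that behaves like a classical 1
print(qubit_state(np.pi / 2, 0))  # halfway between: equal parts 0 and 1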

Until recently, it seemed like Google was leading the pack when it came to creating a quantum computer that could surpass the abilities of conventional computers. In a Nature article published in March 2017, the search giant set out ambitious plans to commercialise quantum technology in the next five years. Shortly after that, Google said it intended to achieve something it's calling "quantum supremacy" with a 49-qubit computer by the end of 2017.

Now, quantum supremacy, which roughly refers to the point where a quantum computer can crunch sums that a conventional computer couldn't hope to simulate, isn't exactly a widely accepted term within the quantum community. Those sceptical of Google's quantum project, or at least the way it talks about quantum computing, argue that supremacy is essentially an arbitrary goal set by Google to make it look like it's making strides in quantum when really it's just meeting self-imposed targets.

Whether it's an arbitrary goal or not, Google was pipped to the supremacy post by IBM in November 2017, when the company announced it had built a 50-qubit quantum computer. Even that, however, was far from stable, as the system could only hold its quantum microstate for 90 microseconds, a record, but far from the times needed to make quantum computing practically viable. Just because IBM has built a 50-qubit system, however, doesn't necessarily mean they've cracked supremacy and definitely doesn't mean that they've created a quantum computer that is anywhere near ready for practical use.

Where IBM has gone further than Google, however, is in making quantum computers commercially available. Since 2016, it has offered researchers the chance to run experiments on a five-qubit quantum computer via the cloud, and at the end of 2017 it started making its 20-qubit system available online too.

But quantum computing is by no means a two-horse race. Californian startup Rigetti is focusing on the stability of its own systems rather than just the number of qubits, and it could be the first to build a quantum computer that people can actually use. D-Wave, a company based in Vancouver, Canada, has already created what it is calling a 2,000-qubit system, although many researchers don't consider the D-Wave systems to be true quantum computers. Intel, too, has skin in the game. In February 2018 the company announced that it had found a way of fabricating quantum chips from silicon, which would make it much easier to produce chips using existing manufacturing methods.

Quantum computers operate on completely different principles to existing computers, which makes them really well suited to solving particular mathematical problems, like finding the prime factors of very large numbers. Since prime numbers are so important in cryptography, it's likely that quantum computers would quickly be able to crack many of the systems that keep our online information secure. Because of these risks, researchers are already trying to develop technology that is resistant to quantum hacking, and on the flipside of that, it's possible that quantum-based cryptographic systems would be much more secure than their conventional analogues.
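A toy example makes the cryptographic point. The sketch below (ours, purely illustrative) factors a number by brute force; the loop grows so quickly with the size of the number that the same approach is hopeless for the hundreds-of-digits numbers used in real encryption, and that gap is exactly what a quantum computer might close.

def smallest_prime_factor(n):
    # Try every candidate divisor up to the square root of n.
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f
        f += 1
    return n  # no divisor found: n is itself prime

print(smallest_prime_factor(2021))     # 43, since 2021 = 43 x 47
print(smallest_prime_factor(1000003))  # quick for small numbers...
# ...but a 600-digit n would need around 10**300 loop steps: intractable.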

Researchers are also excited about the prospect of using quantum computers to model complicated chemical reactions, a task that conventional supercomputers aren't very good at. In July 2016, Google engineers used a quantum device to simulate a hydrogen molecule for the first time, and since then IBM has managed to model the behaviour of even more complex molecules. Eventually, researchers hope they'll be able to use quantum simulations to design entirely new molecules for use in medicine. But the holy grail for quantum chemists is to be able to model the Haber-Bosch process, a way of artificially producing ammonia that is still relatively inefficient. Researchers are hoping that if they can use quantum mechanics to work out what's going on inside that reaction, they could discover new ways to make the process much more efficient.

Original post:

What are quantum computers and how do they work? WIRED …

Quantum computing: A simple introduction – Explain that Stuff

by Chris Woodford. Last updated: March 9, 2018.

How can you get more and more out of less and less? The smaller computers get, the more powerful they seem to become: there's more number-crunching ability in a 21st-century cellphone than you'd have found in a room-sized, military computer 50 years ago. Yet, despite such amazing advances, there are still plenty of complex problems that are beyond the reach of even the world's most powerful computers, and there's no guarantee we'll ever be able to tackle them. One problem is that the basic switching and memory units of computers, known as transistors, are now approaching the point where they'll soon be as small as individual atoms. If we want computers that are smaller and more powerful than today's, we'll soon need to do our computing in a radically different way. Entering the realm of atoms opens up powerful new possibilities in the shape of quantum computing, with processors that could work millions of times faster than the ones we use today. Sounds amazing, but the trouble is that quantum computing is hugely more complex than traditional computing and operates in the Alice in Wonderland world of quantum physics, where the "classical," sensible, everyday laws of physics no longer apply. What is quantum computing and how does it work? Let's take a closer look!

Photo: Quantum computing means storing and processing information using individual atoms, ions, electrons, or photons. On the plus side, this opens up the possibility of faster computers, but the drawback is the greater complexity of designing computers that can operate in the weird world of quantum physics.

You probably think of a computer as a neat little gadget that sits on your lap and lets you send emails, shop online, chat to your friends, or play games, but it's much more, and much less, than that. It's more, because it's a completely general-purpose machine: you can make it do virtually anything you like. It's less, because inside it's little more than an extremely basic calculator, following a prearranged set of instructions called a program. Like the Wizard of Oz, the amazing things you see in front of you conceal some pretty mundane stuff under the covers.

Photo: This is what one transistor from a typical radio circuit board looks like. In computers, the transistors are much smaller than this and millions of them are packaged together onto microchips.

Conventional computers have two tricks that they do really well: they can store numbers in memory and they can process stored numbers with simple mathematical operations (like add and subtract). They can do more complex things by stringing together the simple operations into a series called an algorithm (multiplying can be done as a series of additions, for example). Both of a computer's key tricks, storage and processing, are accomplished using switches called transistors, which are like microscopic versions of the switches you have on your wall for turning the lights on and off. A transistor can either be on or off, just as a light can either be lit or unlit. If it's on, we can use a transistor to store a number one (1); if it's off, it stores a number zero (0). Long strings of ones and zeros can be used to store any number, letter, or symbol using a code based on binary (so computers store an upper-case letter A as 01000001 and a lower-case one as 01100001). Each of the zeros or ones is called a binary digit (or bit) and, with a string of eight bits, you can represent 256 different characters (such as A-Z, a-z, 0-9, and most common symbols). Computers calculate by using circuits called logic gates, which are made from a number of transistors connected together. Logic gates compare patterns of bits, stored in temporary memories called registers, and then turn them into new patterns of bits, and that's the computer equivalent of what our human brains would call addition, subtraction, or multiplication. In physical terms, the algorithm that performs a particular calculation takes the form of an electronic circuit made from a number of logic gates, with the output from one gate feeding in as the input to the next.
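You can check both tricks for yourself. This little Python sketch (ours, for illustration) prints the binary codes mentioned above and builds a one-bit adder out of nothing but two logic gates:

# Characters really are stored as bit patterns:
print(format(ord('A'), '08b'))   # 01000001, upper-case A
print(format(ord('a'), '08b'))   # 01100001, lower-case a

# Two logic gates are enough to add a pair of one-bit numbers
# (a "half adder"): XOR produces the sum bit, AND the carry bit.
def half_adder(a, b):
    return a & b, a ^ b   # (carry, sum)

print(half_adder(1, 1))  # (1, 0): in binary, 1 + 1 = 10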

The trouble with conventional computers is that they depend on conventional transistors. This might not sound like a problem if you go by the amazing progress made in electronics over the last few decades. When the transistor was invented, back in 1947, the switch it replaced (which was called the vacuum tube) was about as big as one of your thumbs. Now, a state-of-the-art microprocessor (single-chip computer) packs hundreds of millions (and up to two billion) transistors onto a chip of silicon the size of your fingernail! Chips like these, which are called integrated circuits, are an incredible feat of miniaturization. Back in the 1960s, Intel co-founder Gordon Moore realized that the power of computers doubles roughly every 18 months, and it's been doing so ever since. This apparently unshakeable trend is known as Moore's Law.

Photo: This memory chip from a typical USB stick contains an integrated circuit that can store 512 megabytes of data. That's roughly 500 million characters (536,870,912 to be exact), each of which needs eight binary digits, so we're talking about 4 billion (4,000 million) transistors in all (4,294,967,296 if you're being picky) packed into an area the size of a postage stamp!

It sounds amazing, and it is, but it misses the point. The more information you need to store, the more binary ones and zeros (and transistors) you need to do it. Since most conventional computers can only do one thing at a time, the more complex the problem you want them to solve, the more steps they'll need to take and the longer they'll need to do it. Some computing problems are so complex that they need more computing power and time than any modern machine could reasonably supply; computer scientists call those intractable problems.
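A few lines of arithmetic show how quickly a problem can become intractable for a machine that checks one possibility at a time:

# The work of trying every combination of n yes/no choices doubles
# with each extra choice, which is what defeats serial computers.
for n in (10, 20, 40, 80):
    print(n, "choices ->", 2**n, "combinations to check")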

As Moore's Law advances, so the number of intractable problems diminishes: computers get more powerful and we can do more with them. The trouble is, transistors are just about as small as we can make them: we're getting to the point where the laws of physics seem likely to put a stop to Moore's Law. Unfortunately, there are still hugely difficult computing problems we can't tackle because even the most powerful computers find them intractable. That's one of the reasons why people are now getting interested in quantum computing.

Things on a very small scale behave like nothing you have any direct experience about… or like anything that you have ever seen.

Richard Feynman

Quantum theory is the branch of physics that deals with the world of atoms and the smaller (subatomic) particles inside them. You might think atoms behave the same way as everything else in the world, in their own tiny little way, but that's not true: on the atomic scale, the rules change and the "classical" laws of physics we take for granted in our everyday world no longer automatically apply. As Richard P. Feynman, one of the greatest physicists of the 20th century, once put it: "Things on a very small scale behave like nothing you have any direct experience about… or like anything that you have ever seen." (Six Easy Pieces, p116.)

If you've studied light, you may already know a bit about quantum theory. You might know that a beam of light sometimes behaves as though it's made up of particles (like a steady stream of cannonballs), and sometimes as though it's waves of energy rippling through space (a bit like waves on the sea). That's called wave-particle duality, and it's one of the ideas that comes to us from quantum theory. It's hard to grasp that something can be two things at once, a particle and a wave, because it's totally alien to our everyday experience: a car is not simultaneously a bicycle and a bus. In quantum theory, however, that's just the kind of crazy thing that can happen. The most striking example of this is the baffling riddle known as Schrödinger's cat. Briefly, in the weird world of quantum theory, we can imagine a situation where something like a cat could be alive and dead at the same time!

What does all this have to do with computers? Suppose we keep on pushing Moore's Law, making transistors smaller until they get to the point where they obey not the ordinary laws of physics (like old-style transistors) but the more bizarre laws of quantum mechanics. The question is whether computers designed this way can do things our conventional computers can't. If we can predict mathematically that they might be able to, can we actually make them work like that in practice?

People have been asking those questions for several decades. Among the first were IBM research physicists Rolf Landauer and Charles H. Bennett. Landauer opened the door for quantum computing in the 1960s when he proposed that information is a physical entity that can be manipulated according to the laws of physics. One important consequence of this is that computers waste energy manipulating the bits inside them (which is partly why computers use so much energy and get so hot, even though they appear to be doing not very much at all). In the 1970s, building on Landauer's work, Bennett showed how a computer could circumvent this problem by working in a "reversible" way, implying that a quantum computer could carry out massively complex computations without using massive amounts of energy. In 1981, physicist Paul Benioff from Argonne National Laboratory tried to envisage a basic machine that would work in a similar way to an ordinary computer but according to the principles of quantum physics. The following year, Richard Feynman sketched out roughly how a machine using quantum principles could carry out basic computations. A few years later, Oxford University's David Deutsch (one of the leading lights in quantum computing) outlined the theoretical basis of a quantum computer in more detail. How did these great scientists imagine that quantum computers might work?

The key features of an ordinary computer (bits, registers, logic gates, algorithms, and so on) have analogous features in a quantum computer. Instead of bits, a quantum computer has quantum bits or qubits, which work in a particularly intriguing way. Where a bit can store either a zero or a one, a qubit can store a zero, a one, both a zero and a one, or an infinite number of values in between, and it can be in multiple states (store multiple values) at the same time! If that sounds confusing, think back to light being a particle and a wave at the same time, Schrödinger's cat being alive and dead, or a car being a bicycle and a bus. A gentler way to think of the numbers qubits store is through the physics concept of superposition (where two waves add to make a third one that contains both of the originals). If you blow on something like a flute, the pipe fills up with a standing wave: a wave made up of a fundamental frequency (the basic note you're playing) and lots of overtones or harmonics (higher-frequency multiples of the fundamental). The wave inside the pipe contains all these waves simultaneously: they're added together to make a combined wave that includes them all. Qubits use superposition to represent multiple states (multiple numeric values) simultaneously in a similar way.
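In the mathematics of quantum theory, that "combined wave" is a list of amplitudes, one for each value the register could hold. Here is a minimal sketch (our own notation, not any particular quantum library):

import numpy as np

# A two-qubit register holds four amplitudes at once, one for each of
# the bit patterns 00, 01, 10, and 11, like overtones sharing one pipe.
state = np.array([0.5, 0.5, 0.5, 0.5])   # an equal superposition

# Squared amplitudes are probabilities, so they must add up to 1.
assert np.isclose(np.sum(np.abs(state) ** 2), 1.0)

# n qubits give 2**n amplitudes: every n-bit value is present at once.
print(len(state), "values stored by just 2 qubits")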

Just as a quantum computer can store multiple numbers at once, so it can process them simultaneously. Instead of working in serial (doing a series of things one at a time in a sequence), it can work in parallel (doing multiple things at the same time). Only when you try to find out what state it's actually in at any given moment (by measuring it, in other words) does it "collapse" into one of its possible states, and that gives you the answer to your problem. Estimates suggest a quantum computer's ability to work in parallel would make it millions of times faster than any conventional computer… if only we could build it! So how would we do that?
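Measurement, and the "collapse" it causes, can be mimicked classically by sampling. In this sketch (again ours, purely illustrative), each possible answer comes up with probability equal to its squared amplitude, and only one answer survives each run:

import numpy as np

rng = np.random.default_rng()

def measure(state):
    # Pick one basis state with probability |amplitude|**2; the
    # superposition is gone, and this single bit pattern is all you read.
    probs = np.abs(state) ** 2
    outcome = rng.choice(len(state), p=probs)
    return format(int(outcome), '02b')

state = np.array([0.5, 0.5, 0.5, 0.5])     # equal superposition of 00..11
print([measure(state) for _ in range(8)])  # e.g. ['10', '00', '11', ...]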

In reality, qubits would have to be stored by atoms, ions (atoms with too many or too few electrons), or even smaller things such as electrons and photons (energy packets), so a quantum computer would be almost like a table-top version of the kind of particle physics experiments they do at Fermilab or CERN. Now you wouldn't be racing particles round giant loops and smashing them together, but you would need mechanisms for containing atoms, ions, or subatomic particles, for putting them into certain states (so you can store information), knocking them into other states (so you can make them process information), and figuring out what their states are after particular operations have been performed.

Photo: A single atom can be trapped in an optical cavity (the space between mirrors) and controlled by precise pulses from laser beams.

In practice, there are lots of possible ways of containing atoms and changing their states using laser beams, electromagnetic fields, radio waves, and an assortment of other techniques. One method is to make qubits using quantum dots, which are nanoscopically tiny particles of semiconductors inside which individual charge carriers, electrons and holes (missing electrons), can be controlled. Another method makes qubits from what are called ion traps: you add or take away electrons from an atom to make an ion, hold it steady in a kind of laser spotlight (so it's locked in place like a nanoscopic rabbit dancing in a very bright headlight), and then flip it into different states with laser pulses. In another technique, the qubits are photons inside optical cavities (spaces between extremely tiny mirrors). Don't worry if you don't understand; not many people do. Since the entire field of quantum computing is still largely abstract and theoretical, the only thing we really need to know is that qubits are stored by atoms or other quantum-scale particles that can exist in different states and be switched between them.

Although people often assume that quantum computers must automatically be better than conventional ones, that's by no means certain. So far, just about the only thing we know for certain that a quantum computer could do better than a normal one is factorisation: finding two unknown prime numbers that, when multiplied together, give a third, known number. In 1994, while working at Bell Laboratories, mathematician Peter Shor demonstrated an algorithm that a quantum computer could follow to find the "prime factors" of a large number, which would speed up the problem enormously. Shor's algorithm really excited interest in quantum computing because virtually every modern computer (and every secure, online shopping and banking website) uses public-key encryption technology based on the virtual impossibility of finding prime factors quickly (it is, in other words, essentially an "intractable" computer problem). If quantum computers could indeed factor large numbers quickly, today's online security could be rendered obsolete at a stroke. But what goes around comes around, and some researchers believe quantum technology will lead to much stronger forms of encryption. (In 2017, Chinese researchers demonstrated for the first time how quantum encryption could be used to make a very secure video call from Beijing to Vienna.)
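Shor's insight was that factoring reduces to "order finding", and only the order-finding step needs quantum hardware. The sketch below shows the classical wrapper; the inner loop marked as the quantum step is the part that defeats ordinary computers when n is large (our brute-force order search is a stand-in, workable only for tiny numbers):

from math import gcd
from random import randrange

def shor_skeleton(n):
    while True:
        a = randrange(2, n)
        if gcd(a, n) > 1:
            return gcd(a, n)      # lucky guess: a already shares a factor
        r = 1
        while pow(a, r, n) != 1:  # order finding: the quantum step in Shor's algorithm
            r += 1
        if r % 2 == 0 and pow(a, r // 2, n) != n - 1:
            return gcd(pow(a, r // 2) - 1, n)  # a nontrivial factor of n

print(shor_skeleton(15))  # 3 or 5, the prime factors of 15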

Does that mean quantum computers are better than conventional ones? Not exactly. Apart from Shor's algorithm, and a search method called Grover's algorithm, hardly any other algorithms have been discovered that would be better performed by quantum methods. Given enough time and computing power, conventional computers should still be able to solve any problem that quantum computers could solve, eventually. In other words, it remains to be proven that quantum computers are generally superior to conventional ones, especially given the difficulties of actually building them. Who knows how conventional computers might advance in the next 50 years, potentially making the idea of quantum computers irrelevant, or even absurd.

Photo: Quantum dots are probably best known as colorful nanoscale crystals, but they can also be used as qubits in quantum computers. Photo courtesy of Argonne National Laboratory.

Three decades after they were first proposed, quantum computers remain largely theoretical. Even so, there's been some encouraging progress toward realizing a quantum machine. There were two impressive breakthroughs in 2000. First, Isaac Chuang (now an MIT professor, but then working at IBM's Almaden Research Center) used five fluorine atoms to make a crude, five-qubit quantum computer. The same year, researchers at Los Alamos National Laboratory figured out how to make a seven-qubit machine using a drop of liquid. Five years later, researchers at the University of Innsbruck added an extra qubit and produced the first quantum computer that could manipulate a qubyte (eight qubits).

These were tentative but important first steps. Over the next few years, researchers announced more ambitious experiments, adding progressively greater numbers of qubits. By 2011, a pioneering Canadian company called D-Wave Systems announced in Nature that it had produced a 128-qubit machine; the announcement proved highly controversial, and there was a lot of debate over whether the company's machines had really demonstrated quantum behavior. Three years later, Google announced that it was hiring a team of academics (including University of California at Santa Barbara physicist John Martinis) to develop its own quantum computers based on D-Wave's approach. In March 2015, the Google team announced they were "a step closer to quantum computation," having developed a new way for qubits to detect and protect against errors. In 2016, MIT's Isaac Chuang and scientists from the University of Innsbruck unveiled a five-qubit, ion-trap quantum computer that could calculate the factors of 15; one day, a scaled-up version of this machine might evolve into the long-promised, fully fledged encryption buster.

There's no doubt that these are hugely important advances, and the signs are growing steadily more encouraging that quantum technology will eventually deliver a computing revolution. In December 2017, Microsoft unveiled a complete quantum development kit, including a new computer language, Q#, developed specifically for quantum applications. In early 2018, D-Wave announced plans to start rolling out quantum power to a cloud computing platform. A few weeks later, Google announced Bristlecone, a quantum processor based on a 72-qubit array, that might, one day, form the cornerstone of a quantum computer that could tackle real-world problems. All very exciting! Even so, it's early days for the whole field, and most researchers agree that we're unlikely to see practical quantum computers appearing for some years, and more likely several decades.



Quantum Computing | D-Wave Systems

Quantum Computation

Rather than store information using bits represented by 0s or 1s, as conventional digital computers do, quantum computers use quantum bits, or qubits, to encode information as 0s, 1s, or both at the same time. This superposition of states, along with the other quantum mechanical phenomena of entanglement and tunneling, enables quantum computers to manipulate enormous combinations of states at once.

In nature, physical systems tend to evolve toward their lowest energy state: objects slide down hills, hot things cool down, and so on. This behavior also applies to quantum systems. To imagine this, think of a traveler looking for the best solution by finding the lowest valley in the energy landscape that represents the problem.

Classical algorithms seek the lowest valley by placing the traveler at some point in the landscape and allowing that traveler to move based on local variations. While it is generally most efficient to move downhill and avoid climbing hills that are too high, such classical algorithms are prone to leading the traveler into nearby valleys that may not be the global minimum. Numerous trials are typically required, with many travelers beginning their journeys from different points.

In contrast, quantum annealing begins with the traveler simultaneously occupying many coordinates, thanks to the quantum phenomenon of superposition. The probability of being at any given coordinate smoothly evolves as annealing progresses, with the probability increasing around the coordinates of deep valleys. Quantum tunneling allows the traveler to pass through hills, rather than be forced to climb them, reducing the chance of becoming trapped in valleys that are not the global minimum. Quantum entanglement further improves the outcome by allowing the traveler to discover correlations between the coordinates that lead to deep valleys.
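Classical simulated annealing gives a feel for the traveler metaphor, even though it uses random thermal kicks where the quantum version uses superposition and tunneling. A rough sketch (our own classical analogue, not D-Wave's algorithm):

import math, random

def anneal(energy, x, steps=10000):
    for step in range(steps):
        t = max(1.0 - step / steps, 1e-9)        # "temperature" falls toward zero
        candidate = x + random.uniform(-0.5, 0.5)
        delta = energy(candidate) - energy(x)
        # Always accept downhill moves; sometimes accept uphill ones early
        # on, so the traveler can escape shallow valleys.
        if delta < 0 or random.random() < math.exp(-delta / t):
            x = candidate
    return x

bumpy = lambda x: x ** 2 + 2 * math.sin(5 * x)  # a landscape with several valleys
print(anneal(bumpy, x=4.0))                     # settles near the deepest one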

The D-Wave system has a web API with client libraries available for C/C++, Python, and MATLAB. This allows users to access the computer easily as a cloud resource over a network.

To program the system, a user maps a problem into a search for the lowest point in a vast landscape, corresponding to the best possible outcome. The quantum processing unit considers all the possibilities simultaneously to determine the lowest energy required to form those relationships. The solutions are values that correspond to the optimal configurations of qubits found, or the lowest points in the energy landscape. These values are returned to the user program over the network.
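To get a feel for what "mapping a problem into an energy landscape" means, here is a purely illustrative sketch (not the real D-Wave API or its client libraries): a tiny problem written as made-up weights on individual qubits plus couplings between pairs, with the landscape searched by brute force instead of by quantum hardware.

from itertools import product

# Made-up weights: a bias on each of three qubits, plus couplings
# between pairs. Together they define the energy of any bit pattern.
bias = {0: -1.0, 1: -1.0, 2: 2.0}
coupling = {(0, 1): 2.0, (1, 2): -1.0}

def energy(bits):
    e = sum(w * bits[i] for i, w in bias.items())
    e += sum(w * bits[i] * bits[j] for (i, j), w in coupling.items())
    return e

# The hardware explores all configurations at once; classically we can
# only enumerate them. Sorting by energy mimics the sample list the
# machine returns, best answers first.
for bits in sorted(product([0, 1], repeat=3), key=energy)[:3]:
    print(bits, energy(bits))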

Because a quantum computer is probabilistic rather than deterministic, the computer returns many very good answers in a short amount of time: thousands of samples in one second. This provides not only the best solution found but also other very good alternatives from which to choose.

D-Wave systems are intended to complement classical computers. There are many examples of problems where a quantum computer can complement an HPC (high-performance computing) system. While the quantum computer is well suited to discrete optimization, for example, the HPC system is better at large-scale numerical simulations.


D-Wave's flagship product, the 2000-qubit D-Wave 2000Q quantum computer, is the most advanced quantum computer in the world. It is based on a novel type of superconducting processor that uses quantum mechanics to massively accelerate computation. It is best suited to tackling complex optimization problems that exist across many domains.


Original post:

Quantum Computing | D-Wave Systems

