Experts Gather at Fermilab for International Workshop on Cryogenic Electronics for Quantum Systems – Quantaneo, the Quantum Computing Source

Leaders in quantum science converged this summer at Fermilab for the world's first workshop on cryogenic electronics for quantum systems. The field is highly competitive, and the hosts worked hard to attract its key global allies and leaders.

Scientists and engineers from academia and industry discussed the challenges of designing electronics for processors and sensors that will work in the ultracold environment.

It's a fundamental problem facing the field of quantum computing, which holds immense possibility across multiple disciplines. Experts say that quantum computers could someday be powerful enough to solve problems that are impossible for classical computers, potentially redefining how we see the world.

And much of it rides on designing electronics that are up to the task.

"Quantum systems won't exist without cryogenic electronics," said Fermilab engineer Farah Fahim, workshop co-organizer and deputy head of quantum science at Fermilab. "That's why our community needs to collaborate, and why we're working to establish key partnerships with academia and industry, as well as manufacturing companies that would support the fabrication of cold chips."

Researchers across multiple sectors have called for collaboration, and pioneers in the field turned out for the meeting. They included Edoardo Charbon (also a workshop co-organizer) of the Advanced Quantum Architecture Lab at the Swiss Federal Institute of Technology Lausanne, or EPFL, in Switzerland, and Andrew Dzurak of the University of New South Wales in Australia, a trailblazer in the field of silicon-based qubits who gave the workshop's keynote address. Representatives from IBM, Intel, GlobalFoundries, Google and Microsoft also attended.

"The Fermilab cryoelectronics workshop is a very important first step for the quantum computing community," said Malcolm Carroll, research staff member at IBM Research. "Developing supporting electronics for future quantum computers is one of the next big hurdles. IBM looks forward to this series continuing and contributing to it as it has for this first one."

The global cooling effort centers on accommodating the qubit, the fundamental unit of a quantum computer's processor. Qubit information needs extreme cold, below 15 millikelvins, to survive, since any thermal energy can disturb quantum computing operations.


Current state-of-the-art systems use tens of qubits. But a quantum computer that surpasses the capabilities of today's classical computers would in certain cases require millions or billions of qubits, each of which needs electronics, both to control the state of the qubit and to read out its signals.

And electronics means cables.

"As the system scales up, one bottleneck has been getting information out of the qubits and controlling the qubits themselves," Fahim said. "It requires large numbers of wires."

For larger systems, the qubits and the electronics need to be closely integrated. Otherwise, information can become degraded as it winds its way down lengthy wires to bulky systems. With tight integration, the electronics can deliver the fast, self-correcting feedback required to control the qubit state on the order of ten-billionths of a second.

When you have the number of wires and cables required for a million- or billion-qubit system, close integration isn't possible unless your electronics can operate in the cold, side by side with the qubit.

Fermilab engineer Farah Fahim, left, and Edoardo Charbon of the Advanced Quantum Architecture Lab at EPFL co-organized the world's first workshop on cryogenic electronics for quantum systems. Photo: Davide Braga

"When you have lots of cables, after some point, you can't expand in that direction anymore. You can't integrate a million cold qubits with warm electronics," Fahim said. "To scale up, cryogenic electronics is the only way to go. To be able to take it to the next level of integration, we need to move from room-temperature control to cryogenic control. You want to be able to change the technology to meet the requirements."

When the electronics live in the same space as the qubits (the same refrigerated space), the system becomes practical and manageable, capable of providing accurate, real-time qubit control.

That is the challenge the workshop attendees took head-on: developing quantum-system electronics that don't mind being left in the cold.

"Developments in cold electronics may hold the keys to scaling up quantum computing," said Microsoft Quantum Sydney Director David Reilly, also a professor at the University of Sydney. "As the community moves from the demonstration of single-qubit prototypes to scaled-up machines that can address real problems, interest in this field is really taking off. Fermilab has deep expertise in cold electronics as well as a culture of filling the gap between academia and industry. It's only fitting that the first workshop on this topic was at Fermilab, and I expect to see many more as government labs become pivotal players in the quantum ecosystem."

Experts dream of a day when quantum computers can get out of the cold and sit comfortably atop your desk, just like your current PC.

"We would like to reach a stage where nothing is cryocooled, but until we get there, the only way we get there is with electronics operating at very low temperatures," Fahim said.

The workshop was a major, international step in that direction.

"Quantum technologies are the next frontier for many fields, including electronics. While quantum computers are certainly the pinnacle of such worldwide effort, many other applications are emerging, like quantum imaging, quantum sensing, quantum communications and quantum metrology, to name just a few," Charbon said. "But the core of any quantum-technology-based system is a very special and carefully designed electronics optimized for deep cryogenic temperatures. This is a brave new world for us electronics engineers."

To continue the dialogue on this key enabling technology, the second International Workshop on Cryogenic Electronics for Quantum Systems will be held in Neuchâtel, Switzerland, in 2020.

This work is supported by the DOE Office of Science.

Read the original post:

Experts Gather at Fermilab for International Workshop on Cryogenic Electronics for Quantum Systems - Quantaneo, the Quantum Computing Source

The five pillars of Edge Computing — and what is Edge computing anyway? – Information Age

Why do we need edge computing? What is it? What are the advantages? The five pillars of edge computing provide the answers.

The five pillars of edge computing: latency, bandwidth, computing power, resilience and privacy.

It partly boils down to the explosion of devices. Joe Root, co-founder of Permutive, an edge computing company that provides a purpose-built data management platform, told Information Age that "over the past 10 to 20 years, we've seen this explosion in the number of devices. So we've gone from 500 million to 15 billion devices, but over the next 10 years we'll go from 15 billion to a trillion devices."

These devices generate enormous volumes of data: "anything from my watch, which knows my heart rate constantly, all day long, all the way through to smart cities and factories, the data is being centralised on the cloud." This explosion of data causes problems for the way in which we currently process data.

So, edge computing helps solve this by providing local computational power, in the form of devices such as smartphones, at the edge.

Joe spoke to us about what he calls the five pillars of edge computing.

The first pillar is latency. The cloud is at a distance, and it takes time for data to be transferred from the cloud to the point where you need it. If you make a network request, it takes time to download the information you need. At the moment, says Root, this is around 100 milliseconds. This will come down with 5G, but no matter how advanced the technology, there will always be a time lag involved in pulling data from the cloud, defined by the speed of light. The closer you are to the data, the less the latency. Sure, the speed of light might be 300 million metres a second, or a metre every 3 nanoseconds, but when you are processing millions of pieces of data, pulling it from sources that may be many miles away, those nanoseconds add up. "If you need to make a decision in milliseconds, actually, physics prevents that from being possible," explained Root. "I think we're seeing companies approach this in different ways: if you get the CDN (content delivery network) or if you have 5G, then they're trying to move the processing closer to the user to minimise that network latency. The most extreme way to do that though is to do it on the device. So that is why Apple Face ID, for example, doesn't rely on the cloud, because milliseconds matter in that example."
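To make the speed-of-light arithmetic concrete, here is a back-of-envelope sketch in Python (the distances are my own illustrative picks, not figures from the article):

```python
# One-way propagation delay at the speed of light, ignoring routing,
# switching and server processing, all of which only add to these floors.
C = 299_792_458.0  # speed of light in a vacuum, metres per second

for label, metres in [("on-device", 0.1),
                      ("nearby edge node", 50_000.0),         # 50 km
                      ("distant cloud region", 4_000_000.0)]: # 4,000 km
    delay_ms = metres / C * 1_000
    print(f"{label:>20}: {delay_ms:g} ms one way")
```

A 4,000 km round trip alone costs tens of milliseconds before any processing happens, which is why moving computation onto the device removes the floor entirely.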


The second pillar is the limit that bandwidth imposes on the amount of data you can rapidly access from the cloud. Due to the limitations of physics, you can only send a certain amount of data over the network before you hit the bandwidth limit. So, the second benefit of edge computing is that you aren't constrained by bandwidth: because you're processing the data on the device, you don't need to transfer it.
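A rough sketch of the same point in numbers (again, my illustrative figures rather than the article's):

```python
# Why shipping raw data to the cloud runs into the bandwidth pillar.
sensor_rate_mbps = 4.0             # e.g. one compressed 1080p video stream
seconds_per_day = 24 * 60 * 60

gigabytes_per_day = sensor_rate_mbps * seconds_per_day / 8 / 1000
print(f"{gigabytes_per_day:.1f} GB/day from a single device")   # ~43.2 GB/day

uplink_mbps = 10.0                 # a modest shared uplink
print(f"uplink saturates after ~{uplink_mbps / sensor_rate_mbps:.1f} devices")
```

A handful of devices saturates the link; processing on the device and sending only results sidesteps the constraint.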


The third pillar is computing power. The explosion in the number of devices on the internet comes into play here. There is enormous computing power residing at the edge of the internet, which has already been funded. Furthermore, much of this computational power is under-utilised. Edge computing takes advantage of that processing power, meaning you don't have to pay for the same computation in the cloud.

One day, quantum computers may change this relationship: the computing power in the cloud will exceed that which exists on the edge. But that is some way off. In any case, even in the distant time when quantum computing provides superior processing power in the cloud, this is just one of the five pillars of edge computing.


The fourth pillar is resilience. What happens if the cloud goes down, or you lose your connection? It is something we all intuitively understand: we may download a document to read on our computer or smartphone, so that if we are travelling and the internet connection is intermittent, we can still read the document.

The fifth pillar is privacy. When data is stored on the cloud, we have less control over it. Our wearable health checker may record all kinds of information about us, but it's our personal information, which we want to keep private; storing it locally and not allowing this data to seep onto the cloud is a good way to achieve this.

Indeed, Root speculates that the greatest breach of all time is ongoing: open RTB (real-time bidding), a protocol used in programmatic advertising, in which advertisers can bid for access to data about you. GDPR is shining a light on this, but from the point of view of individuals concerned about their privacy, edge computing could overcome this problem in one fell swoop. It doesn't mean advertisers will no longer be able to target specific customers accurately, but such targeting will be permission-based and transparent.

Root claims that this is a data breach that is happening millions of times a second, right now.


The advantages of cloud computing are well known: you can turn your cloud computing up and down as you need it, with no need for startups, for example, to invest in expensive IT infrastructure; you pay as you go.

But the limitations of the cloud are well known too: latency, bandwidth, diluted processing power, resilience in the event of a lost connection, and privacy.

Edge computing, by taking advantage of hardware that has already been funded, can overcome many of those disadvantages without necessarily losing the flexibility of the cloud.

It's not that edge computing makes the cloud redundant, but it does make it, well (and sincere apologies for the obvious pun), more edgy.

Read more:

The five pillars of Edge Computing -- and what is Edge computing anyway? - Information Age

What Is Quantum Computing? The Complete WIRED Guide | WIRED

Big things happen when computers get smaller. Or faster. And quantum computing is about chasing perhaps the biggest performance boost in the history of technology. The basic idea is to smash some barriers that limit the speed of existing computers by harnessing the counterintuitive physics of subatomic scales.

If the tech industry pulls off that, ahem, quantum leap, you won't be getting a quantum computer for your pocket. Don't start saving for an iPhone Q. We could, however, see significant improvements in many areas of science and technology, such as longer-lasting batteries for electric cars or advances in chemistry that reshape industries or enable new medical treatments. Quantum computers won't be able to do everything faster than conventional computers, but on some tricky problems they have advantages that would enable astounding progress.

It's not productive (or polite) to ask people working on quantum computing when exactly those dreamy applications will become real. The only thing for sure is that they are still many years away. Prototype quantum computing hardware is still embryonic. But powerful (and, for tech companies, profit-increasing) computers powered by quantum physics have recently started to feel less hypothetical.

The cooling and support structure for one of IBM's quantum computing chips (the tiny black square at the bottom of the image).

Amy Lombard

That's because Google, IBM, and others have decided it's time to invest heavily in the technology, which, in turn, has helped quantum computing earn a bullet point on the corporate strategy PowerPoint slides of big companies in areas such as finance, like JPMorgan, and aerospace, like Airbus. In 2017, venture investors plowed $241 million into startups working on quantum computing hardware or software worldwide, according to CB Insights. That's triple the amount in the previous year.

Like the befuddling math underpinning quantum computing, some of the expectations building around this still-impractical technology can make you lightheaded. If you squint out the window of a flight into SFO right now, you can see a haze of quantum hype drifting over Silicon Valley. But the enormous potential of quantum computing is undeniable, and the hardware needed to harness it is advancing fast. If there were ever a perfect time to bend your brain around quantum computing, it's now. Say "Schrödinger's superposition" three times fast, and we can dive in.

The prehistory of quantum computing begins early in the 20th century, when physicists began to sense they had lost their grip on reality.

First, accepted explanations of the subatomic world turned out to be incomplete. Electrons and other particles didn't just neatly carom around like Newtonian billiard balls, for example. Sometimes they acted like waves instead. Quantum mechanics emerged to explain such quirks, but introduced troubling questions of its own. To take just one brow-wrinkling example, this new math implied that physical properties of the subatomic world, like the position of an electron, didn't really exist until they were observed.

1980: Physicist Paul Benioff suggests quantum mechanics could be used for computation.

1981: Nobel-winning physicist Richard Feynman, at Caltech, coins the term quantum computer.

1985: Physicist David Deutsch, at Oxford, maps out how a quantum computer would operate, a blueprint that underpins the nascent industry of today.

1994: Mathematician Peter Shor, at Bell Labs, writes an algorithm that could tap a quantum computer's power to break widely used forms of encryption.

2007: D-Wave, a Canadian startup, announces a quantum computing chip it says can solve Sudoku puzzles, triggering years of debate over whether the company's technology really works.

2013: Google teams up with NASA to fund a lab to try out D-Wave's hardware.

2014: Google hires the professor behind some of the best quantum computer hardware yet to lead its new quantum hardware lab.

2016: IBM puts some of its prototype quantum processors on the internet for anyone to experiment with, saying programmers need to get ready to write quantum code.

2017: Startup Rigetti opens its own quantum computer fabrication facility to build prototype hardware and compete with Google and IBM.

If you find that baffling, you're in good company. A year before winning a Nobel for his contributions to quantum theory, Caltech's Richard Feynman remarked that "nobody understands quantum mechanics." The way we experience the world just isn't compatible. But some people grasped it well enough to redefine our understanding of the universe. And in the 1980s a few of them, including Feynman, began to wonder if quantum phenomena like subatomic particles' "don't look and I don't exist" trick could be used to process information. The basic theory or blueprint for quantum computers that took shape in the '80s and '90s still guides Google and others working on the technology.

Before we belly flop into the murky shallows of quantum computing 0.101, we should refresh our understanding of regular old computers. As you know, smartwatches, iPhones, and the world's fastest supercomputer all basically do the same thing: they perform calculations by encoding information as digital bits, aka 0s and 1s. A computer might flip the voltage in a circuit on and off to represent 1s and 0s, for example.

Quantum computers do calculations using bits, too. After all, we want them to plug into our existing data and computers. But quantum bits, or qubits, have unique and powerful properties that allow a group of them to do much more than an equivalent number of conventional bits.

Qubits can be built in various ways, but they all represent digital 0s and 1s using the quantum properties of something that can be controlled electronically. Popular examples (at least among a very select slice of humanity) include superconducting circuits, or individual atoms levitated inside electromagnetic fields. The magic power of quantum computing is that this arrangement lets qubits do more than just flip between 0 and 1. Treat them right and they can flip into a mysterious extra mode called a superposition.

The looped cables connect the chip at the bottom of the structure to its control system.

Amy Lombard

You may have heard that a qubit in superposition is both 0 and 1 at the same time. That's not quite true and also not quite false; there's just no equivalent in Homo sapiens' humdrum classical reality. If you have a yearning to truly grok it, you must make a mathematical odyssey WIRED cannot equip you for. But in the simplified and dare we say perfect world of this explainer, the important thing to know is that the math of a superposition describes the probability of discovering either a 0 or 1 when a qubit is read out, an operation that crashes it out of a quantum superposition into classical reality. A quantum computer can use a collection of qubits in superpositions to play with different possible paths through a calculation. If done correctly, the pointers to incorrect paths cancel out, leaving the correct answer when the qubits are read out as 0s and 1s.
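That readout rule is easy to see in a toy simulation. This Python sketch (purely illustrative; it is not any vendor's quantum SDK) models one qubit as a pair of complex amplitudes and samples measurement outcomes from them:

```python
import numpy as np

# Toy model: one qubit is a pair of complex amplitudes [a, b] with
# |a|**2 + |b|**2 == 1. Reading it out yields 0 with probability |a|**2
# and 1 with probability |b|**2, collapsing the superposition.
state = np.array([1, 1], dtype=complex) / np.sqrt(2)   # equal superposition

probs = np.abs(state) ** 2                 # [0.5, 0.5]
rng = np.random.default_rng(seed=0)
samples = rng.choice([0, 1], size=10_000, p=probs)

print(probs)                               # exact readout probabilities
print(np.bincount(samples) / 10_000)       # empirical frequencies approach them
```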

Qubit: A device that uses quantum mechanical effects to represent 0s and 1s of digital data, similar to the bits in a conventional computer.

Superposition: It's the trick that makes quantum computers tick, and makes qubits more powerful than ordinary bits. A superposition is an intuition-defying mathematical combination of both 0 and 1. Quantum algorithms can use a group of qubits in a superposition to shortcut through calculations.

Entanglement: A quantum effect so unintuitive that Einstein dubbed it "spooky action at a distance." When two qubits in a superposition are entangled, certain operations on one have instant effects on the other, a process that helps quantum algorithms be more powerful than conventional ones.

Quantum speedup: The holy grail of quantum computing, a measure of how much faster a quantum computer could crack a problem than a conventional computer could. Quantum computers aren't well-suited to all kinds of problems, but for some they offer an exponential speedup, meaning their advantage over a conventional computer grows explosively with the size of the input problem.

For some problems that are very time consuming for conventional computers, this allows a quantum computer to find a solution in far fewer steps than a conventional computer would need. Grover's algorithm, a famous quantum search algorithm, could find you in a phone book with 100 million names with just 10,000 operations. If a classical search algorithm just spooled through all the listings to find you, it would require 50 million operations, on average. For Grover's and some other quantum algorithms, the bigger the initial problem (or phone book), the further behind a conventional computer is left in the digital dust.
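The phone-book numbers are consistent with Grover's square-root scaling; here is a quick sanity check in Python:

```python
import math

# Back-of-envelope check of the numbers above (illustrative only):
# classical linear search inspects ~N/2 entries on average, while Grover's
# algorithm needs on the order of sqrt(N) quantum operations.
N = 100_000_000                    # names in the phone book

classical_avg = N // 2             # 50,000,000 lookups on average
grover_ops = math.ceil((math.pi / 4) * math.sqrt(N))   # ~7,854

print(classical_avg, grover_ops)   # same order as the ~10,000 the article cites
```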

The reason we don't have useful quantum computers today is that qubits are extremely finicky. The quantum effects they must control are very delicate, and stray heat or noise can flip 0s and 1s, or wipe out a crucial superposition. Qubits have to be carefully shielded, and operated at very cold temperatures, sometimes only fractions of a degree above absolute zero. Most plans for quantum computing depend on using a sizable chunk of a quantum processor's power to correct its own errors, caused by misfiring qubits.

Recent excitement about quantum computing stems from progress in making qubits less flaky. That's giving researchers the confidence to start bundling the devices into larger groups. Startup Rigetti Computing recently announced it has built a processor with 128 qubits made with aluminum circuits that are super-cooled to make them superconducting. Google and IBM have announced their own chips with 72 and 50 qubits, respectively. That's still far fewer than would be needed to do useful work with a quantum computer (it would probably require at least thousands), but as recently as 2016 those companies' best chips had qubits only in the single digits. After tantalizing computer scientists for 30 years, practical quantum computing may not exactly be close, but it has begun to feel a lot closer.

Some large companies and governments have started treating quantum computing research like a race; perhaps fittingly, it's one where both the distance to the finish line and the prize for getting there are unknown.

Google, IBM, Intel, and Microsoft have all expanded their teams working on the technology, with a growing swarm of startups such as Rigetti in hot pursuit. China and the European Union have each launched new programs measured in the billions of dollars to stimulate quantum R&D. And in the US, the Trump White House has created a new committee to coordinate government work on quantum information science. Several bills were introduced to Congress in 2018 proposing new funding for quantum research, totaling upwards of $1.3 billion. It's not quite clear what the first killer apps of quantum computing will be, or when they will appear. But there's a sense that whoever is first to make these machines useful will gain big economic and national security advantages.

Copper structures conduct heat well and connect the apparatus to its cooling system.

Amy Lombard

Back in the world of right now, though, quantum processors are too simple to do practical work. Google is working to stage a demonstration known as quantum supremacy, in which a quantum processor would solve a carefully designed math problem beyond the reach of existing supercomputers. But that would be a historic scientific milestone, not proof that quantum computing is ready to do real work.

As quantum computer prototypes get larger, the first practical use for them will probably be for chemistry simulations. Computer models of molecules and atoms are vital to the hunt for new drugs or materials. Yet conventional computers can't accurately simulate the behavior of atoms and electrons during chemical reactions. Why? Because that behavior is driven by quantum mechanics, the full complexity of which is too great for conventional machines. Daimler and Volkswagen have both started investigating quantum computing as a way to improve battery chemistry for electric vehicles. Microsoft says other uses could include designing new catalysts to make industrial processes less energy intensive, or even to pull carbon dioxide out of the atmosphere to mitigate climate change.

Quantum computers would also be a natural fit for code-breaking. We've known since the '90s that they could zip through the math underpinning the encryption that secures online banking, flirting, and shopping. Quantum processors would need to be much more advanced to do this, but governments and companies are taking the threat seriously. The National Institute of Standards and Technology is in the process of evaluating new encryption systems that could be rolled out to quantum-proof the internet.

When cooled to operating temperature, the whole assembly is hidden inside this white insulated casing.

Amy Lombard

Tech companies such as Google are also betting that quantum computers can make artificial intelligence more powerful. That's further in the future and less well mapped out than chemistry or code-breaking applications, but researchers argue they can figure out the details down the line as they play around with larger and larger quantum processors. One hope is that quantum computers could help machine-learning algorithms pick up complex tasks using many fewer than the millions of examples typically used to train AI systems today.

Despite all the superposition-like uncertainty about when the quantum computing era will really begin, big tech companies argue that programmers need to get ready now. Google, IBM, and Microsoft have all released open source tools to help coders familiarize themselves with writing programs for quantum hardware. IBM has even begun to offer online access to some of its quantum processors, so anyone can experiment with them. Long term, the big computing companies see themselves making money by charging corporations to access data centers packed with supercooled quantum processors.

What's in it for the rest of us? Despite some definite drawbacks, the age of conventional computers has helped make life safer, richer, and more convenient (many of us are never more than five seconds away from a kitten video). The era of quantum computers should have similarly broad-reaching, beneficial, and impossible-to-predict consequences. Bring on the qubits.

The Quantum Computing Factory That's Taking on Google and IBM: Peek inside the ultra-clean workshop of Rigetti Computing, a startup packed with PhDs wearing what look like space suits and gleaming steampunk-style machines studded with bolts. In a facility across the San Francisco Bay from Silicon Valley, Rigetti is building its own quantum processors, using similar technology to that used by IBM and Google.

Why JP Morgan, Daimler Are Testing Quantum Computers That Aren't Useful Yet: Wall Street has plenty of quants, math wizards who hunt profits using equations. Now JP Morgan has quantum quants, a small team collaborating with IBM to figure out how to use the power of quantum algorithms to more accurately model financial risk. Useful quantum computers are still years away, but the bank and other big corporations say that the potential payoffs are so large that they need to seriously investigate quantum computing today.

The Era of Quantum Computing Is Here. Outlook: Cloudy. Companies working on quantum computer hardware like to say that the field has transitioned from the exploration and uncertainty of science into the more predictable realm of engineering. Yet while hardware has improved markedly in recent years, and investment is surging, there are still open scientific questions about the physics underlying quantum computing.

Quantum Computing Will Create Jobs. But Which Ones? You can't create a new industry without people to staff the jobs it creates. A congressional bill called the National Quantum Initiative seeks to have the US government invest in training the next generation of quantum computer technicians, designers, and entrepreneurs.

Job One for Quantum Computers: Boost Artificial Intelligence. Artificial intelligence and quantum computing are two of Silicon Valley's favorite buzzwords. If they can be successfully combined, machines will get a lot smarter.

Loopholes and the Anti-Realism of the Quantum World: Even people who can follow the math of quantum mechanics find its implications for reality perplexing. This book excerpt explains why quantum physics undermines our understanding of reality, with nary an equation in sight.

Quantum Computing Is the Next Big Security Risk: In 1994, mathematician Peter Shor wrote an algorithm that would allow a quantum computer to pierce the encryption that today underpins online shopping and other digital transactions. As quantum computers get closer to reality, congressman Will Hurd (R-Texas) argues the US needs to lead a global effort to deploy new forms of quantum-resistant encryption.

This guide was last updated on August 24, 2018.

Enjoyed this deep dive? Check out more WIRED Guides.

See the original post here:

What Is Quantum Computing? The Complete WIRED Guide | WIRED

Microsoft will open-source parts of Q#, the programming …

Microsoft is focusing on the development of quantum computers that take advantage of cryogenically cooled nanowires. (Microsoft Photo)

Much has been made of Microsoft's reinvention as an open-source company, and it will continue to live up to that billing Monday at Microsoft Build as the world prepares for quantum computing.

Microsoft plans to open-source the Q# compiler and the quantum simulators included in its Quantum Development Kit sometime in the near future, the company will announce Monday at Build. The idea is to give researchers and universities studying quantum computing deeper access to these tools, in order to help them contribute to the development and understanding of quantum technology, the company said in materials provided ahead of Build.

Quantum computing is still pretty far off in the future, but one day it is expected to allow computer scientists to bypass the limits of so-called classical computing to reach new levels of performance. Today's computers represent data as strings of 0s and 1s, but quantum computers will be able to use more than two states to represent data.

There are lots of different routes to quantum computing, and Microsoft is pursuing a distinct vision that's unique compared to some of the others chasing this grail. Q# is a big part of this approach, because while building a viable quantum computer is hard enough, programming one is going to require a new way of looking at the world.

Open-sourcing the compiler, which takes code written by developers in a programming language and makes it run on a computer, could help budding quantum developers better understand how to write more efficient code and reduce the errors preventing their applications from running. And open-source simulators could make it easier for developers to test their quantum applications before letting them fly on quantum machines, which are likely to be pretty expensive in their early days.

Microsoft is expected to provide more information about its open-source quantum projects this week at Build, where more than 6,000 people are expected to attend to hear details about many of Microsoft's current projects.

Link:

Microsoft will open-source parts of Q#, the programming ...

Quantum Computing | D-Wave Systems

Quantum Computation

Rather than store information using bits represented by 0s or 1s as conventional digital computers do, quantum computers use quantum bits, or qubits, to encode information as 0s, 1s, or both at the same time. This superposition of states (along with the other quantum mechanical phenomena of entanglement and tunneling) enables quantum computers to manipulate enormous combinations of states at once.

In nature, physical systems tend to evolve toward their lowest energy state: objects slide down hills, hot things cool down, and so on. This behavior also applies to quantum systems. To imagine this, think of a traveler looking for the best solution by finding the lowest valley in the energy landscape that represents the problem.

Classical algorithms seek the lowest valley by placing the traveler at some point in the landscape and allowing that traveler to move based on local variations. While it is generally most efficient to move downhill and avoid climbing hills that are too high, such classical algorithms are prone to leading the traveler into nearby valleys that may not be the global minimum. Numerous trials are typically required, with many travelers beginning their journeys from different points.

In contrast, quantum annealing begins with the traveler simultaneously occupying many coordinates, thanks to the quantum phenomenon of superposition. The probability of being at any given coordinate smoothly evolves as annealing progresses, with the probability increasing around the coordinates of deep valleys. Quantum tunneling allows the traveler to pass through hills (rather than be forced to climb them), reducing the chance of becoming trapped in valleys that are not the global minimum. Quantum entanglement further improves the outcome by allowing the traveler to discover correlations between the coordinates that lead to deep valleys.

The D-Wave system has a web API with client libraries available for C/C++, Python, and MATLAB. This allows users to access the computer easily as a cloud resource over a network.

To program the system, a user maps a problem into a search for the lowest point in a vast landscape, corresponding to the best possible outcome. The quantum processing unit considers all the possibilities simultaneously to determine the lowest energy required to form those relationships. The solutions are values that correspond to the optimal configurations of qubits found, or the lowest points in the energy landscape. These values are returned to the user program over the network.

Because a quantum computer is probabilistic rather than deterministic, the computer returns many very good answers in a short amount of time (thousands of samples in one second). This provides not only the best solution found but also other very good alternatives from which to choose.
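As a concrete illustration of this programming model, here is a minimal sketch using D-Wave's open-source dimod library (my choice of tool for the example; the article itself describes only the web API and its C/C++, Python and MATLAB clients). It phrases a two-variable toy problem as an energy landscape and enumerates it locally:

```python
# Toy QUBO: find binary x0, x1 minimizing
#     E(x) = -x0 - x1 + 2*x0*x1
# whose lowest-energy "valleys" are (x0, x1) = (1, 0) and (0, 1).
import dimod

bqm = dimod.BinaryQuadraticModel.from_qubo({
    (0, 0): -1.0,   # linear bias on x0
    (1, 1): -1.0,   # linear bias on x1
    (0, 1): 2.0,    # coupling that penalizes x0 = x1 = 1
})

# ExactSolver enumerates every state locally; on real hardware you would use
# a quantum sampler that returns thousands of samples, as the text describes.
sampleset = dimod.ExactSolver().sample(bqm)
print(sampleset)    # lowest-energy rows first: (1, 0) and (0, 1) tie at E = -1
```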

D-Wave systems are intended to complement classical computers. There are many examples of problems where a quantum computer can work alongside an HPC (high-performance computing) system. While the quantum computer is well suited to discrete optimization, for example, the HPC system is better at large-scale numerical simulations.

Download this whitepaper to learn more about programming a D-Wave quantum computer.

D-Wave's flagship product, the 2,000-qubit D-Wave 2000Q quantum computer, is the most advanced quantum computer in the world. It is based on a novel type of superconducting processor that uses quantum mechanics to massively accelerate computation. It is best suited to tackling complex optimization problems that exist across many domains.

Download the Technology Overview

Go here to read the rest:

Quantum Computing | D-Wave Systems

Quantum technology – Wikipedia

Quantum technology is a new field of physics and engineering concerned with creating practical applications, such as quantum computing, quantum sensors, quantum cryptography, quantum simulation, quantum metrology and quantum imaging, based on properties of quantum mechanics, especially quantum entanglement, quantum superposition and quantum tunnelling.

Quantum superposition states can be very sensitive to a number of external effects, such as electric, magnetic and gravitational fields; rotation, acceleration and time; and therefore can be used to make very accurate sensors. There are many experimental demonstrations of quantum sensing devices, such as the experiments carried out by the Nobel laureate William D. Phillips using cold atom interferometer systems to measure gravity, and the atomic clock, which is used by many national standards agencies around the world to define the second.

Recent efforts are being made to engineer quantum sensing devices, so that they are cheaper, easier to use, more portable, lighter and consume less power. It is believed that if these efforts are successful, it will lead to multiple commercial markets, such as for the monitoring of oil and gas deposits, or in construction.

Quantum secure communication refers to methods that are expected to be 'quantum safe' in the advent of quantum computing systems that could break current cryptography systems. One significant component of a quantum secure communication system is expected to be quantum key distribution, or 'QKD': a method of transmitting information using entangled light in a way that makes any interception of the transmission obvious to the user.

Quantum computers are the ultimate quantum network, combining 'quantum bits', or 'qubits', devices that can store and process quantum data (as opposed to binary data), with links that can transfer quantum information between qubits. In doing this, quantum computers are predicted to calculate certain algorithms significantly faster than even the largest classical computer available today.

Quantum computers are expected to have a number of significant uses in computing fields such as optimization and machine learning. They are famous for their expected ability to carry out Shor's algorithm, which can be used to factorise the large numbers that are mathematically important to secure data transmission.

There are many devices available today which are fundamentally reliant on the effects of quantum mechanics. These include laser systems, transistors and semiconductor devices, and other devices such as MRI imagers. These are often referred to as belonging to the 'first quantum revolution'; the UK Defence Science and Technology Laboratory (Dstl) grouped these devices as 'quantum 1.0',[1] that is, devices which rely on the effects of quantum mechanics. Quantum technologies are often described as the 'second quantum revolution' or 'quantum 2.0': generally regarded as a class of device that actively creates, manipulates and reads out quantum states of matter, often using the quantum effects of superposition and entanglement.

The field of quantum technology was first outlined in a 1997 book by Gerard J. Milburn,[2] which was then followed by a 2003 article by Jonathan P. Dowling and Gerard J. Milburn,[3][4] as well as a 2003 article by David Deutsch.[5] The field of quantum technology has benefited immensely from the influx of new ideas from the field of quantum information processing, particularly quantum computing. Disparate areas of quantum physics, such as quantum optics, atom optics, quantum electronics, and quantum nanomechanical devices, have been unified under the search for a quantum computer and given a common language, that of quantum information theory.

The Quantum Manifesto was signed by 3,400 scientists and officially released at the 2016 Quantum Europe Conference, calling for a quantum technology initiative to coordinate between academia and industry, to move quantum technologies from the laboratory to industry, and to educate quantum technology professionals in a combination of science, engineering, and business.[6][7][8][9][10]

The European Commission responded to that manifesto with the Quantum Technology Flagship,[11][12] a €1 billion, 10-year-long megaproject, similar in size to earlier European Future and Emerging Technologies Flagship projects such as the Graphene Flagship and Human Brain Project.[8][13] China is building the world's largest quantum research facility with a planned investment of 76 billion yuan (approx. €10 billion).[14] The US is preparing a national initiative.[15][16]

From 2010 onwards, multiple governments have established programmes to explore quantum technologies,[17] such as the UK National Quantum Technologies Programme, which created four quantum 'hubs'; the Centre for Quantum Technologies in Singapore; and QuTech, a Dutch centre to develop a topological quantum computer.[18] On 22 December 2018, Donald Trump signed into law the US National Quantum Initiative Act, with a billion-dollar-a-year budget, which is widely viewed as a response to Chinese gains in quantum technology, particularly the recent launch of the Chinese quantum satellite.

In the private sector, there have been multiple investments into quantum technologies made by large companies. Examples include Google's partnership with the John Martinis group at UCSB,[19] multiple partnerships with the Canadian quantum computing company D-Wave Systems, and investment by many UK companies within the UK quantum technologies programme.

Continue reading here:

Quantum technology - Wikipedia

IBM hits quantum computing milestone, may see ‘Quantum …

IBM is outlining another milestone in quantum computing, its highest Quantum Volume to date, and projects that practical uses, or so-called Quantum Advantage, may be a decade away.

Big Blue, which will outline the scientific milestone at the American Physical Society March Meeting, made a bit of a splash at CES 2019 with a display of its Q System One quantum computer and has been steadily showing progress on quantum computing.

In other words, that quantum computing buying guide for technology executives may take a while. Quantum Volume is a performance metric that indicates progress in the pursuit of Quantum Advantage, the point where quantum applications deliver significant advantages over classical computers.

Also: Meet IBM's bleeding edge of quantum computing (CNET)

Quantum Volume is determined by the number of qubits, connectivity, and coherence time, plus accounting for gate and measurement errors, device cross talk, and circuit software compiler efficiency.

IBM said its Q System One, which has a 20-qubit processor, produced a Quantum Volume of 16, double the current IBM Q, which has a Quantum Volume of 8. IBM also said the Q System One has some of the lowest error rates IBM has measured.
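The article doesn't spell out the metric's formula, but IBM's research paper introducing Quantum Volume (Cross et al., 2018) defines it, as I understand it, through the largest "square" random circuit, equal in width and depth, that a device can run reliably:

$$\log_2 V_Q = \max_{m \le n} \min\bigl(m,\, d(m)\bigr)$$

where $n$ is the number of qubits and $d(m)$ is the largest circuit depth achievable on $m$ of them. On that reading, a Quantum Volume of 16 corresponds to running 4-qubit, 4-layer circuits reliably, even on a 20-qubit processor, which is why low error rates matter as much as raw qubit count.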

That progress is notable, but practical broad use cases are still years away. IBM said Quantum Volume would need to double every year to reach Quantum Advantage within the next decade. Faster progress on Quantum Volume would speed up that timeline. IBM has doubled the power of its quantum computers annually since 2017.

Once Quantum Advantage is hit, there would be new applications, more of an ecosystem and real business use cases. Consumption of quantum computing would still likely be delivered via cloud computing since the technology has some unique characteristics that make a traditional data center look easy. IBM made its quantum computing technology available in 2016 via a cloud service and is working with partners to find business and science use cases.

Here's how quantum computing and classical computing differ, via our recent primer on the subject.

Every classical electronic computer exploits the natural behavior of electrons to produce results in accordance with Boolean logic (for any two specific input states, one certain output state). Here, the basic unit of transaction is the binary digit ("bit"), whose state is either 0 or 1. In a conventional semiconductor, these two states are represented by low and high voltage levels within transistors.

In a quantum computer, the structure is radically different. Its basic unit of registering state is the qubit, which at one level also stores a 0 or 1 state (actually 0 and/or 1). Instead of transistors, a quantum computer obtains its qubits by bombarding atoms with electrical fields at perpendicular angles to one another, the result being to line up the ions but also keep them conveniently and equivalently separated. When these ions are separated by just enough space, their orbiting electrons become the home addresses, if you will, for qubits.

Link:

IBM hits quantum computing milestone, may see 'Quantum ...

Microsofts quantum computing network takes a giant leap …

Microsoft is focusing on the development of quantum computers that take advantage of cryogenically cooled nanowires. (Microsoft Photo)

REDMOND, Wash. Quantum computing may still be in its infancy, but the Microsoft Quantum Network is all grown up, fostered by in-house developers, research affiliates and future stars of the startup world.

The network made its official debut today here at Microsoft's Redmond campus, during a Startup Summit that laid out the company's vision for quantum computing and introduced network partners to Microsoft's tools of the quantum trade.

Quantum computing stands in contrast to the classical computer technologies that have held sway for more than a half-century. Classical computing is based on the ones and zeroes of bit-based processing, while quantum computing takes advantage of the weird effects of quantum physics. Quantum bits, or qubits, needn't represent a one or a zero, but can represent multiple states during computation.

The quantum approach should be able to solve computational problems that can't easily be solved using classical computers, such as modeling molecular interactions or optimizing large-scale systems. That could open the way to world-changing applications, said Todd Holmdahl, corporate vice president of Microsoft's Azure Hardware Systems Group.

"We're looking at problems like climate change," Holmdahl said. "We're looking at solving big food production problems. We think we have opportunities to solve problems around materials science, personal health care, machine learning. All of these things are possible and obtainable with a quantum computer. We have been talking around here that we're at the advent of the quantum economy."

Representatives from 16 startups were invited to this week's Startup Summit, which features talks from Holmdahl and other leaders of Microsoft's quantum team as well as demos and workshops focusing on Microsoft's programming tools. (The closest startup to Seattle is 1QBit, based in Vancouver, B.C.)

Over the past year and a half, Microsoft has released a new quantum-friendly programming language called Q# (Q-sharp) as part of its Quantum Development Kit, and has worked with researchers at Pacific Northwest National Laboratory and academic institutions around the world to lay the technical groundwork for the field.

A big part of that groundwork is the development of a universal quantum computer, based on a topological architecture that builds error-correcting mechanisms right into the cryogenically cooled, nanowire-based hardware. Cutting down on the error-producing noise in quantum systems will be key to producing a workable computer.

"We believe that our qubit equals about 1,000 of our competition's qubits," Holmdahl said.

There's lots of competition in the quantum computing field nowadays: IBM, Google and Intel are all working on similar technologies for a universal quantum computer, while Canada's D-Wave Systems is taking advantage of a more limited type of computing technology known as quantum annealing.

This week, D-Wave previewed its plans for a new type of computer topology that it said would reduce quantum noise and more than double the qubit count of its existing platform, from 2,000 linked qubits to 5,000.

But the power of quantum computing shouldn't be measured merely by counting qubits. The efficiency of computation and the ability to reduce errors can make a big difference, said Microsoft principal researcher Matthias Troyer.

For example, a standard approach to simulating the molecular mechanism behind nitrogen fixation for crops could require 30,000 years of processing time, he said. But if the task is structured to enable parallel processing and enhanced error correction, the required runtime can be shrunk to less than two days.

"Quantum software engineering is really as important as the hardware engineering," Troyer said.

Julie Love, director of Microsoft Quantum Business Development, said that Microsoft will start out offering quantum computing through Microsoft's Azure cloud-based services. Not all computational problems are amenable to the quantum approach: it's much more likely that an application will switch between classical and quantum processing, and therefore between classical tools such as the C# programming language and quantum tools such as Q#.

"When you work in chemistry and materials, all of these problems, you hit this known-to-be-unsolvable problem," Love said. "Quantum provides the possibility of a breakthrough."

Love shies away from giving a firm timetable for the emergence of specific applications, but last year Holmdahl predicted that commercial quantum computers would exist five years from now. (Check back in 2023 to see how the prediction panned out.)

The first applications could well focus on simulating molecular chemistry, with the aim of prototyping better pharmaceuticals, more efficient fertilizers, better batteries, more environmentally friendly chemicals for the oil and gas industry, and a new class of high-temperature superconductors. It might even be possible to address the climate change challenge by custom-designing materials that pull excess carbon dioxide out of the air.

Love said quantum computers would also be well-suited for addressing optimization problems, like figuring out how to make traffic flow better through Seattle's urban core, and for reducing the training time required for AI modeling.

"That list is going to continue to evolve," she said.

Whenever the subject of quantum computing comes up, cryptography has to be mentioned as well. It's theoretically possible for a quantum computer to break the codes that currently protect all sorts of secure transactions, ranging from email encryption to banking protocols.

Love said those code-breaking applications are farther out than other likely applications, due to the huge amount of computational resources that would be required, even for a quantum computer. Nevertheless, it's not too early to be concerned. "We have a pretty significant research thrust in what's called post-quantum crypto," she said.

Next-generation data security is one of the hot topics addressed by the $1.2 billion National Quantum Initiative that was approved by Congress and the White House last December. Love said Microsoft's post-quantum crypto protocols have already gone through an initial round of vetting by the National Institute of Standards and Technology.

"We've been working at this in a really open way," she said.

Like every technology, quantum computing is sure to have a dark side as well as a bright side. But its reassuring to know that developers are thinking ahead about both sides.

Excerpt from:

Microsofts quantum computing network takes a giant leap ...

Quantum Computing | Centre for Quantum Computation and …

UNSW researchers at CQC2T have shown for the first time that they can build atomic-precision qubits in a 3D device, another major step towards a universal quantum computer.

The researchers, led by 2018 Australian of the Year and Director of CQC2T Professor Michelle Simmons, have demonstrated that they can extend their atomic qubit fabrication technique to multiple layers of a silicon crystal, achieving a critical component of the 3D chip architecture that they introduced to the world in 2015. This new research is published today in Nature Nanotechnology.

The group is the first to demonstrate the feasibility of an architecture that uses atomic-scale qubits aligned to control lines, which are essentially very narrow wires, inside a 3D design. What's more, team members were able to align the different layers in their 3D device with nanometre precision, and showed they could read out qubit states "single shot", i.e. within one single measurement, with very high fidelity.

"This 3D device architecture is a significant advancement for atomic qubits in silicon," says Professor Simmons.

Read full article | Read Nature Nanotechnology publication | Watch video: https://youtu.be/8JB7ncztJWs

Read the original here:

Quantum Computing | Centre for Quantum Computation and ...

Quantum Computing – VLAB

Quantum Computing: Technology Nirvana or Security Armageddon?

Quantum computing promises to create the most fundamental change in the history of computing. The increases in processing power and speed will enable new capabilities that would take years using classical computing technologies, or that are simply not possible with them. From molecular and financial modeling to weather forecasting and artificial intelligence, quantum computing represents the biggest advance in decades.

Perhaps the greatest impact, and most dangerous threat, will be in cryptography. Capable of instantly breaking today's strongest data encryption algorithms, quantum computing is a major focus of governments, multinational corporations, and a growing number of startups around the globe. Theoretically, the first organization to build a quantum computer will have the power to break any existing security key anywhere, potentially wreaking havoc on entire societies, militaries, and economies. The race is on.

Join us on November 15 to find out.

Moderator

Panelists

Alexei Marchenkov, Founder and CEO, Bleximo

Louis Parks, Founder and CEO, SecureRF

Pete Shadbolt, Chief Science Officer, psiQuantum

Hratch Achadjian, Quantum Computing & AI, Head of Business Development North America, Google

Joseph Raffa, Director, IBM Ventures

Ticket Includes Food and Drink

Hors d'oeuvres, beverages, wine, beer

Read more from the original source:

Quantum Computing - VLAB

The 3 Types of Quantum Computers and Their Applications

Published on March 14, 2016 at 11:42 am

It's an exciting time in computing.

Just days ago, Google's AlphaGo AI took an insurmountable lead in the 3,000-year-old game of Go against the reigning world champion, Lee Sedol. In a five-game series, the score is now 3-1 for the machine, with one game left on March 15, 2016 in Seoul, South Korea.

While IBM's Deep Blue beat reigning chess champion Garry Kasparov in 1997 by using brute force, Go is a game with more possible moves than atoms in the known universe (literally). Therefore, the technology doesn't yet exist to make such calculations in short amounts of time.

Google had to take a different approach: to beat the grand master, it needed to enable AlphaGo to self-improve through deep learning.

AlphaGo's historic win is a milestone for artificial intelligence, and now the technology community is anxiously waiting to see what's next for AI. Some say that it is beating a human world champion at a real-time strategy game such as Starcraft, while others look to quantum computing technology that could raise the potential power of AI exponentially.

While everyday classical computing is limited to a single value of either 0 or 1 for each bit, quantum computing uses quantum bits (qubits) that are simultaneously in both states (0 and 1) at the same time.

The consequence of this superposition, as it's called, is that quantum computers are able to test every solution of a problem at once. Further, because of this exponential relationship, such computers should be able to double their quantum computing power with each additional qubit.
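The doubling claim is just the arithmetic of state spaces, as this quick Python check illustrates (my own example, not from the article):

```python
# Illustrative arithmetic only: n qubits span 2**n basis states, so each
# added qubit doubles the size of the state space being worked with.
for n in (1, 2, 10, 50):
    print(n, 2 ** n)
# 50 qubits already give ~1.1e15 states, which is why simulating even a
# modest quantum device strains classical supercomputers.
```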

Image credit: Universe Review

There are three types of quantum computers that IBM considers possible. Shown in the above infographic, they range from a quantum annealer to a universal quantum computer.

The quantum annealer has been successfully developed by the Canadian company D-Wave, but it is difficult to tell whether it actually has any real quantumness thus far. Google added credibility to this notion in December 2015, when it revealed tests showing that its D-Wave quantum computer was 3,600 times faster than a supercomputer at solving specific, complex problems.

Expert opinion, however, remains skeptical of these claims. Such criticisms also shed light on the major limitation of quantum annealers: they can only be engineered to solve very specific optimization problems and have limited general practicality.

The holy grail of quantum computing is the universal quantum computer, which could allow exponentially faster calculations with far more generality.

However, building such a device poses a number of difficult technical challenges. Quantum states turn out to be quite fickle, and the smallest interference from light or sound can introduce errors into the computing process.

Doing calculations at exponential speeds is not very useful when those calculations are incorrect.

IBM highlights just some of the possibilities around universal quantum computers in a recent press release:

A universal quantum computer uses quantum mechanics to process massive amounts of data and perform computations in powerful new ways not possible with today's conventional computers. This type of leap forward in computing could one day shorten the time to discovery for life-saving cancer drugs to a fraction of what it is today; unlock new facets of artificial intelligence by vastly accelerating machine learning; or safeguard cloud computing systems to be impregnable from cyber-attack.

This means that quantum computing could be a trillion-dollar market, touching massive future markets such as artificial intelligence, robotics, defense, cryptography, and pharmaceuticals.

However, until a universal quantum computer can be built, the market remains fairly limited in size and focused on R&D. Quantum computing is expected to surpass $5 billion in market size by 2020.

As a final note: it's worth seeing where quantum computing sits on Gartner's emerging-technology hype cycle.

Gartner still describes it as being 10 years or more away from reaching the plateau.



Jeff is the Editor-in-Chief of Visual Capitalist, a media site that creates and curates visuals on business and investing. He has been quoted or featured on Business Insider, Forbes, CNBC, MarketWatch, The Huffington Post, The World Economic Forum, and Fast Company.

Here is the original post:

The 3 Types of Quantum Computers and Their Applications

The reality of quantum computing could be just three years …

Quantum computing has moved out of the realm of theoretical physics and into the real world, but its potential and promise are still years away.

Onstage at TechCrunch Disrupt SF, a powerhouse in the world of quantum research and a young upstart in the field presented visions for the future of the industry that illustrated both how far the industry has come and how far the technology has to go.

For both Dario Gil, the chief operating officer of IBM Research and the company's vice president of artificial intelligence and quantum computing, and Chad Rigetti, a former IBM researcher who founded Rigetti Computing and serves as its chief executive, the moment when a quantum computer will be able to perform operations better than a classical computer is only three years away.

"[It's] generating a solution that is better, faster or cheaper than you can do otherwise," said Rigetti. "Quantum computing has moved out of a field of research into an engineering discipline and an engineering enterprise."

Considering the more than 30 years that IBM has been researching the technology, and the millions (or billions) that have been poured into developing it, even seeing the end of the road is a victory for researchers and technologists.

Achieving this goal, for all of the brainpower and research hours that have gone into it, is hardly academic.

The Chinese government is building a $10 billion National Laboratory for Quantum Information in Anhui province, slated to open in 2020. Meanwhile, U.S. public research funding for quantum computing runs at around $200 million per year.

One of the reasons why governments, especially, are so interested in the technology is its potential to completely remake the cybersecurity landscape. Some technologists argue that quantum computers will have the potential to crack any type of encryption technology, opening up all of the networks in the world to potential hacking.

Of course, quantum computing is about much more than security. It will enable new ways of doing things we can't even imagine, because we have never had this much pure compute power. Think about artificial intelligence and machine learning, or drug development; any compute-intensive operation could benefit from the exponential increase in compute power that quantum computing will bring.

Security may be the holy grail for governments, but both Rigetti and Gil say that the industrial chemical business will be the first market to see potentially radical transformation.

To understand quantum computing, it helps to understand the physics behind it.

As Gil explained onstage (and on our site), quantum computing depends on the principles of superposition, entanglement and interference.

Superposition is the notion that physicists can observe multiple potential states of a particle. "If you flip a coin it is in one of two states," said Gil, meaning that there's a single outcome that can be observed. But if someone were to spin a coin, they'd see a number of potential outcomes.

Once you've got one particle being observed, you can add another and pair them thanks to a phenomenon called quantum entanglement. "If you have two coins where each one can be in superposition, then measurements can be taken of the difference of both."

Finally, there's interference, where the two particles can be manipulated by an outside force to change them and create different outcomes.

"In classical systems you have these bits of zeros and ones and the logical operations of the ands and the ors and the nots," said Gil. "The classical computer is able to process the logical operations of bits expressed in zeros and ones."

"In an algorithm you put the computer in a superpositional state," Gil continued. "You can take the amplitudes and states and interfere them, and the algorithm is the thing that interferes. I can have many, many states representing different pieces of information, and then I can interfere them to get at the data."
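
Gil's three principles can be illustrated with a small state-vector simulation in plain NumPy; this is a toy sketch, not IBM's software. A Hadamard gate plays the role of the spinning coin, a CNOT gate entangles two "coins", and applying the Hadamard twice shows amplitudes interfering back to a definite answer.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard: creates superposition
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                 # entangles two qubits

zero = np.array([1.0, 0.0])

# Superposition: the "spinning coin" has two equal amplitudes.
spinning = H @ zero

# Entanglement: H on one qubit, then CNOT -> Bell state (|00> + |11>)/sqrt(2).
bell = CNOT @ np.kron(spinning, zero)
print("Bell state amplitudes:", bell.round(3))   # [0.707 0.    0.    0.707]

# Interference: a second Hadamard makes the amplitudes cancel,
# returning the qubit deterministically to the 0 state.
print("H(H|0>):", (H @ spinning).round(3))       # [1. 0.]
```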

These operations are incredibly hard to sustain. In the early days of research into quantum computing, superconducting devices had only one nanosecond before a qubit decayed into a traditional bit of data. Those coherence times have since increased to between 50 and 100 microseconds, which enabled IBM and Rigetti to open up their platforms for researchers and others to conduct experiments (more on that later).

As one can imagine, dealing with quantum particles is a delicate business, so the computing operations have to be carefully controlled. At the base of the machine is what basically amounts to a huge freezer that maintains a temperature of 15 millikelvins, near absolute zero and roughly 180 times colder than interstellar space.

"These qubits are very delicate," said Gil. "Anything from the outside world can couple to it and destroy its state, and one way to protect it is to cool it."

Wiring for the quantum computer is made of superconducting coaxial cables. The inputs to the computer are microwave pulses that manipulate the particles, creating a signal that is then interpreted by the computer's operators.

Those operators used to require a degree in quantum physics. But both IBM and Rigetti have been working on developing tools that can enable a relative newbie to use the tech.

Even as companies like IBM and Rigetti bring the cost of quantum computing down from tens of millions of dollars to roughly $1 million to $2 million, these tools likely will never become commodity hardware that a consumer buys to use as a personal computer.

Rather, as with most other computing these days, quantum computing power will be provided as a service to users.

Indeed, Rigetti announced onstage a new hybrid computing platform that provides computing services to help the industry both reach quantum advantage (the tipping point at which quantum is commercially viable) and explore the technology, so industries can acclimatize to the ways their typical operations could be disrupted by it.

"A user logs on to their own device and uses our software development kit to write a quantum application," said Rigetti. "That program is sent to a compiler and kicks off an optimization kit that runs on a quantum and a classical computer. This is the architecture that's needed to achieve quantum advantage."
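
The loop Rigetti describes can be caricatured in a few lines of Python. This is a hedged sketch of the general hybrid quantum-classical pattern, not Rigetti's actual SDK or compiler; the quantum_expectation function here is a hypothetical stand-in, simulated classically, for a measurement a real service would run on quantum hardware.

```python
import numpy as np

# Hypothetical stand-in for the quantum step: the expectation value <Z>
# of a qubit rotated by angle theta, which is cos(theta) for Ry(theta)|0>.
# In a real hybrid platform this number would come back from the QPU.
def quantum_expectation(theta):
    return np.cos(theta)

# Classical optimizer loop: gradient descent on the measured expectation.
theta, lr = 0.1, 0.4
for step in range(50):
    grad = (quantum_expectation(theta + 1e-4) -
            quantum_expectation(theta - 1e-4)) / 2e-4
    theta -= lr * grad            # classical update from quantum "measurements"

print(f"optimal theta ~ {theta:.3f} (pi = {np.pi:.3f}); "
      f"energy = {quantum_expectation(theta):.3f}")
```

The optimizer converges to theta near pi, where the simulated "energy" reaches its minimum of -1: the classical computer steers, the quantum step evaluates.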

Both IBM and Rigetti, along with a slew of other competitors, are preparing users to access quantum computing on the cloud.

IBM says users in more than 100 countries have already run millions of quantum operations on its cloud-accessible machines.

"In a cloud-first era I'm not sure the economic forces will be there that will drive us to develop the miniaturized environment in the laptop," Rigetti said. But the ramifications of the technology's commercialization will be felt by everyone, everywhere.

"Quantum computing is going to change the world, and it's all going to come in our lifetime, whether that's two years or five years," he said. "Quantum computing is going to redefine every industry and touch every market. Every major company will be involved in some capacity in that space."

View original post here:

The reality of quantum computing could be just three years ...

How Quantum Computers Work

A quantum computer is a computer design that uses the principles of quantum physics to increase computational power beyond what is attainable by a traditional computer. Quantum computers have been built on a small scale, and work continues to upgrade them into more practical models.

Computers function by storing data in a binary number format, which results in a series of 1s and 0s retained in electronic components such as transistors. Each component of computer memory is called a bit and can be manipulated through the steps of Boolean logic so that the bits change, based upon the algorithms applied by the computer program, between the 1 and 0 modes (sometimes referred to as "on" and "off").

A quantum computer, on the other hand, would store information as either a 1, 0, or a quantum superposition of the two states. Such a "quantum bit" allows for far greater flexibility than the binary system.

Specifically, a quantum computer would be able to perform calculations on a far greater order of magnitude than traditional computers, a capability with serious implications for cryptography and encryption. Some fear that a successful and practical quantum computer would devastate the world's financial system by ripping through its computer security encryption, which is based on factoring large numbers that traditional computers literally cannot crack within the lifespan of the universe. A quantum computer, on the other hand, could factor those numbers in a reasonable period of time.

To understand how this speeds things up, consider this example. If a qubit is in a superposition of the 1 state and the 0 state, and it performs a calculation with another qubit in the same superposition, then one calculation actually obtains four results: a 1/1 result, a 1/0 result, a 0/1 result, and a 0/0 result. This follows from the mathematics of a quantum system in a coherent superposition of states, which lasts until the system collapses into one state. The ability of a quantum computer to perform multiple computations simultaneously (or in parallel, in computer terms) is called quantum parallelism.
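
A quick NumPy illustration of that example, simulated classically: the joint state of two superposed qubits carries an amplitude for each of the four outcomes at once.

```python
import numpy as np

# Each qubit in an equal superposition of 0 and 1.
q = np.array([1.0, 1.0]) / np.sqrt(2)

# The joint state of two such qubits holds an amplitude for every
# combination: 00, 01, 10, 11 (the "four results" at once).
pair = np.kron(q, q)
for bits, amp in zip(["00", "01", "10", "11"], pair):
    print(bits, f"amplitude {amp:.3f}, probability {abs(amp)**2:.2f}")
```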

The exact physical mechanism at work within a quantum computer is theoretically complex and intuitively disturbing. Generally, it is explained in terms of the many-worlds interpretation of quantum physics, wherein the computer performs calculations not only in our universe but also in other universes simultaneously, while the various qubits are in a state of quantum coherence. (While this sounds far-fetched, the many-worlds interpretation has been shown to make predictions that match experimental results; other physicists prefer different interpretations of the same mathematics.)

Quantum computing tends to trace its roots back to a 1959 speech by Richard P. Feynman in which he spoke about the effects of miniaturization, including the idea of exploiting quantum effects to create more powerful computers. (This speech is also generally considered the starting point of nanotechnology.)

Of course, before the quantum effects of computing could be realized, scientists and engineers had to more fully develop the technology of traditional computers. This is why, for many years, there was little direct progress, or even interest, in the idea of making Feynman's suggestions a reality.

In 1985, the idea of "quantum logic gates" was put forth by University of Oxford's David Deutsch, as a means of harnessing the quantum realm inside a computer. In fact, Deutsch's paper on the subject showed that any physical process could be modeled by a quantum computer.

Nearly a decade later, in 1994, AT&T's Peter Shor devised an algorithm that could use only six qubits to perform some basic factorizations, with more qubits needed as the numbers requiring factorization grew more complex, of course.
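
The heart of Shor's method is finding the period of a^x mod N and turning that period into factors. The sketch below runs the reduction entirely classically in Python for the textbook case N = 15; the period-finding loop is the one step a quantum computer would do exponentially faster. It is illustrative only, not Shor's quantum circuit.

```python
from math import gcd

def shor_classical_core(N, a):
    """Classically mimic Shor's algorithm: find the period r of a**x mod N
    (assumes gcd(a, N) == 1), then derive factors of N from r."""
    # Quantum hardware finds r exponentially faster; here we just loop.
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    # If r is even and a**(r/2) is not -1 mod N, the gcds yield factors.
    if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
        return gcd(pow(a, r // 2) - 1, N), gcd(pow(a, r // 2) + 1, N)
    return None

print(shor_classical_core(15, 7))   # period 4 -> factors (3, 5)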

A handful of quantum computers have been built. The first, a 2-qubit quantum computer in 1998, could perform trivial calculations before losing coherence after a few nanoseconds. In 2000, teams successfully built both a 4-qubit and a 7-qubit quantum computer. Research on the subject is still very active, although some physicists and engineers express concerns over the difficulties involved in scaling these experiments up to full computing systems. Still, the success of these initial steps shows that the fundamental theory is sound.

The quantum computer's main drawback springs from the same property as its strength: the delicacy of quantum coherence. The qubit calculations are performed while the quantum wave function is in a superposition of states, which is what allows it to perform calculations using both 1 and 0 states simultaneously.

However, when a measurement of any type is made on a quantum system, coherence breaks down and the wave function collapses into a single state. Therefore, the computer has to somehow continue making these calculations without any measurement being made until the proper time, when it can drop out of the quantum state and have a measurement taken to read out its result, which then gets passed on to the rest of the system.

The physical requirements of manipulating a system on this scale are considerable, touching on the realms of superconductors, nanotechnology, and quantum electronics, as well as others. Each of these is itself a sophisticated field which is still being fully developed, so trying to merge them all together into a functional quantum computer is a task which I don't particularly envy anyone ... except for the person who finally succeeds.

View original post here:

How Quantum Computers Work

What are quantum computers and how do they work? WIRED …

Google, IBM and a handful of startups are racing to create the next generation of supercomputers. Quantum computers, if they ever get started, will help us solve problems, like modelling complex chemical processes, that our existing computers can't even scratch the surface of.

But the quantum future isn't going to come easily, and there's no knowing what it'll look like when it does arrive. At the moment, companies and researchers are using a handful of different approaches to try and build the most powerful computers the world has ever seen. Here's everything you need to know about the coming quantum revolution.

Quantum computing takes advantage of the strange ability of subatomic particles to exist in more than one state at any time. Due to the way the tiniest of particles behave, operations can be done much more quickly, and with less energy, than on classical computers.

In classical computing, a bit is a single piece of information that can exist in two states: 1 or 0. Quantum computing uses quantum bits, or 'qubits', instead. These are quantum systems with two states. However, unlike a usual bit, they can store much more information than just 1 or 0, because they can exist in any superposition of these values.

"The difference between classical bits and qubits is that we can also prepare qubits in a quantum superposition of 0 and 1 and create nontrivial correlated states of a number of qubits, so-called 'entangled states'," says Alexey Fedorov, a physicist at the Moscow Institute of Physics and Technology.

A qubit can be thought of as an imaginary sphere. Whereas a classical bit can be in only two states, at either of the two poles of the sphere, a qubit can be at any point on the sphere. This means a computer using these bits can store far more information using less energy than a classical computer.
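
The sphere picture translates directly into code. A minimal NumPy sketch, assuming the standard parameterization by two angles (one measured from the pole, one around the equator): every point of the sphere yields a valid pair of amplitudes, not just the two poles.

```python
import numpy as np

# Any point on the sphere (theta from the pole, phi around the equator)
# corresponds to a valid qubit state, not just the poles 0 and 1.
def qubit_from_sphere(theta, phi):
    return np.array([np.cos(theta / 2),
                     np.exp(1j * phi) * np.sin(theta / 2)])

north = qubit_from_sphere(0, 0)            # pole: the classical 0 state
equator = qubit_from_sphere(np.pi / 2, 0)  # equal mix of 0 and 1
print(north.round(3), equator.round(3))
print("probabilities:", (np.abs(equator) ** 2).round(2))  # [0.5 0.5]
```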

Until recently, it seemed like Google was leading the pack when it came to creating a quantum computer that could surpass the abilities of conventional computers. In a Nature article published in March 2017, the search giant set out ambitious plans to commercialise quantum technology in the next five years. Shortly after that, Google said it intended to achieve something it's calling quantum supremacy with a 49-qubit computer by the end of 2017.

Now, quantum supremacy, which roughly refers to the point where a quantum computer can crunch sums that a conventional computer couldn't hope to simulate, isn't exactly a widely accepted term within the quantum community. Those sceptical of Google's quantum project, or at least the way it talks about quantum computing, argue that supremacy is essentially an arbitrary goal set by Google to make it look like it's making strides in quantum when really it's just meeting self-imposed targets.

Whether it's an arbitrary goal or not, Google was pipped to the supremacy post by IBM in November 2017, when the company announced it had built a 50-qubit quantum computer. Even that, however, was far from stable, as the system could only hold its quantum microstate for 90 microseconds, a record, but far from the times needed to make quantum computing practically viable. Just because IBM has built a 50-qubit system, however, doesn't necessarily mean it has cracked supremacy, and it definitely doesn't mean that it has created a quantum computer that is anywhere near ready for practical use.

Where IBM has gone further than Google, however, is in making quantum computers commercially available. Since 2016, it has offered researchers the chance to run experiments on a five-qubit quantum computer via the cloud, and at the end of 2017 it started making its 20-qubit system available online too.

But quantum computing is by no means a two-horse race. Californian startup Rigetti is focusing on the stability of its own systems rather than just the number of qubits, and it could be the first to build a quantum computer that people can actually use. D-Wave, a company based in Vancouver, Canada, has already created what it calls a 2,000-qubit system, although many researchers don't consider D-Wave's systems to be true quantum computers. Intel, too, has skin in the game. In February 2018 the company announced that it had found a way of fabricating quantum chips from silicon, which would make it much easier to produce chips using existing manufacturing methods.

Quantum computers operate on completely different principles to existing computers, which makes them really well suited to solving particular mathematical problems, like finding very large prime numbers. Since prime numbers are so important in cryptography, it's likely that quantum computers would quickly be able to crack many of the systems that keep our online information secure. Because of these risks, researchers are already trying to develop technology that is resistant to quantum hacking; on the flip side, it's possible that quantum-based cryptographic systems would be much more secure than their conventional analogues.

Researchers are also excited about the prospect of using quantum computers to model complicated chemical reactions, a task that conventional supercomputers aren't very good at, at all. In July 2016, Google engineers used a quantum device to simulate a hydrogen molecule for the first time, and since then IBM has managed to model the behaviour of even more complex molecules. Eventually, researchers hope they'll be able to use quantum simulations to design entirely new molecules for use in medicine. But the holy grail for quantum chemists is to be able to model the Haber-Bosch process, a way of artificially producing ammonia that is still relatively inefficient. Researchers are hoping that if they can use quantum mechanics to work out what's going on inside that reaction, they could discover new ways to make the process much more efficient.

Read the rest here:

What are quantum computers and how do they work? WIRED ...

Quantum computing: A simple introduction – Explain that Stuff

by Chris Woodford. Last updated: March 9, 2018.

How can you get more and more out of less and less? The smaller computers get, the more powerful they seem to become: there's more number-crunching ability in a 21st-century cellphone than you'd have found in a room-sized, military computer 50 years ago. Yet, despite such amazing advances, there are still plenty of complex problems that are beyond the reach of even the world's most powerful computers, and there's no guarantee we'll ever be able to tackle them. One problem is that the basic switching and memory units of computers, known as transistors, are now approaching the point where they'll soon be as small as individual atoms. If we want computers that are smaller and more powerful than today's, we'll soon need to do our computing in a radically different way. Entering the realm of atoms opens up powerful new possibilities in the shape of quantum computing, with processors that could work millions of times faster than the ones we use today. Sounds amazing, but the trouble is that quantum computing is hugely more complex than traditional computing and operates in the Alice in Wonderland world of quantum physics, where the "classical," sensible, everyday laws of physics no longer apply. What is quantum computing and how does it work? Let's take a closer look!

Photo: Quantum computing means storing and processing information using individual atoms, ions, electrons, or photons. On the plus side, this opens up the possibility of faster computers, but the drawback is the greater complexity of designing computers that can operate in the weird world of quantum physics.

You probably think of a computer as a neat little gadget that sits on your lap and lets you send emails, shop online, chat to your friends, or play games, but it's much more, and much less, than that. It's more, because it's a completely general-purpose machine: you can make it do virtually anything you like. It's less, because inside it's little more than an extremely basic calculator, following a prearranged set of instructions called a program. Like the Wizard of Oz, the amazing things you see in front of you conceal some pretty mundane stuff under the covers.

Photo: This is what one transistor from a typical radio circuit board looks like. In computers, the transistors are much smaller than this and millions of them are packaged together onto microchips.

Conventional computers have two tricks that they do really well: they can store numbers in memory and they can process stored numbers with simple mathematical operations (like add and subtract). They can do more complex things by stringing together the simple operations into a series called an algorithm (multiplying can be done as a series of additions, for example). Both of a computer's key tricks, storage and processing, are accomplished using switches called transistors, which are like microscopic versions of the switches you have on your wall for turning on and off the lights. A transistor can either be on or off, just as a light can either be lit or unlit. If it's on, we can use a transistor to store a number one (1); if it's off, it stores a number zero (0). Long strings of ones and zeros can be used to store any number, letter, or symbol using a code based on binary (so computers store an upper-case letter A as 01000001 and a lower-case one as 01100001). Each of the zeros or ones is called a binary digit (or bit) and, with a string of eight bits, you can store 256 different characters (such as A-Z, a-z, 0-9, and most common symbols). Computers calculate by using circuits called logic gates, which are made from a number of transistors connected together. Logic gates compare patterns of bits, stored in temporary memories called registers, and then turn them into new patterns of bits, and that's the computer equivalent of what our human brains would call addition, subtraction, or multiplication. In physical terms, the algorithm that performs a particular calculation takes the form of an electronic circuit made from a number of logic gates, with the output from one gate feeding in as the input to the next.
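
Python makes these two tricks, binary storage and logic operations, easy to see; here is a small illustrative snippet using the 8-bit encoding of 'A' from above.

```python
# Eight bits encode one of 256 values; 'A' is 01000001 in binary ASCII.
a_bits = format(ord('A'), '08b')
print(a_bits)                       # 01000001

# Logic gates compare bit patterns; bitwise AND/OR/NOT on two registers.
x, y = 0b1100, 0b1010
print(format(x & y, '04b'))         # AND -> 1000
print(format(x | y, '04b'))         # OR  -> 1110
print(format(~x & 0b1111, '04b'))   # NOT (4-bit) -> 0011
```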

The trouble with conventional computers is that they depend on conventional transistors. This might not sound like a problem if you go by the amazing progress made in electronics over the last few decades. When the transistor was invented, back in 1947, the switch it replaced (which was called the vacuum tube) was about as big as one of your thumbs. Now, a state-of-the-art microprocessor (single-chip computer) packs hundreds of millions (and up to two billion) transistors onto a chip of silicon the size of your fingernail! Chips like these, which are called integrated circuits, are an incredible feat of miniaturization. Back in the 1960s, Intel co-founder Gordon Moore realized that the power of computers doubles roughly every 18 months, and it's been doing so ever since. This apparently unshakeable trend is known as Moore's Law.

Photo: This memory chip from a typical USB stick contains an integrated circuit that can store 512 megabytes of data. That's roughly 500 million characters (536,870,912 to be exact), each of which needs eight binary digits, so we're talking about 4 billion (4,000 million) transistors in all (4,294,967,296 if you're being picky) packed into an area the size of a postage stamp!

It sounds amazing, and it is, but it misses the point. The more information you need to store, the more binary ones and zeros, and transistors, you need to do it. Since most conventional computers can only do one thing at a time, the more complex the problem you want them to solve, the more steps they'll need to take and the longer they'll need to do it. Some computing problems are so complex that they need more computing power and time than any modern machine could reasonably supply; computer scientists call those intractable problems.

As Moore's Law advances, so the number of intractable problems diminishes: computers get more powerful and we can do more with them. The trouble is, transistors are just about as small as we can make them: we're getting to the point where the laws of physics seem likely to put a stop to Moore's Law. Unfortunately, there are still hugely difficult computing problems we can't tackle because even the most powerful computers find them intractable. That's one of the reasons why people are now getting interested in quantum computing.

Things on a very small scale behave like nothing you have any direct experience about... or like anything that you have ever seen.

Richard Feynman

Quantum theory is the branch of physics that deals with the world of atoms and the smaller (subatomic) particles inside them. You might think atoms behave the same way as everything else in the world, in their own tiny little way, but that's not true: on the atomic scale, the rules change and the "classical" laws of physics we take for granted in our everyday world no longer automatically apply. As Richard P. Feynman, one of the greatest physicists of the 20th century, once put it: "Things on a very small scale behave like nothing you have any direct experience about... or like anything that you have ever seen." (Six Easy Pieces, p. 116.)

If you've studied light, you may already know a bit about quantum theory. You might know that a beam of light sometimes behaves as though it's made up of particles (like a steady stream of cannonballs), and sometimes as though it's waves of energy rippling through space (a bit like waves on the sea). That's called wave-particle duality, and it's one of the ideas that comes to us from quantum theory. It's hard to grasp that something can be two things at once, a particle and a wave, because it's totally alien to our everyday experience: a car is not simultaneously a bicycle and a bus. In quantum theory, however, that's just the kind of crazy thing that can happen. The most striking example of this is the baffling riddle known as Schrödinger's cat. Briefly, in the weird world of quantum theory, we can imagine a situation where something like a cat could be alive and dead at the same time!

What does all this have to do with computers? Suppose we keep on pushing Moore's Law, keep on making transistors smaller until they get to the point where they obey not the ordinary laws of physics (like old-style transistors) but the more bizarre laws of quantum mechanics. The question is whether computers designed this way can do things our conventional computers can't. If we can predict mathematically that they might be able to, can we actually make them work like that in practice?

People have been asking those questions for several decades. Among the first were IBM research physicists Rolf Landauer and Charles H. Bennett. Landauer opened the door for quantum computing in the 1960s when he proposed that information is a physical entity that could be manipulated according to the laws of physics. One important consequence of this is that computers waste energy manipulating the bits inside them (which is partly why computers use so much energy and get so hot, even though they appear to be doing not very much at all). In the 1970s, building on Landauer's work, Bennett showed how a computer could circumvent this problem by working in a "reversible" way, implying that a quantum computer could carry out massively complex computations without using massive amounts of energy. In 1981, physicist Paul Benioff from Argonne National Laboratory tried to envisage a basic machine that would work in a similar way to an ordinary computer but according to the principles of quantum physics. The following year, Richard Feynman sketched out roughly how a machine using quantum principles could carry out basic computations. A few years later, Oxford University's David Deutsch (one of the leading lights in quantum computing) outlined the theoretical basis of a quantum computer in more detail. How did these great scientists imagine that quantum computers might work?

The key features of an ordinary computer, bits, registers, logic gates, algorithms, and so on, have analogous features in a quantum computer. Instead of bits, a quantum computer has quantum bits or qubits, which work in a particularly intriguing way. Where a bit can store either a zero or a 1, a qubit can store a zero, a one, both zero and one, or an infinite number of values in between, and be in multiple states (store multiple values) at the same time! If that sounds confusing, think back to light being a particle and a wave at the same time, Schrödinger's cat being alive and dead, or a car being a bicycle and a bus. A gentler way to think of the numbers qubits store is through the physics concept of superposition (where two waves add to make a third one that contains both of the originals). If you blow on something like a flute, the pipe fills up with a standing wave: a wave made up of a fundamental frequency (the basic note you're playing) and lots of overtones or harmonics (higher-frequency multiples of the fundamental). The wave inside the pipe contains all these waves simultaneously: they're added together to make a combined wave that includes them all. Qubits use superposition to represent multiple states (multiple numeric values) simultaneously in a similar way.

Just as a quantum computer can store multiple numbers at once, so it can process them simultaneously. Instead of working in serial (doing a series of things one at a time in a sequence), it can work in parallel (doing multiple things at the same time). Only when you try to find out what state it's actually in at any given moment (by measuring it, in other words) does it "collapse" into one of its possible states, and that gives you the answer to your problem. Estimates suggest a quantum computer's ability to work in parallel would make it millions of times faster than any conventional computer... if only we could build it! So how would we do that?
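
Measurement collapse is easy to mimic with classical sampling. In this toy NumPy sketch, the squared amplitudes give the probabilities of the outcomes, and each simulated "measurement" yields just one of them.

```python
import numpy as np

rng = np.random.default_rng(0)

# An unequal superposition: squared amplitudes must sum to 1.
state = np.array([np.sqrt(0.8), np.sqrt(0.2)])   # 80% chance of 0, 20% of 1

# "Measuring" collapses the state: we only ever see one outcome at a time,
# with probabilities given by the squared amplitudes.
outcomes = rng.choice([0, 1], size=1000, p=np.abs(state) ** 2)
print("observed frequencies:", np.bincount(outcomes) / 1000)  # ~[0.8 0.2]
```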

In reality, qubits would have to be stored by atoms, ions (atoms with too many or too few electrons), or even smaller things such as electrons and photons (energy packets), so a quantum computer would be almost like a table-top version of the kind of particle physics experiments they do at Fermilab or CERN. Now you wouldn't be racing particles round giant loops and smashing them together, but you would need mechanisms for containing atoms, ions, or subatomic particles, for putting them into certain states (so you can store information), knocking them into other states (so you can make them process information), and figuring out what their states are after particular operations have been performed.

Photo: A single atom can be trapped in an optical cavity (the space between mirrors) and controlled by precise pulses from laser beams.

In practice, there are lots of possible ways of containing atoms and changing their states using laser beams, electromagnetic fields, radio waves, and an assortment of other techniques. One method is to make qubits using quantum dots, which are nanoscopically tiny particles of semiconductors inside which individual charge carriers, electrons and holes (missing electrons), can be controlled. Another method makes qubits from what are called ion traps: you add or take away electrons from an atom to make an ion, hold it steady in a kind of laser spotlight (so it's locked in place like a nanoscopic rabbit dancing in a very bright headlight), and then flip it into different states with laser pulses. In another technique, the qubits are photons inside optical cavities (spaces between extremely tiny mirrors). Don't worry if you don't understand; not many people do. Since the entire field of quantum computing is still largely abstract and theoretical, the only thing we really need to know is that qubits are stored by atoms or other quantum-scale particles that can exist in different states and be switched between them.

Although people often assume that quantum computers must automatically be better than conventional ones, that's by no means certain. So far, just about the only thing we know for certain that a quantum computer could do better than a normal one is factorisation: finding two unknown prime numbers that, when multiplied together, give a third, known number. In 1994, while working at Bell Laboratories, mathematician Peter Shor demonstrated an algorithm that a quantum computer could follow to find the "prime factors" of a large number, which would speed up the problem enormously. Shor's algorithm really excited interest in quantum computing because virtually every modern computer (and every secure, online shopping and banking website) uses public-key encryption technology based on the virtual impossibility of finding prime factors quickly (it is, in other words, essentially an "intractable" computer problem). If quantum computers could indeed factor large numbers quickly, today's online security could be rendered obsolete at a stroke. But what goes around comes around, and some researchers believe quantum technology will lead to much stronger forms of encryption. (In 2017, Chinese researchers demonstrated for the first time how quantum encryption could be used to make a very secure video call from Beijing to Vienna.)

Does that mean quantum computers are better than conventional ones? Not exactly. Apart from Shor's algorithm, and a search method called Grover's algorithm, hardly any other algorithms have been discovered that would be better performed by quantum methods. Given enough time and computing power, conventional computers should still be able to solve any problem that quantum computers could solve, eventually. In other words, it remains to be proven that quantum computers are generally superior to conventional ones, especially given the difficulties of actually building them. Who knows how conventional computers might advance in the next 50 years, potentially making the idea of quantum computers irrelevant, and even absurd.
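
Grover's algorithm itself is small enough to simulate in a few lines of NumPy. The sketch below runs it over a 3-qubit search space of 8 items; after about (pi/4) * sqrt(8), i.e. 2, iterations, the marked item's probability rises from 1/8 to roughly 0.945. This is a classical simulation for illustration, not code for real quantum hardware.

```python
import numpy as np

n = 3                                  # 3 qubits -> 8 basis states
N = 2 ** n
marked = 5                             # the item we're searching for

state = np.full(N, 1 / np.sqrt(N))    # uniform superposition over all items

# About (pi/4) * sqrt(N) Grover iterations amplify the marked amplitude.
for _ in range(int(np.pi / 4 * np.sqrt(N))):
    state[marked] *= -1                # oracle: flip the marked item's phase
    state = 2 * state.mean() - state   # diffusion: invert amplitudes about mean

print("P(marked) =", round(state[marked] ** 2, 3))  # ~0.945 after 2 iterations
```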

Photo: Quantum dots are probably best known as colorful nanoscale crystals, but they can also be used as qubits in quantum computers. Photo courtesy of Argonne National Laboratory.

Three decades after they were first proposed, quantum computers remain largely theoretical. Even so, there's been some encouraging progress toward realizing a quantum machine. There were two impressive breakthroughs in 2000. First, Isaac Chuang (now an MIT professor, but then working at IBM's Almaden Research Center) used five fluorine atoms to make a crude, five-qubit quantum computer. The same year, researchers at Los Alamos National Laboratory figured out how to make a seven-qubit machine using a drop of liquid. Five years later, researchers at the University of Innsbruck added an extra qubit and produced the first quantum computer that could manipulate a qubyte (eight qubits).

These were tentative but important first steps. Over the next few years, researchers announced more ambitious experiments, adding progressively greater numbers of qubits. By 2011, a pioneering Canadian company called D-Wave Systems announced in Nature that it had produced a 128-qubit machine; the announcement proved highly controversial, and there was a lot of debate over whether the company's machines had really demonstrated quantum behavior. Three years later, Google announced that it was hiring a team of academics (including University of California at Santa Barbara physicist John Martinis) to develop its own quantum computers based on D-Wave's approach. In March 2015, the Google team announced they were "a step closer to quantum computation," having developed a new way for qubits to detect and protect against errors. In 2016, MIT's Isaac Chuang and scientists from the University of Innsbruck unveiled a five-qubit, ion-trap quantum computer that could calculate the factors of 15; one day, a scaled-up version of this machine might evolve into the long-promised, fully fledged encryption buster.

There's no doubt that these are hugely important advances, and the signs are growing steadily more encouraging that quantum technology will eventually deliver a computing revolution. In December 2017, Microsoft unveiled a complete quantum development kit, including a new computer language, Q#, developed specifically for quantum applications. In early 2018, D-Wave announced plans to start rolling out quantum power to a cloud computing platform. A few weeks later, Google announced Bristlecone, a quantum processor based on a 72-qubit array, that might, one day, form the cornerstone of a quantum computer that could tackle real-world problems. All very exciting! Even so, it's early days for the whole field, and most researchers agree that we're unlikely to see practical quantum computers appearing for some years, and more likely several decades.

Read the original here:

Quantum computing: A simple introduction - Explain that Stuff


What is Quantum Computing? Webopedia Definition


First proposed in the 1970s, quantum computing relies on quantum physics, taking advantage of certain quantum properties of atoms or nuclei that allow them to work together as quantum bits, or qubits, serving as the computer's processor and memory. By interacting with each other while isolated from the external environment, qubits can perform certain calculations exponentially faster than conventional computers.

Qubits do not rely on the traditional binary nature of computing. While traditional computers encode information into bits using binary numbers, either a 0 or a 1, and can only do calculations on one set of numbers at once, quantum computers encode information as a series of quantum-mechanical states, such as spin directions of electrons or polarization orientations of a photon. Such a state might represent a 1 or a 0, a combination of the two, a value somewhere between 1 and 0, or a superposition of many different numbers at once.

A quantum computer can do an arbitrary reversible classical computation on all the numbers simultaneously, which a binary system cannot do, and also has some ability to produce interference between various different numbers. By doing a computation on many different numbers at once, then interfering the results to get a single answer, a quantum computer has the potential to be much more powerful than a classical computer of the same size. In using only a single processing unit, a quantum computer can naturally perform myriad operations in parallel.
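
Reversibility is the key property here, and it can be checked directly: quantum gates are unitary matrices, so applying a gate and then its conjugate transpose restores the input exactly. A short NumPy sketch with the standard two-qubit CNOT gate:

```python
import numpy as np

# Quantum gates are unitary, hence reversible: applying a gate and then
# its conjugate transpose recovers the original state exactly.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.array([0.8, 0.0, 0.0, 0.6])               # some two-qubit state
forward = CNOT @ state
print(np.allclose(CNOT.conj().T @ forward, state))   # True: fully reversible
print(np.allclose(CNOT.conj().T @ CNOT, np.eye(4)))  # True: unitarity check
```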

Quantum computing is not well suited for tasks such as word processing and email, but it is ideal for tasks such as cryptography and modeling and indexing very large databases.

Microsoft: Quantum Computing 101


See the article here:

What is Quantum Computing? Webopedia Definition

Is Quantum Computing an Existential Threat to Blockchain …

Amid steep gains in value and wild headlines, it's easy to forget cryptocurrencies and blockchain aren't yet mainstream. Even so, fans of the technology believe blockchain has too much potential not to have a major sustained impact in the future.

But as is usually the case when pondering what's ahead, nothing is certain.

When considering existential threats to blockchain and cryptocurrencies, people generally focus on increased regulation. And this makes sense. In the medium term, greater regulation may stand in the way of cryptocurrencies and wider mainstream adoption. However, there might be a bigger threat further out on the horizon.

Much of blockchain's allure arises from its security benefits. The tech allows a ledger of transactions to be distributed between a large network of computers. No single user can break into and change the ledger. This makes it both public and secure.

But combined with another emerging (and much hyped) technology, quantum computing, blockchain's seemingly immutable ledgers would be under threat.

Like blockchain, quantum computing has been making progress and headlines too.

The number of quantum computing companies and researchers continues to grow. And while there is a lot of focus on hardware, many are looking into the software as well.

Cryptography is a commonly debated topic because quantum computing poses a threat to traditional forms of computer security, most notably public key cryptography, which undergirds most online communications and most current blockchain technology.

But first, how does computer security work today?

Public key cryptography uses a pair of keys to encrypt information: a public key, which can be shared widely, and a private key known only to the key's owner. Anyone can encrypt a message using the intended receiver's public key, but only the receiver can decrypt the message using her private key. The more difficult it is to determine a private key from its corresponding public key, the more secure the system.

The best public key cryptography systems link public and private keys using the factors of a number that is the product of two incredibly large prime numbers. To determine the private key from the public key alone, one would have to figure out the factors of this product of primes. Even if a classical computer tested a trillion keys a second, it would take up to 785 million times longer than the roughly 14 billion years the universe has existed so far due to the size of the prime numbers in question.
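
A toy Python version of this scheme shows both sides of the argument. The primes here are deliberately tiny (real keys use primes hundreds of digits long), so the brute-force factoring attack at the end, hopeless at real key sizes on classical hardware, succeeds instantly.

```python
# Toy RSA with tiny primes (requires Python 3.8+ for modular inverse via pow).
p, q = 61, 53
n = p * q                      # public modulus: 3233
phi = (p - 1) * (q - 1)
e = 17                         # public exponent, coprime with phi
d = pow(e, -1, phi)            # private exponent, derivable only via p and q

msg = 42
cipher = pow(msg, e, n)        # anyone can encrypt with the public pair (e, n)
print(pow(cipher, d, n))       # 42: only the private-key owner can decrypt

# The attack: factor n to rebuild the private key. Trivial here,
# astronomically slow for real key sizes on classical hardware.
f = next(i for i in range(2, n) if n % i == 0)
print(f, n // f)               # 53 61: the attacker recovers p and q
```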

If processing power were to greatly increase, however, then it might become possible for an entity exercising such computing power to generate a private key from the corresponding public key. If actors could generate private keys from corresponding public keys, then even the strongest forms of traditional public key cryptography would be vulnerable.

This is where quantum computing comes in. Quantum computing relies on quantum physics and has more potential power than any traditional form of computing.

Quantum computing takes advantage of quantum bits or qubits that can exist in any superposition of values between 0 and 1 and can therefore process much more information than just 0 or 1, which is the limit of classical computing systems.

The capacity to compute using qubits renders quantum computers many orders of magnitude faster than classical computers. Google showed a D-Wave quantum annealing computer could be 100 million times faster than classical computers at certain specialized tasks. And Google and IBM are working on their own quantum computers.

Further, although there are but a handful of quantum computing algorithms, one of the most famous, Shor's algorithm, allows for the quick factoring of large numbers into their prime factors. Therefore, a working quantum computer could, in theory, break today's public key cryptography.

Quantum computers capable of speedy number factoring are not here yet. However, if quantum computing continues to progress, it will get there eventually. And when it does, this advance will pose an existential threat to public key cryptography, and the blockchain technology that relies on it, including Bitcoin, will be vulnerable to hacking.

So, is blockchain security therefore impossible in a post-quantum world? Will the advent of quantum computing render blockchain technology obsolete?

Maybe, but not if we can develop a solution first.

The NSA announced in 2015 that it was moving to implement quantum-resistant cryptographic systems. Cryptographers are working on quantum-resistant cryptography, and there are already blockchain projects implementing quantum-resistant cryptography. The Quantum Resistant Ledger team, for example, is working on building such a blockchain right now.

What makes quantum-resistant, or post-quantum, cryptography quantum resistant? Key pairs whose mathematical relationship is far more complex, and far harder to invert, than traditional prime factorization.

The Quantum Resistant Ledger team is working to implement hash-based cryptography, a form of post-quantum cryptography. In hash-based cryptography, the public and private keys are linked through complex hash-based cryptographic structures rather than prime number factorization. The connection between the public and private key pair is therefore much harder to exploit than in traditional public key cryptography, and would be far less vulnerable to a quantum computer running Shor's algorithm.
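
Lamport one-time signatures are perhaps the simplest hash-based scheme and make the idea concrete. This is an illustrative Python sketch of the general technique, not the Quantum Resistant Ledger's specific construction: the private key is a set of random secrets, the public key is their hashes, and signing reveals one secret per bit of the message digest. Its security rests on hash preimage resistance, which Shor's algorithm does not break.

```python
import hashlib
import secrets

def sha(data):
    return hashlib.sha256(data).digest()

# Private key: 256 pairs of random secrets. Public key: their hashes.
# Each key pair must be used for exactly one signature.
priv = [[secrets.token_bytes(32), secrets.token_bytes(32)] for _ in range(256)]
pub = [[sha(a), sha(b)] for a, b in priv]

def digest_bits(message):
    d = sha(message)
    return [(d[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(message, priv):
    # Reveal one secret per bit of the message digest.
    return [priv[i][bit] for i, bit in enumerate(digest_bits(message))]

def verify(message, sig, pub):
    return all(sha(s) == pub[i][bit]
               for i, (s, bit) in enumerate(zip(sig, digest_bits(message))))

sig = sign(b"send 1 coin to Alice", priv)
print(verify(b"send 1 coin to Alice", sig, pub))    # True
print(verify(b"send 9 coins to Mallory", sig, pub)) # False: forgery fails
```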

These post-quantum cryptographic schemes do not need to run on quantum computers. It remains to be seen how successful the Quantum Resistant Ledger and other efforts like it will prove once full-scale quantum computing becomes a practical reality.

To be clear, quantum computing threatens all computer security systems that rely on public key cryptography, not just blockchain. All security systems, including blockchain systems, need to consider post-quantum cryptography to maintain data security for their systems. But the easiest and most efficient route may be to replace traditional systems with blockchain systems that implement quantum-resistant cryptography.

Disclosure: The author owns assorted digital assets. The author is also a principal at Crypto Lotus LLC, a cryptocurrency hedge fund based out of the San Francisco Bay Area, and an advisor at Green Sands Equity, both of which have positions in various digital assets. All opinions in this post are the authors alone and not those of Singularity University, Crypto Lotus, or Green Sands Equity. This post is not an endorsement by Singularity University, Crypto Lotus, or Green Sands Equity of any asset, and you should be aware of the risk of loss before trading or holding any digital asset.

Image Credit: Morrowind / Shutterstock.com

Follow this link:

Is Quantum Computing an Existential Threat to Blockchain ...

Quantum computers – WIRED UK

Ray Orange

Google, IBM and a handful of startups are racing to create the next generation of supercomputers. Quantum computers, if they ever get started, will help us solve problems, like modelling complex chemical processes, that our existing computers can't even scratch the surface of.

But the quantum future isn't going to come easily, and there's no knowing what it'll look like when it does arrive. At the moment, companies and researchers are using a handful of different approaches to try and build the most powerful computers the world has ever seen. Here's everything you need to know about the coming quantum revolution.

Quantum computing takes advantage of the strange ability of subatomic particles to exist in more than one state at any time. Due to the way the tiniest of particles behave, operations can be done much more quickly and use less energy than classical computers.

In classical computing, a bit is a single piece of information that can exist in two states 1 or 0. Quantum computing uses quantum bits, or 'qubits' instead. These are quantum systems with two states. However, unlike a usual bit, they can store much more information than just 1 or 0, because they can exist in any superposition of these values.

D-Wave

"The difference between classical bits and qubits is that we can also prepare qubits in a quantum superposition of 0 and 1 and create nontrivial correlated states of a number of qubits, so-called 'entangled states'," says Alexey Fedorov, a physicist at the Moscow Institute of Physics and Technology.

A qubit can be thought of like an imaginary sphere. Whereas a classical bit can be in two states at either of the two poles of the sphere a qubit can be any point on the sphere. This means a computer using these bits can store a huge amount more information using less energy than a classical computer.

Until recently, it seemed like Google was leading the pack when it came to creating a quantum computer that could surpass the abilities of conventional computers. In a Nature article published in March 2017, the search giant set out ambitious plans to commercialise quantum technology in the next five years. Shortly after that, Google said it intended to achieve something it's calling quantum supremacy with a 49-qubit computer by the end of 2017.

Now, quantum supremacy, which roughly refers to the point where a quantum computer can crunch sums that a conventional computer couldn't hope to simulate, isn't exactly a widely accepted term within the quantum community. Those sceptical of Google's quantum project, or at least the way it talks about quantum computing, argue that supremacy is essentially an arbitrary goal set by Google to make it look like it's making strides in quantum when really it's just meeting self-imposed targets.

Whether it's an arbitrary goal or not, Google was pipped to the supremacy post by IBM in November 2017, when the company announced it had built a 50-qubit quantum computer. Even that, however, was far from stable: the system could only hold its quantum microstate for 90 microseconds, a record, but far short of the times needed to make quantum computing practically viable. And just because IBM has built a 50-qubit system doesn't necessarily mean it has cracked supremacy, and it definitely doesn't mean it has created a quantum computer that is anywhere near ready for practical use.

Where IBM has gone further than Google, however, is in making quantum computers commercially available. Since 2016, it has offered researchers the chance to run experiments on a five-qubit quantum computer via the cloud, and at the end of 2017 it started making its 20-qubit system available online too.

But quantum computing is by no means a two-horse race. Californian startup Rigetti is focusing on the stability of its own systems rather than just the number of qubits, and it could be the first to build a quantum computer that people can actually use. D-Wave, a company based in Vancouver, Canada, has already created what it is calling a 2,000-qubit system, although many researchers don't consider D-Wave's systems to be true quantum computers. Intel, too, has skin in the game. In February 2018 the company announced that it had found a way of fabricating quantum chips from silicon, which would make it much easier to produce chips using existing manufacturing methods.

Quantum computers operate on completely different principles to existing computers, which makes them really well suited to solving particular mathematical problems, like factoring very large numbers into primes. Since prime factorization is so important in cryptography, it's likely that quantum computers would quickly be able to crack many of the systems that keep our online information secure. Because of these risks, researchers are already trying to develop technology that is resistant to quantum hacking; on the flip side, it's possible that quantum-based cryptographic systems would be much more secure than their conventional analogues.
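To see where the factoring assumption enters, here is a toy RSA example with deliberately tiny primes (real keys use primes hundreds of digits long; Python 3.8+ is assumed for the three-argument modular inverse). Anyone who can factor the public modulus n recovers the private key, and factoring is exactly the step Shor's algorithm accelerates:

# Toy RSA: security rests entirely on the difficulty of factoring n.
p, q = 61, 53
n = p * q                      # public modulus (3233)
phi = (p - 1) * (q - 1)
e = 17                         # public exponent
d = pow(e, -1, phi)            # private exponent; computing it needs phi,
                               # and phi needs the factors p and q

msg = 42
cipher = pow(msg, e, n)
assert pow(cipher, d, n) == msg

# An attacker who factors n (trivial here, hard classically for
# 2048-bit moduli, efficient with Shor's algorithm) rebuilds d:
f = next(i for i in range(2, n) if n % i == 0)
d_attacker = pow(e, -1, (f - 1) * (n // f - 1))
assert pow(cipher, d_attacker, n) == msg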

Researchers are also excited about the prospect of using quantum computers to model complicated chemical reactions, a task that conventional supercomputers aren't very good at. In July 2016, Google engineers used a quantum device to simulate a hydrogen molecule for the first time, and since then IBM has managed to model the behaviour of even more complex molecules. Eventually, researchers hope they'll be able to use quantum simulations to design entirely new molecules for use in medicine. But the holy grail for quantum chemists is to be able to model the Haber-Bosch process, a way of artificially producing ammonia that is still relatively inefficient. Researchers are hoping that if they can use quantum mechanics to work out what's going on inside that reaction, they could discover new ways to make the process much more efficient.

Continue reading here:

Quantum computers - WIRED UK

What is quantum computing? – Definition from WhatIs.com

Quantum computing is the area of study focused on developing computer technology based on the principles of quantum theory, which explains the nature and behavior of energy and matter on the quantum (atomic and subatomic) level. Development of a quantum computer, if practical, would mark a leap forward in computing capability far greater than that from the abacus to a modern day supercomputer, with performance gains in the billion-fold realm and beyond. The quantum computer, following the laws of quantum physics, would gain enormous processing power through the ability to be in multiple states, and to perform tasks using all possible permutations simultaneously. Current centers of research in quantum computing include MIT, IBM, Oxford University, and the Los Alamos National Laboratory.

The essential elements of quantum computing originated with Paul Benioff, working at Argonne National Labs, in 1981. He theorized a classical computer operating with some quantum mechanical principles. But it is generally accepted that David Deutsch of Oxford University provided the critical impetus for quantum computing research. In 1984, he was at a computation theory conference and began to wonder about the possibility of designing a computer that was based exclusively on quantum rules, then published his breakthrough paper a few months later. With this, the race began to exploit his ideas. However, before we delve into what he started, it is beneficial to have a look at the background of the quantum world.

Quantum theory's development began in 1900 with a presentation by Max Planck to the German Physical Society, in which he introduced the idea that energy exists in individual units (which he called "quanta"), as does matter. Further developments by a number of scientists over the following thirty years led to the modern understanding of quantum theory.

Niels Bohr proposed the Copenhagen interpretation of quantum theory, which asserts that a particle is whatever it is measured to be (for example, a wave or a particle) but that it cannot be assumed to have specific properties, or even to exist, until it is measured. In short, Bohr was saying that objective reality does not exist. This translates to a principle called superposition that claims that while we do not know what the state of any object is, it is actually in all possible states simultaneously, as long as we don't look to check.

To illustrate this theory, we can use the famous and somewhat cruel analogy of Schrödinger's cat. First, we have a living cat and place it in a thick lead box. At this stage, there is no question that the cat is alive. We then throw in a vial of cyanide and seal the box. We do not know if the cat is alive or if it has broken the cyanide capsule and died. Since we do not know, the cat is both dead and alive, according to quantum law - in a superposition of states. It is only when we break open the box and see what condition the cat is in that the superposition is lost, and the cat must be either alive or dead.

The second interpretation of quantum theory is the multiverse or many-worlds theory. It holds that as soon as a potential exists for any object to be in any state, the universe of that object transmutes into a series of parallel universes equal to the number of possible states in which the object can exist, with each universe containing a unique single possible state of that object. Furthermore, there is a mechanism for interaction between these universes that somehow permits all states to be accessible in some way and for all possible states to be affected in some manner. Stephen Hawking and the late Richard Feynman are among the scientists who have expressed a preference for the many-worlds theory.

Whichever interpretation one chooses, the principle that, in some way, one particle can exist in numerous states opens up profound implications for computing.

Classical computing relies, at its ultimate level, on principles expressed by Boolean algebra, usually operating with a seven-mode logic gate principle, though just three modes (AND, NOT, and COPY) suffice. Data must be processed in an exclusive binary state at any point in time: either 0 (off/false) or 1 (on/true). These values are binary digits, or bits. The millions of transistors and capacitors at the heart of computers can only be in one state at any point. While the time each transistor or capacitor needs to hold a 0 or 1 before switching states is now measurable in billionths of a second, there is still a limit to how quickly these devices can be made to switch state. As we progress to smaller and faster circuits, we begin to reach the physical limits of materials and the threshold below which the classical laws of physics no longer apply. Beyond this, the quantum world takes over, which opens up potential as great as the challenges it presents.
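As a concrete aside, that three-mode set is easy to write down in code; the comment on COPY flags a contrast with qubits that the next paragraphs build on:

# The minimal classical gate set named above, acting on bits 0/1.
AND  = lambda a, b: a & b
NOT  = lambda a: 1 - a
COPY = lambda a: (a, a)   # fan-out: trivial for bits, but forbidden for
                          # qubits by the no-cloning theorem

assert AND(1, NOT(0)) == 1
assert COPY(1) == (1, 1)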

The quantum computer, by contrast, can work with a two-mode logic gate set: XOR and a mode we'll call QO1 (the ability to change 0 into a superposition of 0 and 1, a logic gate which cannot exist in classical computing). In a quantum computer, a number of elemental particles such as electrons or photons can be used (in practice, success has also been achieved with ions), with either their charge or polarization acting as a representation of 0 and/or 1. Each of these particles is known as a quantum bit, or qubit; the nature and behavior of these particles form the basis of quantum computing. The two most relevant aspects of quantum physics are the principles of superposition and entanglement.
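A hedged sketch of such gates as matrices acting on state vectors: X is the standard quantum NOT gate, and the Hadamard gate H is the textbook realization of the superposition-creating "QO1" mode described above (the reversible, two-qubit form of XOR is the CNOT gate, which appears in the entanglement sketch further down):

import numpy as np

ket0 = np.array([1, 0], dtype=complex)   # the |0> state

X = np.array([[0, 1], [1, 0]], dtype=complex)                # quantum NOT
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard

print(X @ ket0)         # [0, 1]: the |1> state
print(H @ ket0)         # [0.707, 0.707]: equal superposition of 0 and 1
print(H @ (H @ ket0))   # back to |0>: quantum gates are reversible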

Think of a qubit as an electron in a magnetic field. The electron's spin may be either in alignment with the field, which is known as a spin-up state, or opposite to the field, which is known as a spin-down state. Changing the electron's spin from one state to another is achieved by using a pulse of energy, such as from a laser - let's say that we use 1 unit of laser energy. But what if we only use half a unit of laser energy and completely isolate the particle from all external influences? According to quantum law, the particle then enters a superposition of states, in which it behaves as if it were in both states simultaneously. Each qubit utilized could take a superposition of both 0 and 1. Thus, the number of states a quantum computer can work with in parallel is 2^n, where n is the number of qubits used. A quantum computer composed of 500 qubits would have the potential to work on 2^500 states in a single step. This is an awesome number: 2^500 is far more than the number of atoms in the known universe (and this is true parallel processing; classical computers today, even so-called parallel processors, still only truly do one thing at a time - there are just two or more of them doing it). But how will these particles interact with each other? They would do so via quantum entanglement.
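Before moving on to entanglement, the 2^n claim is easy to check numerically: each added qubit doubles the length of the state vector needed to describe the system. A short numpy sketch:

import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
plus = H @ np.array([1.0, 0.0])      # one qubit in equal superposition

state = np.array([1.0])
for n in range(1, 11):
    state = np.kron(state, plus)     # append one more superposed qubit
    print(f"{n} qubits -> {state.size} amplitudes")   # 2, 4, 8, ... 1024

At 500 qubits the vector would need 2^500 entries, which is why such machines cannot be simulated classically and must be built instead.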

Entanglement: Particles (such as photons, electrons, or qubits) that have interacted at some point retain a type of connection and can be entangled with each other in pairs, in a process known as correlation. Knowing the spin state of one entangled particle - up or down - allows one to know that the spin of its mate is in the opposite direction. Even more amazing is the knowledge that, due to the phenomenon of superposition, the measured particle has no single spin direction before being measured, but is simultaneously in both a spin-up and spin-down state. The spin state of the particle being measured is decided at the time of measurement and immediately reflected in its correlated partner, which assumes the opposite spin direction. This is a real phenomenon (Einstein called it "spooky action at a distance"), the mechanism of which cannot, as yet, be explained by any theory - it simply must be taken as given. Quantum entanglement allows qubits separated by incredible distances to exhibit instantly correlated measurement results, although no usable information travels faster than light. No matter how great the distance between the correlated particles, they will remain entangled as long as they are isolated.
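A minimal simulation of an entangled pair: a Hadamard followed by a CNOT prepares the standard Bell state (|00> + |11>)/sqrt(2). Note that this particular state gives perfectly matching outcomes; the spin example above describes the anti-correlated (singlet) variant of the same phenomenon:

import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],       # two-qubit reversible XOR:
                 [0, 1, 0, 0],       # flips the second qubit when
                 [0, 0, 0, 1],       # the first qubit is 1
                 [0, 0, 1, 0]])

state = np.zeros(4); state[0] = 1            # start in |00>
state = CNOT @ np.kron(H, I) @ state         # (|00> + |11>)/sqrt(2)

probs = np.abs(state) ** 2                   # over outcomes 00, 01, 10, 11
samples = np.random.default_rng(0).choice(
    ["00", "01", "10", "11"], p=probs, size=10)
print(samples)   # only "00" and "11" ever appear: each outcome is random
                 # on its own, but the pair always agrees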

Taken together, quantum superposition and entanglement create enormously enhanced computing power. Where a 2-bit register in an ordinary computer can store only one of four binary configurations (00, 01, 10, or 11) at any given time, a 2-qubit register in a quantum computer can hold all four numbers simultaneously, because each qubit can represent both values at once. As more qubits are added, the capacity expands exponentially.

Perhaps even more intriguing than the sheer power of quantum computing is the ability that it offers to write programs in a completely new way. For example, a quantum computer could incorporate a programming sequence that would be along the lines of "take all the superpositions of all the prior computations" - something which is meaningless with a classical computer - which would permit extremely fast ways of solving certain mathematical problems, such as factorization of large numbers, one example of which we discuss below.

There have been two notable successes thus far with quantum programming. The first came in 1994, when Peter Shor (then at AT&T's Bell Labs) developed a quantum algorithm that could efficiently factorize large numbers. It centers on using number theory to find the period of a sequence derived from the number being factored. The other major breakthrough came from Lov Grover of Bell Labs in 1996: a very fast algorithm proven to be the fastest possible for searching through unstructured databases. The algorithm is so efficient that it requires only about √N probes on average (where N is the total number of elements) to find the desired result, as opposed to a classical search, which needs N/2 probes on average.
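Grover's square-root speed-up is small enough to simulate classically for modest N. The sketch below implements the algorithm's two alternating steps (an oracle sign-flip on the marked item, then "inversion about the mean"), with N = 1024 and an arbitrary marked index chosen purely for illustration:

import numpy as np

N = 2 ** 10                     # size of the unstructured search space
marked = 423                    # the item we are searching for

state = np.full(N, 1 / np.sqrt(N))        # uniform superposition

steps = int(np.pi / 4 * np.sqrt(N))       # 25 steps, versus ~N/2 = 512
for _ in range(steps):                    # probes for a classical search
    state[marked] *= -1                   # oracle: flip the marked sign
    state = 2 * state.mean() - state      # diffusion: invert about mean

print(f"{steps} Grover steps; P(marked) = {abs(state[marked])**2:.4f}")
# prints a probability close to 1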

The above sounds promising, but there are tremendous obstacles still to be overcome, chief among them keeping qubits isolated from any outside interference (decoherence) and correcting the errors that inevitably creep in when they are not.

Even though there are many problems to overcome, the breakthroughs of the last 15 years, and especially of the last three, have made some form of practical quantum computing look feasible, though there is much debate as to whether it is less than a decade away or a hundred years into the future. The potential this technology offers is attracting tremendous interest from both government and the private sector. Military applications include the ability to break encryption keys via brute-force searches, while civilian applications range from DNA modeling to complex materials-science analysis. It is this potential that is rapidly breaking down the barriers to this technology, but whether all barriers can be broken, and when, is very much an open question.

Read the original post:

What is quantum computing? - Definition from WhatIs.com