{"id":191558,"date":"2017-05-06T04:07:01","date_gmt":"2017-05-06T08:07:01","guid":{"rendered":"http:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/quantum-computing-a-simple-introduction-explain-that-stuff\/"},"modified":"2017-05-06T04:07:01","modified_gmt":"2017-05-06T08:07:01","slug":"quantum-computing-a-simple-introduction-explain-that-stuff","status":"publish","type":"post","link":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/quantum-computing\/quantum-computing-a-simple-introduction-explain-that-stuff\/","title":{"rendered":"Quantum computing: A simple introduction &#8211; Explain that Stuff"},"content":{"rendered":"<p><p>    by Chris Woodford.    Last updated: February 18, 2017.  <\/p>\n<p>    How can you get more and more out of less and less? The smaller computers get, the more powerful they seem to become: there's more number-crunching ability in a 21st-century cellphone than you'd have found in a room-sized, military computer 50 years ago. Yet, despite such amazing advances, there are still plenty of complex problems that are beyond the reach of even the world's most powerful computers, and there's no guarantee we'll ever be able to tackle them. One problem is that the basic switching and memory units of computers, known as transistors, are now approaching the point where they'll soon be as small as individual atoms. If we want computers that are smaller and more powerful than today's, we'll soon need to do our computing in a radically different way. Entering the realm of atoms opens up powerful new possibilities in the shape of quantum computing, with processors that could work millions of times faster than the ones we use today.
Sounds amazing, but the trouble is that quantum computing is hugely more complex than traditional computing and operates in the Alice in Wonderland world of quantum physics, where the \"classical,\" sensible, everyday laws of physics no longer apply. What is quantum computing and how does it work? Let's take a closer look!  <\/p>\n<p>    Photo: Quantum computing means storing and processing information using individual atoms, ions, electrons, or photons. On the plus side, this opens up the possibility of faster computers, but the drawback is the greater complexity of designing computers that can operate in the weird world of quantum physics. Photo courtesy of US Department of Energy.  <\/p>\n<p>    You probably think of a computer as a neat little gadget that sits on your lap and lets you send emails, shop online, chat to your friends, or play games, but it's much more and much less than that. It's more, because it's a completely general-purpose machine: you can make it do virtually anything you like. It's less, because inside it's little more than an extremely basic calculator, following a prearranged set of instructions called a program. Like the Wizard of Oz, the amazing things you see in front of you conceal some pretty mundane stuff under the covers.  <\/p>\n<\/p>\n<p>    Photo: This is what one transistor from a typical radio circuit board looks like. In computers, the transistors are much smaller than this and millions of them are packaged together onto microchips.  <\/p>\n<p>    Conventional computers have two tricks that they do really well: they can store numbers in memory and they can process stored numbers with simple mathematical operations (like add and subtract). They can do more complex things by stringing together the simple operations into a series called an algorithm (multiplying can be done as a series of additions, for example). 
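The idea that a complex operation like multiplication can be built from a series of simpler ones can be sketched in a few lines of Python. This is purely an illustration of the "algorithm as a series of simple steps" idea, not how real hardware multiplies:

```python
def multiply(a, b):
    """Multiply two non-negative integers using nothing but
    repeated addition, showing how a more complex operation
    can be built from a series of simpler ones."""
    total = 0
    for _ in range(b):
        total += a  # one simple operation, repeated b times
    return total

print(multiply(6, 7))  # 42
```

Real processors use far faster circuits, but the principle (complex behavior assembled from simple, repeatable steps) is the same.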
Both of a computer's key tricks (storage and processing) are accomplished using switches called transistors, which are like microscopic versions of the switches you have on your wall for turning on and off the lights. A transistor can either be on or off, just as a light can either be lit or unlit. If it's on, we can use a transistor to store a number one (1); if it's off, it stores a number zero (0). Long strings of ones and zeros can be used to store any number, letter, or symbol using a code based on binary (so computers store an upper-case letter A as 01000001 and a lower-case one as 01100001). Each of the zeros or ones is called a binary digit (or bit) and, with a string of eight bits, you can store 256 different characters (such as A-Z, a-z, 0-9, and most common symbols). Computers calculate by using circuits called logic gates, which are made from a number of transistors connected together. Logic gates compare patterns of bits, stored in temporary memories called registers, and then turn them into new patterns of bits, and that's the computer equivalent of what our human brains would call addition, subtraction, or multiplication. In physical terms, the algorithm that performs a particular calculation takes the form of an electronic circuit made from a number of logic gates, with the output from one gate feeding in as the input to the next.  <\/p>\n<p>    The trouble with conventional computers is that they depend on conventional transistors. This might not sound like a problem if you go by the amazing progress made in electronics over the last few decades. When the transistor was invented, back in 1947, the switch it replaced (which was called the vacuum tube) was about as big as one of your thumbs. 
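The character-to-bits mapping described earlier is the standard ASCII encoding, and it can be checked directly in Python using only the built-in `ord` and `format` functions:

```python
# Each character maps to an 8-bit pattern (ASCII encoding); 8 bits
# give 2**8 = 256 distinct patterns, enough for A-Z, a-z, 0-9, and
# the most common symbols.
for ch in "Aa":
    print(ch, format(ord(ch), "08b"))  # A 01000001, a 01100001

print(2 ** 8)  # 256 distinct 8-bit patterns
```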
Now, a state-of-the-art microprocessor (single-chip computer) packs hundreds of millions (and up to two billion) transistors onto a chip of silicon the size of your fingernail! Chips like these, which are called integrated circuits, are an incredible feat of miniaturization. Back in the 1960s, Intel co-founder Gordon Moore realized that the power of computers doubles roughly every 18 months, and it's been doing so ever since. This apparently unshakeable trend is known as Moore's Law.  <\/p>\n<\/p>\n<p>    Photo: This memory chip from a typical USB stick contains an integrated circuit that can store 512 megabytes of data. That's roughly 500 million characters (536,870,912 to be exact), each of which needs eight binary digits, so we're talking about 4 billion (4,000 million) transistors in all (4,294,967,296 if you're being picky) packed into an area the size of a postage stamp!  <\/p>\n<p>    It sounds amazing, and it is, but it misses the point. The more information you need to store, the more binary ones and zeros (and transistors) you need to do it. Since most conventional computers can only do one thing at a time, the more complex the problem you want them to solve, the more steps they'll need to take and the longer they'll need to do it. Some computing problems are so complex that they need more computing power and time than any modern machine could reasonably supply; computer scientists call those intractable problems.  <\/p>\n<p>    As Moore's Law advances, so the number of intractable problems diminishes: computers get more powerful and we can do more with them. The trouble is, transistors are just about as small as we can make them: we're getting to the point where the laws of physics seem likely to put a stop to Moore's Law. Unfortunately, there are still hugely difficult computing problems we can't tackle because even the most powerful computers find them intractable. 
That's one of the reasons why people are now getting interested in quantum computing.  <\/p>\n<p>    Quantum theory is the branch of physics that deals with the world of atoms and the smaller (subatomic) particles inside them. You might think atoms behave the same way as everything else in the world, in their own tiny little way, but that's not true: on the atomic scale, the rules change and the \"classical\" laws of physics we take for granted in our everyday world no longer automatically apply. As Richard P. Feynman, one of the greatest physicists of the 20th century, once put it: \"Things on a very small scale behave like nothing you have any direct experience about... or like anything that you have ever seen.\" (Six Easy Pieces, p116.)  <\/p>\n<p>    If you've studied light, you may already know a bit about quantum theory. You might know that a beam of light sometimes behaves as though it's made up of particles (like a steady stream of cannonballs), and sometimes as though it's waves of energy rippling through space (a bit like waves on the sea). That's called wave-particle duality and it's one of the ideas that comes to us from quantum theory. It's hard to grasp that something can be two things at once (a particle and a wave) because it's totally alien to our everyday experience: a car is not simultaneously a bicycle and a bus. In quantum theory, however, that's just the kind of crazy thing that can happen. The most striking example of this is the baffling riddle known as Schrödinger's cat. Briefly, in the weird world of quantum theory, we can imagine a situation where something like a cat could be alive and dead at the same time!  <\/p>\n<p>    What does all this have to do with computers? 
Suppose we keep on pushing Moore's Law: keep on making transistors smaller until they get to the point where they obey not the ordinary laws of physics (like old-style transistors) but the more bizarre laws of quantum mechanics. The question is whether computers designed this way can do things our conventional computers can't. If we can predict mathematically that they might be able to, can we actually make them work like that in practice?  <\/p>\n<p>    People have been asking those questions for several decades. Among the first were IBM research physicists Rolf Landauer and Charles H. Bennett. Landauer opened the door for quantum computing in the 1960s when he proposed that information is a physical entity that could be manipulated according to the laws of physics. One important consequence of this is that computers waste energy manipulating the bits inside them (which is partly why computers use so much energy and get so hot, even though they appear to be doing not very much at all). In the 1970s, building on Landauer's work, Bennett showed how a computer could circumvent this problem by working in a \"reversible\" way, implying that a quantum computer could carry out massively complex computations without using massive amounts of energy. In 1981, physicist Paul Benioff from Argonne National Laboratory tried to envisage a basic machine that would work in a similar way to an ordinary computer but according to the principles of quantum physics. The following year, Richard Feynman sketched out roughly how a machine using quantum principles could carry out basic computations. A few years later, Oxford University's David Deutsch (one of the leading lights in quantum computing) outlined the theoretical basis of a quantum computer in more detail. How did these great scientists imagine that quantum computers might work?  
<\/p>\n<p>    The key features of an ordinary computer (bits, registers, logic gates, algorithms, and so on) have analogous features in a quantum computer. Instead of bits, a quantum computer has quantum bits or qubits, which work in a particularly intriguing way. Where a bit can store either a zero or a one, a qubit can store a zero, a one, both zero and one, or an infinite number of values in between, and be in multiple states (store multiple values) at the same time! If that sounds confusing, think back to light being a particle and a wave at the same time, Schrödinger's cat being alive and dead, or a car being a bicycle and a bus. A gentler way to think of the numbers qubits store is through the physics concept of superposition (where two waves add to make a third one that contains both of the originals). If you blow on something like a flute, the pipe fills up with a standing wave: a wave made up of a fundamental frequency (the basic note you're playing) and lots of overtones or harmonics (higher-frequency multiples of the fundamental). The wave inside the pipe contains all these waves simultaneously: they're added together to make a combined wave that includes them all. Qubits use superposition to represent multiple states (multiple numeric values) simultaneously in a similar way.  <\/p>\n<p>    Just as a quantum computer can store multiple numbers at once, so it can process them simultaneously. Instead of working in serial (doing a series of things one at a time in a sequence), it can work in parallel (doing multiple things at the same time). Only when you try to find out what state it's actually in at any given moment (by measuring it, in other words) does it \"collapse\" into one of its possible states, and that gives you the answer to your problem. 
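A crude way to get a feel for measurement and "collapse" is to simulate a single qubit classically. The sketch below is purely illustrative (a real qubit is not just a random bit, and the equal superposition chosen here is an assumption for the example): a qubit is described by two amplitudes, and measuring it yields 0 or 1 with probabilities given by the squares of those amplitudes.

```python
import random

# A qubit's state is a pair of amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1. Measurement "collapses" the state to 0
# with probability |alpha|^2, or to 1 with probability |beta|^2.
def measure(alpha, beta):
    return 0 if random.random() < abs(alpha) ** 2 else 1

alpha = beta = 2 ** -0.5  # an equal superposition of 0 and 1
outcomes = [measure(alpha, beta) for _ in range(10_000)]
print(sum(outcomes) / len(outcomes))  # roughly 0.5
```

Each measurement gives a single definite answer; the superposition only shows up statistically, over many repeated runs, which is part of what makes designing quantum algorithms so subtle.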
Estimates suggest a quantum computer's ability to work in parallel would make it millions of times faster than any conventional computer... if only we could build it! So how would we do that?  <\/p>\n<p>    In reality, qubits would have to be stored by atoms, ions (atoms with too many or too few electrons), or even smaller things such as electrons and photons (energy packets), so a quantum computer would be almost like a table-top version of the kind of particle physics experiments they do at Fermilab or CERN! Granted, you wouldn't be racing particles round giant loops and smashing them together, but you would need mechanisms for containing atoms, ions, or subatomic particles, for putting them into certain states (so you can store information), knocking them into other states (so you can make them process information), and figuring out what their states are after particular operations have been performed.  <\/p>\n<\/p>\n<p>    Photo: A single atom can be trapped in an optical cavity (the space between mirrors) and controlled by precise pulses from laser beams.  <\/p>\n<p>    In practice, there are lots of possible ways of containing atoms and changing their states using laser beams, electromagnetic fields, radio waves, and an assortment of other techniques. One method is to make qubits using quantum dots, which are nanoscopically tiny particles of semiconductors inside which individual charge carriers, electrons and holes (missing electrons), can be controlled. Another method makes qubits from what are called ion traps: you add or take away electrons from an atom to make an ion, hold it steady in a kind of laser spotlight (so it's locked in place like a nanoscopic rabbit dancing in a very bright headlight), and then flip it into different states with laser pulses. In another technique, the qubits are photons inside optical cavities (spaces between extremely tiny mirrors). 
Don't worry if you don't understand; not many people do! Since the entire field of quantum computing is still largely abstract and theoretical, the only thing we really need to know is that qubits are stored by atoms or other quantum-scale particles that can exist in different states and be switched between them.  <\/p>\n<p>    Although people often assume that quantum computers must automatically be better than conventional ones, that's by no means certain. So far, just about the only thing we know for certain that a quantum computer could do better than a normal one is factorisation: finding two unknown prime numbers that, when multiplied together, give a third, known number. In 1994, while working at Bell Laboratories, mathematician Peter Shor demonstrated an algorithm that a quantum computer could follow to find the \"prime factors\" of a large number, which would speed up the process enormously. Shor's algorithm really excited interest in quantum computing because virtually every modern computer (and every secure, online shopping and banking website) uses public-key encryption technology based on the virtual impossibility of finding prime factors quickly (it is, in other words, essentially an \"intractable\" computer problem). If quantum computers could indeed factor large numbers quickly, today's online security could be rendered obsolete at a stroke.  <\/p>\n<p>    Does that mean quantum computers are better than conventional ones? Not exactly. Apart from Shor's algorithm, and a search method called Grover's algorithm, hardly any other algorithms have been discovered that would be better performed by quantum methods. Given enough time and computing power, conventional computers should still be able to solve any problem that quantum computers could solve, eventually. 
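To see why factorisation is believed to be hard classically, here is the textbook trial-division approach (a sketch of the conventional method, not Shor's algorithm): the work grows roughly with the square root of n, which becomes hopeless for the hundreds-of-digits numbers used in public-key encryption.

```python
def prime_factors(n):
    """Return the prime factors of n by trial division, smallest first."""
    factors = []
    d = 2
    while d * d <= n:       # only need to test divisors up to sqrt(n)
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:               # whatever remains is itself prime
        factors.append(n)
    return factors

print(prime_factors(15))  # [3, 5]
```

Factoring 15 into 3 and 5 is instant here; the point is that this style of search scales so badly with the size of n that a large enough number defeats any conventional machine.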
In other words, it remains to be proven that quantum computers are generally superior to conventional ones, especially given the difficulties of actually building them. Who knows how conventional computers might advance in the next 50 years, potentially making the idea of quantum computers irrelevant, or even absurd.  <\/p>\n<\/p>\n<p>    Photo: Quantum dots are probably best known as colorful nanoscale crystals, but they can also be used as qubits in quantum computers. Photo courtesy of Argonne National Laboratory.  <\/p>\n<p>    Three decades after they were first proposed, quantum computers remain largely theoretical. Even so, there's been some encouraging progress toward realizing a quantum machine. There were two impressive breakthroughs in 2000. First, Isaac Chuang (now an MIT professor, but then working at IBM's Almaden Research Center) used five fluorine atoms to make a crude, five-qubit quantum computer. The same year, researchers at Los Alamos National Laboratory figured out how to make a seven-qubit machine using a drop of liquid. Five years later, researchers at the University of Innsbruck added an extra qubit and produced the first quantum computer that could manipulate a qubyte (eight qubits).  <\/p>\n<p>    These were tentative but important first steps. Over the next few years, researchers announced more ambitious experiments, adding progressively greater numbers of qubits. In 2011, a pioneering Canadian company called D-Wave Systems announced in Nature that it had produced a 128-qubit machine. Three years later, Google announced that it was hiring a team of academics (including University of California at Santa Barbara physicist John Martinis) to develop its own quantum computers based on D-Wave's approach. 
In March 2015, the Google team announced they were \"a step closer to quantum computation,\" having developed a new way for qubits to detect and protect against errors. In 2016, MIT's Isaac Chuang and scientists from the University of Innsbruck unveiled a five-qubit, ion-trap quantum computer that could calculate the factors of 15; one day, a scaled-up version of this machine might evolve into the long-promised, fully fledged encryption buster! There's no doubt that these are hugely important advances. Even so, it's very early days for the whole field, and most researchers agree that we're unlikely to see practical quantum computers appearing for many years, perhaps even decades.  <\/p>\n<p>View original post here: <\/p>\n<p><a target=\"_blank\" rel=\"nofollow\" href=\"http:\/\/www.explainthatstuff.com\/quantum-computing.html\" title=\"Quantum computing: A simple introduction - Explain that Stuff\">Quantum computing: A simple introduction - Explain that Stuff<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p> by Chris Woodford. Last updated: February 18, 2017. 
How can you get more and more out of less and less <a href=\"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/quantum-computing\/quantum-computing-a-simple-introduction-explain-that-stuff\/\">Continue reading <span class=\"meta-nav\">&rarr;<\/span><\/a><\/p>\n","protected":false},"author":6,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[257742],"tags":[],"class_list":["post-191558","post","type-post","status-publish","format-standard","hentry","category-quantum-computing"],"_links":{"self":[{"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/posts\/191558"}],"collection":[{"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/users\/6"}],"replies":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/comments?post=191558"}],"version-history":[{"count":0,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/posts\/191558\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/media?parent=191558"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/categories?post=191558"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/tags?post=191558"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}