

What is quantum computing? The next era of computational evolution, explained – Digital Trends

When you first stumble across the term quantum computer, you might pass it off as some far-flung science fiction concept rather than a serious current news item.

But with the phrase being thrown around with increasing frequency, it's understandable to wonder exactly what quantum computers are, and just as understandable to be at a loss as to where to dive in. Here's the rundown on what quantum computers are, why there's so much buzz around them, and what they might mean for you.

All computing relies on bits, the smallest unit of information that is encoded as an on state or an off state, more commonly referred to as a 1 or a 0, in some physical medium or another.

Most of the time, a bit takes the physical form of an electrical signal traveling over the circuits in the computer's motherboard. By stringing multiple bits together, we can represent more complex and useful information, such as text and music.
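To make that concrete, here is a tiny Python sketch (ours, not the article's) of how a piece of text becomes a string of bits and back again:

```python
# Each character maps to a number, and each number to a string of eight bits.
text = "Hi"
bits = "".join(format(byte, "08b") for byte in text.encode("ascii"))
print(bits)  # 0100100001101001 -- two 8-bit bytes for two characters

# Reversing the process recovers the original text from the raw 1s and 0s.
recovered = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8)).decode("ascii")
print(recovered)  # Hi
```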

The two key differences between quantum bits and classical bits (from the computers we use today) are the physical form the bits take and, correspondingly, the nature of data encoded in them. The electrical bits of a classical computer can only exist in one state at a time, either 1 or 0.

Quantum bits (or qubits) are made of subatomic particles, namely individual photons or electrons. Because these subatomic particles conform more to the rules of quantum mechanics than classical mechanics, they exhibit the bizarre properties of quantum particles. The most salient of these properties for computer scientists is superposition. This is the idea that a particle can exist in multiple states simultaneously, at least until that state is measured and collapses into a single state. By harnessing this superposition property, computer scientists can make qubits encode a 1 and a 0 at the same time.
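The bookkeeping behind that statement is surprisingly small. Here is a minimal classical simulation (an illustrative sketch, not real quantum hardware) of a qubit in an equal superposition:

```python
import numpy as np

# A qubit state is a 2-element vector of amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1. This one is an equal superposition of 0 and 1.
state = np.array([1, 1]) / np.sqrt(2)
probabilities = np.abs(state) ** 2          # [0.5, 0.5]

# "Measuring" collapses the superposition: we read out one classical bit.
rng = np.random.default_rng()
print(rng.choice([0, 1], p=probabilities))  # 0 or 1, each half the time
```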

The other quantum mechanical quirk that makes quantum computers tick is entanglement, a linking of two quantum particles or, in this case, two qubits. When the two particles are entangled, the change in state of one particle will alter the state of its partner in a predictable way, which comes in handy when it comes time to get a quantum computer to calculate the answer to the problem you feed it.
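In the same vein, entanglement shows up as perfectly correlated readouts. A companion sketch (again, just a classical simulation for illustration) of the textbook two-qubit Bell state:

```python
import numpy as np

# Two-qubit state over the basis |00>, |01>, |10>, |11>.
# The Bell state (|00> + |11>)/sqrt(2) entangles the two qubits.
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
probabilities = np.abs(bell) ** 2   # 50% |00>, 0% |01>, 0% |10>, 50% |11>

rng = np.random.default_rng()
for _ in range(5):
    print(rng.choice(["00", "01", "10", "11"], p=probabilities))
    # Always "00" or "11": reading one qubit fixes what the other will show.
```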

A quantum computer's qubits start in their 1-and-0 hybrid state as the computer begins crunching through a problem. When the solution is found, the qubits in superposition collapse into a definite pattern of stable 1s and 0s that encodes the solution.

Aside from the fact that they are far beyond the reach of all but the most elite research teams (and will likely stay that way for a while), most of us don't have much use for quantum computers. They don't offer any real advantage over classical computers for the kinds of tasks we do most of the time.

However, even the most formidable classical supercomputers have a hard time cracking certain problems due to their inherent computational complexity. This is because some calculations can only be achieved by brute force, guessing until the answer is found. Such problems have so many possible solutions that it would take thousands of years for all the world's supercomputers combined to find the correct one.

The superposition property exhibited by qubits can allow quantum computers to cut this guessing time down precipitously. Classical computing's laborious trial-and-error computations can only ever make one guess at a time, while the dual 1-and-0 state of a quantum computer's qubits lets it, in effect, make multiple guesses at the same time.

So, what kind of problems require all this time-consuming guesswork? One example is simulating atomic structures, especially when they interact chemically with one another. With a quantum computer powering the atomic modeling, researchers in materials science could create new compounds for use in engineering and manufacturing. Quantum computers are well suited to simulating similarly intricate systems like economic market forces, astrophysical dynamics, or genetic mutation patterns in organisms, to name only a few.

Amidst all these generally inoffensive applications of this emerging technology, though, there are also some uses of quantum computers that raise serious concerns. By far the most frequently cited harm is the potential for quantum computers to break some of the strongest encryption algorithms currently in use.

In the hands of an aggressive foreign government adversary, quantum computers could compromise a broad swath of otherwise secure internet traffic, leaving sensitive communications susceptible to widespread surveillance. Work is underway to mature encryption ciphers based on calculations that remain hard even for quantum computers, but these schemes are not all ready for prime time, nor widely adopted at present.

A little over a decade ago, the fabrication of quantum computers was barely in its incipient stages. Starting in the 2010s, though, development of functioning prototypes took off. Several companies had assembled working quantum computers as of a few years ago, with IBM going so far as to let researchers and hobbyists run their own programs on its machines via the cloud.

Despite the strides that companies like IBM have undoubtedly made to build functioning prototypes, quantum computers are still in their infancy. The quantum computers that research teams have constructed so far require a lot of overhead for error correction. For every qubit that actually performs a calculation, there are several dozen whose job is to compensate for its mistakes. The aggregate of all these qubits makes up what is called a logical qubit.

Long story short, industry and academic titans have gotten quantum computers to work, but they do so very inefficiently.

Fierce competition between quantum computer researchers is still raging, between big and small players alike. Among those who have working quantum computers are the traditionally dominant tech companies one would expect: IBM, Intel, Microsoft, and Google.

As exacting and costly a venture as creating a quantum computer is, there are a surprising number of smaller companies, and even startups, that are rising to the challenge.

The comparatively lean D-Wave Systems has spurred many advances in the field, and proved it was not out of contention by answering Google's momentous announcement with news of a huge deal with Los Alamos National Laboratory. Smaller competitors like Rigetti Computing are also in the running to establish themselves as quantum computing innovators.

Depending on who you ask, you'll get a different frontrunner for the most powerful quantum computer. Google certainly made its case recently with its achievement of quantum supremacy, a milestone that Google itself more or less defined. Quantum supremacy is the point at which a quantum computer is first able to outperform a classical computer at some computation. Google's Sycamore prototype, equipped with 54 qubits, broke that barrier by zipping through a problem in just under three and a half minutes that, by Google's estimate, would take the mightiest classical supercomputer 10,000 years to churn through.

Not to be outdone, D-Wave boasts that the devices it will soon be supplying to Los Alamos weigh in at 5,000 qubits apiece, although it should be noted that the quality of D-Wave's qubits has been called into question before. IBM hasn't made the same kind of splash as Google and D-Wave in the last couple of years, but it shouldn't be counted out yet, either, especially considering its track record of slow and steady accomplishments.

Put simply, the race for the worlds most powerful quantum computer is as wide open as it ever was.

The short answer to this is not really, at least for the near-term future. Quantum computers require an immense amount of equipment and finely tuned environments to operate. The leading architecture requires cooling to mere fractions of a degree above absolute zero, meaning they are nowhere near practical for ordinary consumers to ever own.

But as the explosion of cloud computing has proven, you don't need to own a specialized computer to harness its capabilities. As mentioned above, IBM is already offering daring technophiles the chance to run programs on a small subset of its Q System One's qubits. In time, IBM and its competitors will likely sell compute time on more robust quantum computers for those interested in applying them to otherwise inscrutable problems.

But if you aren't researching the kinds of exceptionally tricky problems that quantum computers aim to solve, you probably won't interact with them much. In fact, quantum computers are in some cases worse at the sort of tasks we use computers for every day, purely because quantum computers are so hyper-specialized. Unless you are an academic running the kind of modeling where quantum computing thrives, you'll likely never get your hands on one, and never need to.


Quantum Computing beginning talks with clients on its quantum asset allocation application – Proactive Investors USA & Canada

Quantum Computing Inc Vice President of Product Development Steve Reinhardt tells Proactive the Virginia-based company is beginning conversations with early users of its quantum asset allocation application, and is further refining the tool.

Reinhardt recently attended the Qubits North America Users Conference in Newport, Rhode Island to discuss innovations in quantum computing.



Detecting Environmental ‘Noise’ That Can Damage The Quantum State of Qubits – In Compliance


Scientists from MIT and Dartmouth College have created a new type of tool that can detect specific characteristics of the environmental noise known for its ability to destroy qubits. This invention could give researchers greater insight into microscopic noise mechanisms and how they affect the quantum state of qubits, which are fundamental building blocks in the construction of quantum computers. By learning new ways to protect qubits, scientists hope to make further advances in quantum computing.

Qubits generally represent the two states that correspond to classic binary bits, a 0 or a 1. However, qubits can also maintain both states at the same time, which is called a quantum superposition. This state is required for quantum computers to operate at such high speeds, performing complicated tasks quickly and easily. As such, it is important that qubits be protected so this state can be achieved and maintained.

Unfortunately, keeping qubits in this state is easier said than done. Their ability to maintain it, referred to as quantum coherence, can easily be disrupted by certain kinds of noise. That noise can be a byproduct of heat, control electronics, or even impurities in the very material the qubits are made from. Wherever it comes from, it creates a very high risk of serious computing errors.

While scientists have constructed devices that measure these unwanted noises, they've generally only been able to capture aggregate noise from a large number of sources; think of it as white noise. Because they could not identify specific disruptive patterns, researchers have been unable to determine how a qubit is affected by any particular noise source, or to protect it from the damage those sources can cause.

Now, researchers have created a device capable of separating specific noises from the general background noise. They then relied on signal-processing techniques to reconstruct highly detailed statistics of those specific noise signals. These reconstructions will let researchers craft far more realistic noise models, which should in turn help them develop new ways to protect qubits.
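The paper's actual protocol probes the noise through the qubit itself and isn't reproduced here, but as a loose classical analogy for pulling a specific noise signature out of a white background with signal processing, consider a power-spectral-density estimate (the sample rate and the 60 Hz interferer below are invented for illustration):

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)
fs = 10_000                          # sample rate in Hz (made-up)
t = np.arange(0, 2, 1 / fs)

# White background noise plus one structured source: a weak 60 Hz interferer.
x = rng.normal(0, 1.0, t.size) + 0.2 * np.sin(2 * np.pi * 60 * t)

# Welch's method averages many short periodograms, so the flat white-noise
# floor emerges and the narrow 60 Hz line stands out clearly above it.
freqs, psd = signal.welch(x, fs=fs, nperseg=4096)
print(freqs[np.argmax(psd)])         # ~60 Hz: the specific source, identified
```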

As qubits are developed with fewer and fewer defects, the prominence of specific noise sources, rather than the background noise scientists have learned to contend with, could increase. That makes this technology all the more important to the ongoing development of quantum computing.



Princeton announces initiative to propel innovations in quantum science and technology – Princeton University

Princeton University has announced the creation of the Princeton Quantum Initiative to foster research and training across the spectrum from fundamental quantum science to its application in areas such as computing, sensing and communications.

The new initiative builds on Princeton's world-renowned expertise in quantum science, the area of physics that describes behaviors at the scale of atoms and electrons. Quantum technologies have the potential to revolutionize areas ranging from secure data transmission to biomedical research, to the discovery of new materials.

Princeton has announced the creation of the Princeton Quantum Initiative, designed to foster research and train scientists and engineers in quantum science and its application in areas such as computing, sensing and communications. Clockwise from left: Research images from Princeton faculty members Julia Mikhailova, assistant professor of mechanical and aerospace engineering; Nathalie de Leon, assistant professor of electrical engineering; Andrew Houck, professor of engineering; Jason Petta, the Eugene Higgins Professor of Physics; Ali Yazdani, the Class of 1909 Professor of Physics; M. Zahid Hasan, the Eugene Higgins Professor of Physics; Jeffrey Thompson, assistant professor of electrical engineering; and Robert Cava, the Russell Wellman Moore Professor of Chemistry.


The inaugural director will be Andrew Houck, professor of electrical engineering and a pioneer in quantum computing technologies. The initiative will bring together over 30 faculty members from departments across campus in the sciences and engineering.

"This initiative enables the work of our extraordinary quantum faculty and their teams to grow research capabilities and attract talented minds at all levels to Princeton, so that they can discover new materials, design new algorithms, and explore the depths of the underlying science in an exciting environment of discovery and innovation," said Dean for Research Pablo Debenedetti, the Class of 1950 Professor in Engineering and Applied Science and professor of chemical and biological engineering.

"The potential benefits to society from quantum information science make this an essential endeavor for Princeton. The initiative will provide tremendous opportunities for Princeton students and postdoctoral researchers to make profound contributions to future technologies," said Deborah Prentice, University provost and the Alexander Stewart 1886 Professor of Psychology and Public Affairs.

The initiative comes at a time of national momentum for quantum sciences at the University, government and industry level. In 2018, the federal government established the National Quantum Initiative to energize research and training in quantum information science and technology. New technologies over the past decade have enabled companies including Google, IBM and others to build research-stage quantum computers.

The Princeton Quantum Initiative will enable new collaborations both across campus and with other universities and industry. Within the University, the initiative will include faculty in the departments of electrical engineering, physics, chemistry, computer science and mechanical and aerospace engineering.

"Princeton has world leaders at all layers of this technology, including foundational science, materials synthesis and characterization, quantum device platforms, computer architecture, algorithm design and computational complexity," said Houck. "We have an incredible collection of experts in their respective disciplines, and the Princeton Quantum Initiative gives us an entity which brings everyone together to accelerate the pace of discovery."

The Princeton Quantum Initiative will support research across a range of areas, including quantum computing using silicon spin qubits in the laboratory of Jason Petta, the Eugene Higgins Professor of Physics.


To support the future of quantum research, the initiative will train a new generation of quantum scientists and engineers through financial support for graduate students and postdoctoral researchers. Annually, Princeton will award two prestigious graduate student fellowships, each providing support for three years, as well as two postdoctoral fellowships for three-year terms, with fellows able to choose projects and faculty mentors.

For undergraduates, the initiative will build on Princeton's leadership in the development of courses whose target audience includes those with no prior quantum physics background. The initiative will help coordinate teaching efforts across departments, offer more cohesive and wide-ranging instruction in quantum science and engineering, and provide undergraduates with opportunities to work on faculty-led projects.

The research supported through the initiative will span areas from new materials science for quantum devices to quantum computer architecture, algorithm design and computational complexity.

Quantum science promises to deliver dramatic enhancements in information processing and communications. Computers built on quantum principles can solve problems that are impossible with today's machines, potentially leading to discoveries in fields such as chemistry, materials science, optimization and information security.

Sensors based on quantum approaches can probe materials and biological systems at the nanoscale with unprecedented precision and resolution. Such sensors could detect medical conditions or be used for quality control in manufacturing of sensitive electronic equipment.

Quantum communication systems can provide provably secure communication that cannot be hacked without detection. Quantum encryption could someday replace today's internet security algorithms to ensure privacy of data transmissions.

Princeton has a long history of contributing foundational discoveries in quantum science. Over the decades, Princeton researchers have made major contributions to quantum theory and trained graduate students who have become leading quantum scientists and technologists. More on the research expertise of Princeton's quantum scientists and engineers is available online.


Moore’s Law Is Dying. This Brain-Inspired Analogue Chip Is a Glimpse of What’s Next – Singularity Hub

Dark silicon sounds like a magical artifact out of a fantasy novel. In reality, it's one branch of a three-headed beast that foretells the end of advances in computation.

OK, that might be too dramatic. But the looming problems in silicon-based computer chips are very real. Although computational power has exploded exponentially in the past five decades, we've begun hitting some intractable limits in further growth, both in terms of physics and economics.

Moore's Law is dying. And chipmakers around the globe are asking: now what?

One idea is to bet on quantum computers, which tap into the ultra-weird world of quantum mechanics. Rather than operating on binaries of 0s and 1s, qubits can simultaneously represent both states, with each having a different probability and thus much higher information density.

Another idea is to look inside our heads: the quantum realm isn't the only way to get past binary computation. Our brains also operate on probabilities, making them a tangible source of inspiration to overhaul the entire computational world.

This week, a team from Pennsylvania State University designed a 2D device that operates like a neuron. Rather than processing a yes or a no, the Gaussian synapse thrives on probabilities. Similar to the brain, the analogue chip is far more energy-efficient and produces less heat than current silicon chips, making it an ideal candidate for scaling up systems.

In a proof-of-concept test, the team used a simulated chip to analyze EEG (electroencephalography) signals taken from either wakeful or sleeping people. Without extensive training, the chip was able to determine if the subject was sleeping.

Combined, these new developments can facilitate exascale computing and ultimately benefit scientific discovery, national security, energy security, economic security, infrastructure development, and advanced healthcare programs, the team concluded.

With new iPhones every year and increasingly sophisticated processors, it certainly doesn't feel like we're pushing the limits of silicon-based computing. But according to lead study author Dr. Saptarshi Das, the ability to further scale traditional computation is dying in three different aspects: energy, size, and complexity.

Energy scaling helps ensure a practically constant computational power budget, explained Das. But it came to an end around 2005 because of hard limits in silicon chips' thermodynamic properties, something scientists dub the Boltzmann tyranny (gotta love these names!). Size scaling, which packs more transistors onto the same chip area, soon followed suit, ending in 2017 because quantum mechanics imposes limitations at the materials level of traditional chips.

The third, complexity scaling, is still hanging on, but it is on the decline. Fundamentally, explained the team, this is because of the traditional von Neumann architecture that most modern computers use, which relies on digital, binary computation. In addition, current computers store logic and memory units separately and have to operate sequentially, which increases delay and energy consumption. As more transistors are jam-packed onto the same chip and multiple cores are linked together into processors, the energy needs and cooling requirements will eventually hit a wall.

This is the Dark Silicon era. Because too much heat is given out, a large number of transistors on a single chip can't be powered up at once without causing heat damage. This limitation requires a portion of the computing components on a chip to be kept powered off (kept dark) at any instant, which severely limits computational power. Tinkering with variables such as how transistors are linked up may optimize efficiency, but ultimately it's a band-aid, not a cure.

In contrast, the brain deploys billions of information processing units, neurons, which are connected via trillions of synapses in order to accomplish massively parallel, synchronous, coherent, and concurrent computation, the team said. Thats our roadmap ahead.

Although there are plenty of neuromorphic chips (devices that mimic the structure or functionality of neurons and synapses), the team took a slightly different approach. They focused on recreating a type of artificial neural network called a probabilistic neural network (PNN) in hardware form.

PNNs have been around since the 1960s as software, and they're often used for classification problems. The mathematical heart of PNNs differs from most of the deep learning models used today, but the structure is relatively similar. A PNN generally has four layers, and raw data travels from the first layer to the last. The two middle layers, pattern and summation, process the data in a way that allows the last layer to make a vote: it selects the most probable answer from a group of candidates.
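The article stops short of the math, but a textbook PNN is compact enough to sketch. Here is a minimal Python version of the pattern/summation/output flow (an illustration of the general technique, not the Penn State team's hardware or code; the toy data is invented):

```python
import numpy as np

def pnn_classify(x, train_X, train_y, sigma=1.0):
    """Classify sample x with a probabilistic neural network.

    Pattern layer: one Gaussian kernel centered on each training sample.
    Summation layer: average the kernel activations within each class.
    Output layer: vote for the class with the highest summed activation.
    """
    scores = {}
    for label in np.unique(train_y):
        members = train_X[train_y == label]
        dists = np.sum((members - x) ** 2, axis=1)
        scores[label] = np.mean(np.exp(-dists / (2 * sigma ** 2)))
    return max(scores, key=scores.get)

# Toy usage: two 2D clusters standing in for "awake" vs "asleep" EEG features.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(4, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
print(pnn_classify(np.array([3.5, 4.2]), X, y))  # prints 1
```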

To implement PNNs directly in hardware, the team engineered a Gaussian synapse from two different materials: MoS2 and black phosphorus. Each forms a transistor, and the two are linked in series within a single synapse. The way the two transistors talk to each other isn't linear. As the MoS2 component switches on, the electrical current rises exponentially until it reaches a maximum, then it drops. The connection strength traces a bell-shaped curve, or, in mathematical lingo, a Gaussian distribution, which is widely used in probability theory (and is where the device gets its name).

How each component turns on and off can be tweaked, which controls communication between the transistors. This mimics the inner workings of PNNs, said study author Amritanand Sebastian.

As a proof of concept, the team decided to give back to neuroscience. The brain generates electrical waves that can be picked up by electrodes on top of the scalp. Brain waves are terribly complicated data to process, said the team, and artificial neural networks running on traditional computers generally have a hard time sorting through them.

The team fed their Gaussian synapses EEG recordings from 10 whole nights across 10 subjects, with 32 channels for each individual. The PNN rapidly recognized different brainwave components, and was especially good at picking out the frequencies commonly seen in sleep.

We don't need as extensive a training period or base of information for a probabilistic neural network as we need for an artificial neural network, said Das.

Thanks to quirks in the transistors' materials, the chip had some enviable properties. For one, it was exceedingly low-power. To analyze 8 hours of EEG data, it consumed only about 350 microwatts; to put this into perspective, the human brain generally runs on about 20 watts. This means that the Gaussian synapse facilitates energy scaling, explained Sebastian.

For another, the materials allow size scaling without losing their inherent electrical properties. Finally, the use of PNNs also solves the complexity scaling problem, because it can process non-linear decisions using fewer components than traditional artificial neural networks.

It doesn't mean that we've slain the three-headed beast, at least not yet. But looking ahead, the team believes their results can inspire more ultra-low-power devices to tackle the future of computation.

Our experimental demonstration of Gaussian synapses uses only two transistors, which significantly improves the area and energy efficiency at the device level and provides cascading benefits at the circuit, architecture, and system levels. This will stimulate the much-needed interest in the hardware implementation of PNNs for a wide range of pattern classification problems, the authors concluded.



Experts Gather at Fermilab for International Workshop on Cryogenic Electronics for Quantum Systems – Quantaneo, the Quantum Computing Source

Leaders in quantum science converged this summer at Fermilab for the world's first workshop on cryogenic electronics for quantum systems. As the field is highly competitive, the hosts worked hard to attract key global allies and leaders.

Scientists and engineers from academia and industry discussed the challenges of designing electronics for processors and sensors that will work in the ultracold environment.

It's a fundamental problem facing the field of quantum computing, which holds immense possibility across multiple disciplines. Experts say that quantum computers could someday be powerful enough to solve problems that are impossible for classical computers, potentially redefining how we see the world.

And much of it rides on designing electronics that are up to the task.

Quantum systems won't exist without cryogenic electronics, said Fermilab engineer Farah Fahim, workshop co-organizer and deputy head of quantum science at Fermilab. That's why our community needs to collaborate, and why we're working to establish key partnerships with academia and industry, as well as manufacturing companies that would support the fabrication of cold chips.

Researchers across multiple sectors have called for collaboration, and pioneers in the field turned out for the meeting. They included Edoardo Charbon (also a workshop co-organizer) of the Advanced Quantum Architecture Lab at the Swiss Federal Institute of Technology Lausanne (EPFL) in Switzerland, and Andrew Dzurak of the University of New South Wales in Australia, a trailblazer in the field of silicon-based qubits, who gave the workshop's keynote address. Representatives from IBM, Intel, Global Foundries, Google and Microsoft also attended.

The Fermilab cryoelectronics workshop is a very important first step for the quantum computing community, said Malcolm Carroll, research staff member at IBM Research. Developing supporting electronics for future quantum computers is one of the next big hurdles. IBM looks forward to this series continuing and contributing to it as it has for this first one.

The global cooling effort centers on accommodating the qubit, the fundamental unit of a quantum computer's processor. Qubit information needs extreme cold to survive, below 15 millikelvins, since any thermal energy can disturb the quantum computing operation.


Current state-of-the-art systems use tens of qubits. But a quantum computer that surpasses the capabilities of today's classical computers would in certain cases require millions or billions of qubits, each of which needs electronics, both to control the state of the qubit and to read out its signals.

And electronics means cables.

As the system scales up, one bottleneck has been getting information out of the qubits and controlling the qubits themselves, Fahim said. It requires large numbers of wires.

For larger systems, the qubits and the electronics need to be closely integrated. Otherwise, information can become degraded as it winds its way down lengthy wires to bulky systems. With tight integration, the electronics can deliver the fast, self-correcting feedback required to control the qubit state on the order of ten-billionths of a second.

When you have the number of wires and cables required for a million- or billion-qubit system, close integration isn't possible unless your electronics can operate in the cold, side-by-side with the qubit.

Fermilab engineer Farah Fahim, left, and Edoardo Charbon of the Advanced Quantum Architecture Lab at EPFL co-organized the worlds first workshop on cryogenic electronics for quantum systems. Photo: Davide Braga

When you have lots of cables, after some point, you can't expand in that direction anymore. You can't integrate a million cold qubits with warm electronics, Fahim said. To scale up, cryogenic electronics is the only way to go. To be able to take it to the next level of integration, we need to move the room temperature control to cryogenic control. You want to be able to change the technology to meet the requirements.

When the electronics live in the same space (the same refrigerated space) as the qubits, the system becomes practical and manageable, capable of providing accurate, real-time qubit control.

That is the challenge the workshop attendees took head-on: developing quantum-system electronics that don't mind being left in the cold.

Developments in cold electronics may hold the keys to scaling up quantum computing, said Microsoft Quantum Sydney Director David Reilly, also a professor at the University of Sydney. As the community moves from the demonstration of single-qubit prototypes to scaled-up machines that can address real problems, interest in this field is really taking off. Fermilab has deep expertise in cold electronics as well as a culture of filling the gap between academia and industry. It's only fitting that the first workshop on this topic was at Fermilab, and I expect to see many more as government labs become pivotal players in the quantum ecosystem.

Experts dream of a day when quantum computers can get out of the cold and sit comfortably atop your desk just like your current PC.

We would like to reach a stage where nothing is cryocooled, but until we get there, the only way we get there is with electronics operating at very low temperatures, Fahim said.

The workshop was a major, international step in that direction.

Quantum technologies are the next frontier for many fields, including electronics. While quantum computers are certainly the pinnacle of such worldwide effort, many other applications are emerging, like quantum imaging, quantum sensing, quantum communications, quantum metrology, to name just a few, Charbon said. But the core of any quantum-technology-based system is a very special and carefully designed electronics optimized for deep cryogenic temperatures. This is a brave new world for us electronics engineers.

To continue the dialogue on this key enabling technology, the second International Workshop on Cryogenic Electronics for Quantum Systems will be held in Neuchâtel, Switzerland, in 2020.

This work is supported by the DOE Office of Science.


The five pillars of Edge Computing — and what is Edge computing anyway? – Information Age

Why do we need Edge computing? What is it? What are the advantages? The five pillars to edge computing provide the answers.

The five pillars of Edge computing: latency, bandwidth, computing power, resilience and privacy.

It partly boils down to the explosion of devices. Joe Root, co-founder of edge computing company Permutive, which provides a purpose-built data management platform, told Information Age that over the past 10 to 20 years, we've seen this explosion in the number of devices. So we've gone from 500 million to 15 billion devices, but over the next 10 years we'll go from 15 billion to a trillion devices.

These devices generate enormous volumes of data: anything from my watch, which knows my heart rate constantly, all day long, all the way through to smart cities and factories. And that data is being centralised in the cloud. This explosion of data causes problems for the way in which we currently process data.

So, edge computing helps solve this by providing computational power locally, on devices at the edge such as smartphones.

Joe spoke to us about what he calls the five pillars of Edge Computing.

The cloud is at a distance. It takes time for data to be transferred from the cloud to the point where you need it. If you make a network request, it takes time to download the information you need. At the moment, says Root, this is around 100 milliseconds. This will come down with 5G, but no matter how advanced the technology, there will always be a time lag involved in pulling data from the cloud, dictated by the speed of light. The closer you are to the data, the lower the latency. Sure, the speed of light might be 300 million metres a second, or a metre every 3 nanoseconds or so, but when you are processing millions of pieces of data, pulling it from sources that may be many miles away, those nanoseconds add up. If you need to make a decision in milliseconds, actually, physics prevents that from being possible, explained Root. I think we're seeing companies approach this in different ways: whether with a CDN (content delivery network) or with 5G, they are trying to move the processing closer to the user to minimise that network latency. The most extreme way to do that, though, is to do it on the device. That is why Apple Face ID, for example, doesn't rely on the cloud: milliseconds matter in that example.
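Root's speed-of-light point survives a back-of-the-envelope check (the distances below are illustrative guesses, not figures from the article):

```python
SPEED_OF_LIGHT = 3.0e8  # metres per second, i.e. roughly a metre every 3 ns

def round_trip_ms(distance_m):
    """Best-case round-trip time imposed by physics alone, in milliseconds."""
    return 2 * distance_m / SPEED_OF_LIGHT * 1000

for label, metres in [("on-device", 0.1), ("nearby CDN node", 50_000),
                      ("distant cloud region", 5_000_000)]:
    print(f"{label}: {round_trip_ms(metres):.3f} ms minimum")

# A distant region costs ~33 ms before any switching, queuing, or server
# time is added, which is how real requests end up near Root's 100 ms.
```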


The second pillar is the limit bandwidth places on how much data you can move to and from the cloud quickly. Due to the limitations of physics, you can only push a certain amount of data through the network before you hit its bandwidth ceiling. So the second benefit of edge computing is that you aren't constrained by bandwidth: because you're processing the data on the device, you don't need to transfer it.
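The same envelope maths works for bandwidth (the data volume and link speed below are made-up examples):

```python
def transfer_seconds(data_mb, bandwidth_mbps):
    """Time to move data_mb megabytes over a bandwidth_mbps megabit/s link."""
    return data_mb * 8 / bandwidth_mbps

# A day of raw sensor data from one wearable, say 500 MB, on a 10 Mbit/s uplink:
print(transfer_seconds(500, 10))  # 400 seconds spent just moving bytes;
# processing on the device and sending only the results avoids the whole cost.
```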


The explosion in the number of devices on the internet comes into play here. There is enormous computing power residing at the edge of the internet, which has already been paid for. Furthermore, much of this computational power is under-utilised. Edge computing takes advantage of that processing power, meaning you don't have to pay for the same computation in the cloud.

One day, quantum computers may change this relationship: the computing power in the cloud will exceed that which exists at the edge. But that is some way off. In any case, even in the distant time when quantum computing provides superior processing power in the cloud, this is just one of the five pillars of edge computing.


What happens if the cloud goes down, or you lose your connection? It is something we all intuitively understand: we may download a document to read on our computer or smartphone so that, if we are travelling and our internet connection is intermittent, we can still read the document.

When data is stored in the cloud, we have less control over it. Our wearable health checker may record all kinds of information about us, but it's our personal information, which we want to keep private; storing it locally, and not allowing it to seep onto the cloud, is a good way to achieve this.

Indeed, Root speculates that the greatest breach of all time is ongoing: open RTB (real-time bidding), a programmatic advertising protocol in which advertisers can bid for access to data about you. GDPR is shining a light on this, but from the point of view of individuals concerned about their privacy, edge computing could overcome the problem in one fell swoop. It doesn't mean advertisers will no longer be able to target specific customers accurately, but such targeting would be permission-based and transparent.

Root claims that this is a data breach that is happening millions of times a second, right now.


The advantages of cloud computing are well known: you can turn your cloud usage up and down as you need it; there is no need for startups, for example, to invest in expensive IT infrastructure; you pay as you go.

But the limitations of the cloud are well known too: latency, bandwidth, diluted processing power, resilience in the event of a lost connection, and privacy.

Edge computing, by taking advantage of hardware that has already been funded, can overcome many of those disadvantages without necessarily losing the flexibility of the cloud.

It's not that edge computing makes the cloud redundant, but it does make it, well (sincere apologies for the obvious pun), a little more edgy.


What Is Quantum Computing? The Complete WIRED Guide | WIRED

Big things happen when computers get smaller. Or faster. And quantum computing is about chasing perhaps the biggest performance boost in the history of technology. The basic idea is to smash some barriers that limit the speed of existing computers by harnessing the counterintuitive physics of subatomic scales.

If the tech industry pulls off that, ahem, quantum leap, you won't be getting a quantum computer for your pocket. Don't start saving for an iPhone Q. We could, however, see significant improvements in many areas of science and technology, such as longer-lasting batteries for electric cars or advances in chemistry that reshape industries or enable new medical treatments. Quantum computers won't be able to do everything faster than conventional computers, but on some tricky problems they have advantages that would enable astounding progress.

It's not productive (or polite) to ask people working on quantum computing when exactly those dreamy applications will become real. The only thing for sure is that they are still many years away. Prototype quantum computing hardware is still embryonic. But powerful (and, for tech companies, profit-increasing) computers powered by quantum physics have recently started to feel less hypothetical.

The cooling and support structure for one of IBM's quantum computing chips (the tiny black square at the bottom of the image).


That's because Google, IBM, and others have decided it's time to invest heavily in the technology, which, in turn, has helped quantum computing earn a bullet point on the corporate strategy PowerPoint slides of big companies in areas such as finance, like JPMorgan, and aerospace, like Airbus. In 2017, venture investors plowed $241 million into startups working on quantum computing hardware or software worldwide, according to CB Insights. That's triple the amount in the previous year.

Like the befuddling math underpinning quantum computing, some of the expectations building around this still-impractical technology can make you lightheaded. If you squint out the window of a flight into SFO right now, you can see a haze of quantum hype drifting over Silicon Valley. But the enormous potential of quantum computing is undeniable, and the hardware needed to harness it is advancing fast. If there were ever a perfect time to bend your brain around quantum computing, it's now. Say Schrödinger's superposition three times fast, and we can dive in.

The prehistory of quantum computing begins early in the 20th century, when physicists began to sense they had lost their grip on reality.

First, accepted explanations of the subatomic world turned out to be incomplete. Electrons and other particles didn't just neatly carom around like Newtonian billiard balls, for example. Sometimes they acted like waves instead. Quantum mechanics emerged to explain such quirks, but introduced troubling questions of its own. To take just one brow-wrinkling example, this new math implied that physical properties of the subatomic world, like the position of an electron, didn't really exist until they were observed.

Physicist Paul Benioff suggests quantum mechanics could be used for computation.

Nobel-winning physicist Richard Feynman, at Caltech, coins the term quantum computer.

Physicist David Deutsch, at Oxford, maps out how a quantum computer would operate, a blueprint that underpins the nascent industry of today.

Mathematician Peter Shor, at Bell Labs, writes an algorithm that could tap a quantum computers power to break widely used forms of encryption.

D-Wave, a Canadian startup, announces a quantum computing chip it says can solve Sudoku puzzles, triggering years of debate over whether the companys technology really works.

Google teams up with NASA to fund a lab to try out D-Waves hardware.

Google hires the professor behind some of the best quantum computer hardware yet to lead its new quantum hardware lab.

IBM puts some of its prototype quantum processors on the internet for anyone to experiment with, saying programmers need to get ready to write quantum code.

Startup Rigetti opens its own quantum computer fabrication facility to build prototype hardware and compete with Google and IBM.

If you find that baffling, you're in good company. A year before winning a Nobel for his contributions to quantum theory, Caltech's Richard Feynman remarked that nobody understands quantum mechanics. The way we experience the world just isn't compatible. But some people grasped it well enough to redefine our understanding of the universe. And in the 1980s a few of them, including Feynman, began to wonder if quantum phenomena like subatomic particles' don't-look-and-I-don't-exist trick could be used to process information. The basic theory or blueprint for quantum computers that took shape in the '80s and '90s still guides Google and others working on the technology.

Before we belly flop into the murky shallows of quantum computing 0.101, we should refresh our understanding of regular old computers. As you know, smartwatches, iPhones, and the world's fastest supercomputer all basically do the same thing: they perform calculations by encoding information as digital bits, aka 0s and 1s. A computer might flip the voltage in a circuit on and off to represent 1s and 0s, for example.

Quantum computers do calculations using bits, too. After all, we want them to plug into our existing data and computers. But quantum bits, or qubits, have unique and powerful properties that allow a group of them to do much more than an equivalent number of conventional bits.

Qubits can be built in various ways, but they all represent digital 0s and 1s using the quantum properties of something that can be controlled electronically. Popular examples (at least among a very select slice of humanity) include superconducting circuits, or individual atoms levitated inside electromagnetic fields. The magic power of quantum computing is that this arrangement lets qubits do more than just flip between 0 and 1. Treat them right and they can flip into a mysterious extra mode called a superposition.

The looped cables connect the chip at the bottom of the structure to its control system.


You may have heard that a qubit in superposition is both 0 and 1 at the same time. That's not quite true and also not quite false; there's just no equivalent in Homo sapiens' humdrum classical reality. If you have a yearning to truly grok it, you must make a mathematical odyssey WIRED cannot equip you for. But in the simplified and, dare we say, perfect world of this explainer, the important thing to know is that the math of a superposition describes the probability of discovering either a 0 or 1 when a qubit is read out, an operation that crashes it out of a quantum superposition into classical reality. A quantum computer can use a collection of qubits in superpositions to play with different possible paths through a calculation. If done correctly, the pointers to incorrect paths cancel out, leaving the correct answer when the qubits are read out as 0s and 1s.

Qubit: A device that uses quantum mechanical effects to represent 0s and 1s of digital data, similar to the bits in a conventional computer.

Superposition: The trick that makes quantum computers tick, and makes qubits more powerful than ordinary bits. A superposition is an intuition-defying mathematical combination of both 0 and 1. Quantum algorithms can use a group of qubits in a superposition to shortcut through calculations.

Entanglement: A quantum effect so unintuitive that Einstein dubbed it spooky action at a distance. When two qubits in a superposition are entangled, certain operations on one have instant effects on the other, a process that helps quantum algorithms be more powerful than conventional ones.

Quantum speedup: The holy grail of quantum computing, a measure of how much faster a quantum computer could crack a problem than a conventional computer could. Quantum computers aren't well-suited to all kinds of problems, but for some they offer an exponential speedup, meaning their advantage over a conventional computer grows explosively with the size of the input problem.

For some problems that are very time consuming for conventional computers, this allows a quantum computer to find a solution in far fewer steps than a conventional computer would need. Grover's algorithm, a famous quantum search algorithm, could find you in a phone book with 100 million names with just 10,000 operations. If a classical search algorithm just spooled through all the listings to find you, it would require 50 million operations, on average. For Grover's and some other quantum algorithms, the bigger the initial problem (or phone book), the further behind a conventional computer is left in the digital dust.
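That square-root scaling can be checked with a small state-vector simulation of Grover's algorithm (a classical simulation, so only toy sizes fit in memory; this sketch is ours, not WIRED's):

```python
import numpy as np

def grover_search(n_qubits, marked):
    """Simulate Grover's algorithm over 2**n_qubits database entries."""
    n = 2 ** n_qubits
    state = np.full(n, 1 / np.sqrt(n))        # uniform superposition
    iterations = int(np.pi / 4 * np.sqrt(n))  # ~pi/4 * sqrt(N) steps suffice
    for _ in range(iterations):
        state[marked] *= -1                   # oracle: flip the marked entry's sign
        state = 2 * state.mean() - state      # diffusion: inversion about the mean
    return np.argmax(state ** 2), iterations  # most probable readout, step count

guess, steps = grover_search(20, marked=123456)
print(guess, steps)  # finds 123456 in ~804 steps, versus ~524288 classical guesses on average
```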

The reason we don't have useful quantum computers today is that qubits are extremely finicky. The quantum effects they must control are very delicate, and stray heat or noise can flip 0s and 1s, or wipe out a crucial superposition. Qubits have to be carefully shielded, and operated at very cold temperatures, sometimes only fractions of a degree above absolute zero. Most plans for quantum computing depend on using a sizable chunk of a quantum processor's power to correct its own errors, caused by misfiring qubits.

Recent excitement about quantum computing stems from progress in making qubits less flaky. That's giving researchers the confidence to start bundling the devices into larger groups. Startup Rigetti Computing recently announced it has built a processor with 128 qubits made with aluminum circuits that are super-cooled to make them superconducting. Google and IBM have announced their own chips with 72 and 50 qubits, respectively. That's still far fewer than would be needed to do useful work with a quantum computer; it would probably require at least thousands. But as recently as 2016 those companies' best chips had qubits only in the single digits. After tantalizing computer scientists for 30 years, practical quantum computing may not exactly be close, but it has begun to feel a lot closer.

Some large companies and governments have started treating quantum computing research like a race; perhaps fittingly, it's one where both the distance to the finish line and the prize for getting there are unknown.

Google, IBM, Intel, and Microsoft have all expanded their teams working on the technology, with a growing swarm of startups such as Rigetti in hot pursuit. China and the European Union have each launched new programs measured in the billions of dollars to stimulate quantum R&D. And in the US, the Trump White House has created a new committee to coordinate government work on quantum information science. Several bills were introduced to Congress in 2018 proposing new funding for quantum research, totalling upwards of $1.3 billion. It's not quite clear what the first killer apps of quantum computing will be, or when they will appear. But there's a sense that whoever is first to make these machines useful will gain big economic and national security advantages.

Copper structures conduct heat well and connect the apparatus to its cooling system.


Back in the world of right now, though, quantum processors are too simple to do practical work. Google is working to stage a demonstration known as quantum supremacy, in which a quantum processor would solve a carefully designed math problem beyond existing supercomputers. But that would be an historic scientific milestone, not proof quantum computing is ready to do real work.

As quantum computer prototypes get larger, the first practical use for them will probably be for chemistry simulations. Computer models of molecules and atoms are vital to the hunt for new drugs or materials. Yet conventional computers can't accurately simulate the behavior of atoms and electrons during chemical reactions. Why? Because that behavior is driven by quantum mechanics, the full complexity of which is too great for conventional machines. Daimler and Volkswagen have both started investigating quantum computing as a way to improve battery chemistry for electric vehicles. Microsoft says other uses could include designing new catalysts to make industrial processes less energy intensive, or even to pull carbon dioxide out of the atmosphere to mitigate climate change.

Quantum computers would also be a natural fit for code-breaking. We've known since the '90s that they could zip through the math underpinning the encryption that secures online banking, flirting, and shopping. Quantum processors would need to be much more advanced to do this, but governments and companies are taking the threat seriously. The National Institute of Standards and Technology is in the process of evaluating new encryption systems that could be rolled out to quantum-proof the internet.

When cooled to operating temperature, the whole assembly is hidden inside this white insulated casing.


Tech companies such as Google are also betting that quantum computers can make artificial intelligence more powerful. That's further in the future and less well mapped out than chemistry or code-breaking applications, but researchers argue they can figure out the details down the line as they play around with larger and larger quantum processors. One hope is that quantum computers could help machine-learning algorithms pick up complex tasks using many fewer than the millions of examples typically used to train AI systems today.

Despite all the superposition-like uncertainty about when the quantum computing era will really begin, big tech companies argue that programmers need to get ready now. Google, IBM, and Microsoft have all released open source tools to help coders familiarize themselves with writing programs for quantum hardware. IBM has even begun to offer online access to some of its quantum processors, so anyone can experiment with them. Long term, the big computing companies see themselves making money by charging corporations to access data centers packed with supercooled quantum processors.

What's in it for the rest of us? Despite some definite drawbacks, the age of conventional computers has helped make life safer, richer, and more convenient; many of us are never more than five seconds away from a kitten video. The era of quantum computers should have similarly broad-reaching, beneficial, and impossible-to-predict consequences. Bring on the qubits.

The Quantum Computing Factory That's Taking on Google and IBM: Peek inside the ultra-clean workshop of Rigetti Computing, a startup packed with PhDs wearing what look like space suits and gleaming steampunk-style machines studded with bolts. In a facility across the San Francisco Bay from Silicon Valley, Rigetti is building its own quantum processors, using similar technology to that used by IBM and Google.

Why JP Morgan, Daimler Are Testing Quantum Computers That Aren't Useful Yet: Wall Street has plenty of quants, math wizards who hunt profits using equations. Now JP Morgan has quantum quants, a small team collaborating with IBM to figure out how to use the power of quantum algorithms to more accurately model financial risk. Useful quantum computers are still years away, but the bank and other big corporations say that the potential payoffs are so large that they need to seriously investigate quantum computing today.

The Era of Quantum Computing is Here. Outlook: Cloudy. Companies working on quantum computer hardware like to say that the field has transitioned from the exploration and uncertainty of science into the more predictable realm of engineering. Yet while hardware has improved markedly in recent years, and investment is surging, there are still open scientific questions about the physics underlying quantum computing.

Quantum Computing Will Create Jobs. But Which Ones? You can't create a new industry without people to staff the jobs it creates. A Congressional bill called the National Quantum Initiative seeks to have the US government invest in training the next generation of quantum computer technicians, designers, and entrepreneurs.

Job One For Quantum Computers: Boost Artificial Intelligence. Artificial intelligence and quantum computing are two of Silicon Valley's favorite buzzwords. If they can be successfully combined, machines will get a lot smarter.

Loopholes and the Anti-Realism Of the Quantum World: Even people who can follow the math of quantum mechanics find its implications for reality perplexing. This book excerpt explains why quantum physics undermines our understanding of reality, with nary an equation in sight.

Quantum Computing is the Next Security Big Security RiskIn 1994, mathematician Peter Shor wrote an algorithm that would allow a quantum computer to pierce the encryption that today underpins online shopping and other digital. As quantum computers get closer to reality, congressman Will Hurd (R-Texas) argues the US needs to lead a global effort to deploy new forms of quantum-resistant encryption.

This guide was last updated on August 24, 2018.

Enjoyed this deep dive? Check out more WIRED Guides.

See the original post here:

What Is Quantum Computing? The Complete WIRED Guide | WIRED

What Is Quantum Computing? The Complete WIRED Guide | WIRED

Big things happen when computers get smaller. Or faster. And quantum computing is about chasing perhaps the biggest performance boost in the history of technology. The basic idea is to smash some barriers that limit the speed of existing computers by harnessing the counterintuitive physics of subatomic scales.

If the tech industry pulls off that, ahem, quantum leap, you won't be getting a quantum computer for your pocket. Don't start saving for an iPhone Q. We could, however, see significant improvements in many areas of science and technology, such as longer-lasting batteries for electric cars or advances in chemistry that reshape industries or enable new medical treatments. Quantum computers won't be able to do everything faster than conventional computers, but on some tricky problems they have advantages that would enable astounding progress.

It's not productive (or polite) to ask people working on quantum computing when exactly those dreamy applications will become real. The only thing for sure is that they are still many years away. Prototype quantum computing hardware is still embryonic. But powerful (and, for tech companies, profit-increasing) computers powered by quantum physics have recently started to feel less hypothetical.

The cooling and support structure for one of IBM's quantum computing chips (the tiny black square at the bottom of the image). (Photo: Amy Lombard)

That's because Google, IBM, and others have decided it's time to invest heavily in the technology, which, in turn, has helped quantum computing earn a bullet point on the corporate strategy PowerPoint slides of big companies in areas such as finance, like JPMorgan, and aerospace, like Airbus. In 2017, venture investors plowed $241 million into startups working on quantum computing hardware or software worldwide, according to CB Insights. That's triple the amount in the previous year.

Like the befuddling math underpinning quantum computing, some of the expectations building around this still-impractical technology can make you lightheaded. If you squint out the window of a flight into SFO right now, you can see a haze of quantum hype drifting over Silicon Valley. But the enormous potential of quantum computing is undeniable, and the hardware needed to harness it is advancing fast. If there were ever a perfect time to bend your brain around quantum computing, it's now. Say "Schrödinger's superposition" three times fast, and we can dive in.

The prehistory of quantum computing begins early in the 20th century, when physicists began to sense they had lost their grip on reality.

First, accepted explanations of the subatomic world turned out to be incomplete. Electrons and other particles didn't just neatly carom around like Newtonian billiard balls, for example. Sometimes they acted like waves instead. Quantum mechanics emerged to explain such quirks, but introduced troubling questions of its own. To take just one brow-wrinkling example, this new math implied that physical properties of the subatomic world, like the position of an electron, didn't really exist until they were observed.

1980: Physicist Paul Benioff suggests quantum mechanics could be used for computation.

1981: Nobel-winning physicist Richard Feynman, at Caltech, coins the term quantum computer.

1985: Physicist David Deutsch, at Oxford, maps out how a quantum computer would operate, a blueprint that underpins the nascent industry of today.

1994: Mathematician Peter Shor, at Bell Labs, writes an algorithm that could tap a quantum computer's power to break widely used forms of encryption.

2007: D-Wave, a Canadian startup, announces a quantum computing chip it says can solve Sudoku puzzles, triggering years of debate over whether the company's technology really works.

2013: Google teams up with NASA to fund a lab to try out D-Wave's hardware.

2014: Google hires the professor behind some of the best quantum computer hardware yet to lead its new quantum hardware lab.

2016: IBM puts some of its prototype quantum processors on the internet for anyone to experiment with, saying programmers need to get ready to write quantum code.

2017: Startup Rigetti opens its own quantum computer fabrication facility to build prototype hardware and compete with Google and IBM.

If you find that baffling, you're in good company. A year before winning a Nobel for his contributions to quantum theory, Caltech's Richard Feynman remarked that "nobody understands quantum mechanics." The way we experience the world just isn't compatible. But some people grasped it well enough to redefine our understanding of the universe. And in the 1980s a few of them, including Feynman, began to wonder if quantum phenomena like subatomic particles' "don't look and I don't exist" trick could be used to process information. The basic theory or blueprint for quantum computers that took shape in the '80s and '90s still guides Google and others working on the technology.

Before we belly flop into the murky shallows of quantum computing 0.101, we should refresh our understanding of regular old computers. As you know, smartwatches, iPhones, and the world's fastest supercomputer all basically do the same thing: they perform calculations by encoding information as digital bits, aka 0s and 1s. A computer might flip the voltage in a circuit on and off to represent 1s and 0s, for example.

Quantum computers do calculations using bits, too. After all, we want them to plug into our existing data and computers. But quantum bits, or qubits, have unique and powerful properties that allow a group of them to do much more than an equivalent number of conventional bits.

Qubits can be built in various ways, but they all represent digital 0s and 1s using the quantum properties of something that can be controlled electronically. Popular examples (at least among a very select slice of humanity) include superconducting circuits, or individual atoms levitated inside electromagnetic fields. The magic power of quantum computing is that this arrangement lets qubits do more than just flip between 0 and 1. Treat them right and they can flip into a mysterious extra mode called a superposition.

The looped cables connect the chip at the bottom of the structure to its control system. (Photo: Amy Lombard)

You may have heard that a qubit in superposition is both 0 and 1 at the same time. That's not quite true and also not quite false; there's just no equivalent in Homo sapiens' humdrum classical reality. If you have a yearning to truly grok it, you must make a mathematical odyssey WIRED cannot equip you for. But in the simplified and dare we say perfect world of this explainer, the important thing to know is that the math of a superposition describes the probability of discovering either a 0 or 1 when a qubit is read out, an operation that crashes it out of a quantum superposition into classical reality. A quantum computer can use a collection of qubits in superpositions to play with different possible paths through a calculation. If done correctly, the pointers to incorrect paths cancel out, leaving the correct answer when the qubits are read out as 0s and 1s.
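To make that concrete, here is a minimal numerical sketch in Python with numpy (the amplitudes are illustrative, not tied to any vendor's hardware): a qubit in superposition is just two complex amplitudes, and reading it out samples a 0 or 1 with probabilities given by their squared magnitudes.

    import numpy as np

    # A qubit state is a pair of complex amplitudes (alpha for 0, beta for 1).
    alpha, beta = 1 / np.sqrt(2), 1j / np.sqrt(2)   # an equal superposition
    probs = np.abs(np.array([alpha, beta])) ** 2    # Born rule -> [0.5, 0.5]

    # "Reading out" the qubit samples a classical bit with those probabilities,
    # the collapse into classical reality described above.
    rng = np.random.default_rng(0)
    print(probs, rng.choice([0, 1], size=10, p=probs))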

Qubit: A device that uses quantum mechanical effects to represent 0s and 1s of digital data, similar to the bits in a conventional computer.

Superposition: It's the trick that makes quantum computers tick, and makes qubits more powerful than ordinary bits. A superposition is an intuition-defying mathematical combination of both 0 and 1. Quantum algorithms can use a group of qubits in a superposition to shortcut through calculations.

Entanglement: A quantum effect so unintuitive that Einstein dubbed it "spooky action at a distance." When two qubits in a superposition are entangled, certain operations on one have instant effects on the other, a process that helps quantum algorithms be more powerful than conventional ones.

Quantum speedup: The holy grail of quantum computing, a measure of how much faster a quantum computer could crack a problem than a conventional computer could. Quantum computers aren't well-suited to all kinds of problems, but for some they offer an exponential speedup, meaning their advantage over a conventional computer grows explosively with the size of the input problem.

For some problems that are very time consuming for conventional computers, this allows a quantum computer to find a solution in far fewer steps than a conventional computer would need. Grover's algorithm, a famous quantum search algorithm, could find you in a phone book with 100 million names with just 10,000 operations. If a classical search algorithm just spooled through all the listings to find you, it would require 50 million operations, on average. For Grover's and some other quantum algorithms, the bigger the initial problem (or phone book), the further behind a conventional computer is left in the digital dust.
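For the curious, that arithmetic can be checked with a small classical simulation of Grover's algorithm, sketched below in Python/numpy (the phone book is shrunk to 2^16 entries so the amplitude vector fits in memory; this simulates the math, not real qubits):

    import numpy as np

    def grover_search(n_items, marked):
        """Simulate Grover's algorithm classically; return (best_guess, steps)."""
        amps = np.full(n_items, 1 / np.sqrt(n_items))   # uniform superposition
        steps = int(np.pi / 4 * np.sqrt(n_items))       # ~sqrt(N) iterations
        for _ in range(steps):
            amps[marked] *= -1                # oracle: flip the marked amplitude
            amps = 2 * amps.mean() - amps     # diffusion: invert about the mean
        return int(np.argmax(amps ** 2)), steps

    guess, steps = grover_search(1 << 16, marked=12345)
    print(guess, steps)   # finds entry 12345 in ~201 steps, not ~32,768 guesses

The same square-root scaling applied to a 100-million-name book gives roughly the 10,000 operations quoted above.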

The reason we don't have useful quantum computers today is that qubits are extremely finicky. The quantum effects they must control are very delicate, and stray heat or noise can flip 0s and 1s, or wipe out a crucial superposition. Qubits have to be carefully shielded, and operated at very cold temperatures, sometimes only fractions of a degree above absolute zero. Most plans for quantum computing depend on using a sizable chunk of a quantum processor's power to correct its own errors, caused by misfiring qubits.

Recent excitement about quantum computing stems from progress in making qubits less flaky. That's giving researchers the confidence to start bundling the devices into larger groups. Startup Rigetti Computing recently announced it has built a processor with 128 qubits made with aluminum circuits that are super-cooled to make them superconducting. Google and IBM have announced their own chips with 72 and 50 qubits, respectively. That's still far fewer than would be needed to do useful work with a quantum computer (it would probably require at least thousands), but as recently as 2016 those companies' best chips had qubits only in the single digits. After tantalizing computer scientists for 30 years, practical quantum computing may not exactly be close, but it has begun to feel a lot closer.

Some large companies and governments have started treating quantum computing research like a race; perhaps fittingly, it's one where both the distance to the finish line and the prize for getting there are unknown.

Google, IBM, Intel, and Microsoft have all expanded their teams working on the technology, with a growing swarm of startups such as Rigetti in hot pursuit. China and the European Union have each launched new programs measured in the billions of dollars to stimulate quantum R&D. And in the US, the Trump White House has created a new committee to coordinate government work on quantum information science. Several bills were introduced to Congress in 2018 proposing new funding for quantum research, totaling upwards of $1.3 billion. It's not quite clear what the first killer apps of quantum computing will be, or when they will appear. But there's a sense that whoever is first to make these machines useful will gain big economic and national security advantages.

Copper structures conduct heat well and connect the apparatus to its cooling system. (Photo: Amy Lombard)

Back in the world of right now, though, quantum processors are too simple to do practical work. Google is working to stage a demonstration known as quantum supremacy, in which a quantum processor would solve a carefully designed math problem beyond existing supercomputers. But that would be a historic scientific milestone, not proof that quantum computing is ready to do real work.

As quantum computer prototypes get larger, the first practical use for them will probably be for chemistry simulations. Computer models of molecules and atoms are vital to the hunt for new drugs or materials. Yet conventional computers can't accurately simulate the behavior of atoms and electrons during chemical reactions. Why? Because that behavior is driven by quantum mechanics, the full complexity of which is too great for conventional machines. Daimler and Volkswagen have both started investigating quantum computing as a way to improve battery chemistry for electric vehicles. Microsoft says other uses could include designing new catalysts to make industrial processes less energy intensive, or even to pull carbon dioxide out of the atmosphere to mitigate climate change.
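The exponential wall here is easy to quantify: a classical machine that tracks the full quantum state of n two-level systems needs 2^n complex amplitudes. A back-of-the-envelope Python calculation (assuming 16 bytes per amplitude) shows why even modest molecules overwhelm conventional memory:

    # Memory needed to store a full n-qubit (or n-spin-orbital) quantum state,
    # at 16 bytes per complex amplitude (complex128).
    for n in (10, 30, 50, 80):
        amplitudes = 2 ** n
        print(f"{n} qubits -> {amplitudes * 16:.3e} bytes")
    # 10 qubits fit in kilobytes; 50 qubits already need ~18 petabytes.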

Quantum computers would also be a natural fit for code-breaking. We've known since the '90s that they could zip through the math underpinning the encryption that secures online banking, flirting, and shopping. Quantum processors would need to be much more advanced to do this, but governments and companies are taking the threat seriously. The National Institute of Standards and Technology is in the process of evaluating new encryption systems that could be rolled out to quantum-proof the internet.
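The number theory behind that threat fits in a few lines. Below is a toy Python version of the classical half of Shor's algorithm; the brute-force period-finding loop is exactly the part a quantum computer would make exponentially faster (the numbers 15 and 7 are illustrative):

    from math import gcd

    def shor_classical(N, a):
        """Factor N using the period of a**x mod N (assumes gcd(a, N) == 1)."""
        r = 1
        while pow(a, r, N) != 1:   # period finding: exponential classically,
            r += 1                  # fast on a quantum computer (Shor, 1994)
        if r % 2:
            raise ValueError("odd period; try a different a")
        return gcd(pow(a, r // 2) - 1, N)

    print(shor_classical(15, 7))   # -> 3, because 7 has period 4 modulo 15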

When cooled to operating temperature, the whole assembly is hidden inside this white insulated casing. (Photo: Amy Lombard)

Tech companies such as Google are also betting that quantum computers can make artificial intelligence more powerful. That's further in the future and less well mapped out than chemistry or code-breaking applications, but researchers argue they can figure out the details down the line as they play around with larger and larger quantum processors. One hope is that quantum computers could help machine-learning algorithms pick up complex tasks using many fewer than the millions of examples typically used to train AI systems today.

Despite all the superposition-like uncertainty about when the quantum computing era will really begin, big tech companies argue that programmers need to get ready now. Google, IBM, and Microsoft have all released open source tools to help coders familiarize themselves with writing programs for quantum hardware. IBM has even begun to offer online access to some of its quantum processors, so anyone can experiment with them. Long term, the big computing companies see themselves making money by charging corporations to access data centers packed with supercooled quantum processors.
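IBM's tool of choice is the open source Qiskit kit. The short sketch below builds the classic two-qubit entangled circuit and samples it on the bundled simulator rather than real hardware (hedged: Qiskit's API shifts between releases, and this assumes the qiskit and qiskit-aer packages are installed):

    from qiskit import QuantumCircuit, transpile
    from qiskit_aer import AerSimulator

    qc = QuantumCircuit(2, 2)
    qc.h(0)                      # superposition on qubit 0
    qc.cx(0, 1)                  # entangle qubit 1 with qubit 0
    qc.measure([0, 1], [0, 1])   # read both qubits out as classical bits

    sim = AerSimulator()
    counts = sim.run(transpile(qc, sim), shots=1000).result().get_counts()
    print(counts)                # roughly half '00' and half '11', never mixed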

What's in it for the rest of us? Despite some definite drawbacks, the age of conventional computers has helped make life safer, richer, and more convenient; many of us are never more than five seconds away from a kitten video. The era of quantum computers should have similarly broad-reaching, beneficial, and impossible-to-predict consequences. Bring on the qubits.

The Quantum Computing Factory That's Taking on Google and IBM
Peek inside the ultra-clean workshop of Rigetti Computing, a startup packed with PhDs wearing what look like space suits, and gleaming steampunk-style machines studded with bolts. In a facility across the San Francisco Bay from Silicon Valley, Rigetti is building its own quantum processors, using similar technology to that used by IBM and Google.

Why JP Morgan, Daimler Are Testing Quantum Computers That Aren't Useful Yet
Wall Street has plenty of quants: math wizards who hunt profits using equations. Now JP Morgan has quantum quants, a small team collaborating with IBM to figure out how to use the power of quantum algorithms to more accurately model financial risk. Useful quantum computers are still years away, but the bank and other big corporations say that the potential payoffs are so large that they need to seriously investigate quantum computing today.

The Era of Quantum Computing is Here. Outlook: Cloudy
Companies working on quantum computer hardware like to say that the field has transitioned from the exploration and uncertainty of science into the more predictable realm of engineering. Yet while hardware has improved markedly in recent years, and investment is surging, there are still open scientific questions about the physics underlying quantum computing.

Quantum Computing Will Create Jobs. But Which Ones?
You can't create a new industry without people to staff the jobs it creates. A Congressional bill called the National Quantum Initiative seeks to have the US government invest in training the next generation of quantum computer technicians, designers, and entrepreneurs.

Job One For Quantum Computers: Boost Artificial Intelligence
Artificial intelligence and quantum computing are two of Silicon Valley's favorite buzzwords. If they can be successfully combined, machines will get a lot smarter.

Loopholes and the Anti-Realism Of the Quantum World
Even people who can follow the math of quantum mechanics find its implications for reality perplexing. This book excerpt explains why quantum physics undermines our understanding of reality with nary an equation in sight.

Quantum Computing Is the Next Big Security Risk
In 1994, mathematician Peter Shor wrote an algorithm that would allow a quantum computer to pierce the encryption that today underpins online shopping and other digital transactions. As quantum computers get closer to reality, congressman Will Hurd (R-Texas) argues the US needs to lead a global effort to deploy new forms of quantum-resistant encryption.

This guide was last updated on August 24, 2018.

Enjoyed this deep dive? Check out more WIRED Guides.

Read the original post:

What Is Quantum Computing? The Complete WIRED Guide | WIRED

Microsoft will open-source parts of Q#, the programming …

Microsoft is focusing on the development of quantum computers that take advantage of cryogenically cooled nanowires. (Microsoft Photo)

Much has been made of Microsoft's reinvention as an open-source company, and it will continue to live up to that billing Monday at Microsoft Build as the world prepares for quantum computing.

Microsoft plans to open-source the Q# compiler and the quantum simulators included in its Quantum Development Kit sometime in the near future, the company will announce Monday at Build. The idea is to give researchers and universities studying quantum computing deeper access to these tools, so they can contribute to their development and to the broader understanding of quantum technology, the company said in materials provided ahead of Build.

Quantum computing is still pretty far off in the future, but one day it is expected to allow computer scientists to bypass the limits of so-called classical computing to reach new levels of performance. Today's computers represent information as amazingly complex strings of 0s and 1s, but quantum computers will be able to use more than two states to represent data.

There are lots of different routes to quantum computing, and Microsoft is pursuing a distinct vision that's unique compared to some of the others chasing this grail. Q# is a big part of this approach, because while building a viable quantum computer is hard enough, programming one is going to require a new way of looking at the world.

Open-sourcing the compiler (which takes code written by developers in a programming language and makes it run on a computer) could help budding quantum developers better understand how to write more efficient code and reduce the errors preventing their applications from running. And open-source simulators could make it easier for developers to test their quantum applications before letting them fly on quantum machines, which are likely to be pretty expensive in their early days.
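At its core, a quantum simulator of the kind Microsoft ships just tracks a state vector and multiplies it by gate matrices. A vendor-neutral toy in Python/numpy (an illustration of the idea, not Microsoft's implementation):

    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate matrix
    state = np.array([1.0, 0.0])                    # one qubit, starting in |0>
    state = H @ state                               # apply the gate
    print(np.abs(state) ** 2)                       # readout probabilities [0.5, 0.5]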

Microsoft is expected to provide more information about its open-source quantum projects this week at Build, where more than 6,000 people are expected to attend to hear details about a lot of Microsofts current projects.

Link:

Microsoft will open-source parts of Q#, the programming ...

Quantum Computing | D-Wave Systems

Quantum Computation

Rather than store information using bits represented by 0s or 1s as conventional digital computers do, quantum computers use quantum bits, or qubits, to encode information as 0s, 1s, or both at the same time. This superposition of states, along with the other quantum mechanical phenomena of entanglement and tunneling, enables quantum computers to manipulate enormous combinations of states at once.

In nature, physical systems tend to evolve toward their lowest energy state: objects slide down hills, hot things cool down, and so on. This behavior also applies to quantum systems. To imagine this, think of a traveler looking for the best solution by finding the lowest valley in the energy landscape that represents the problem.

Classical algorithms seek the lowest valley by placing the traveler at some point in the landscape and allowing that traveler to move based on local variations. While it is generally most efficient to move downhill and avoid climbing hills that are too high, such classical algorithms are prone to leading the traveler into nearby valleys that may not be the global minimum. Numerous trials are typically required, with many travelers beginning their journeys from different points.

In contrast, quantum annealing begins with the traveler simultaneously occupying many coordinates, thanks to the quantum phenomenon of superposition. The probability of being at any given coordinate smoothly evolves as annealing progresses, with the probability increasing around the coordinates of deep valleys. Quantum tunneling allows the traveler to pass through hills, rather than be forced to climb them, reducing the chance of becoming trapped in valleys that are not the global minimum. Quantum entanglement further improves the outcome by allowing the traveler to discover correlations between the coordinates that lead to deep valleys.
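A classical caricature helps fix the picture. The Python sketch below runs plain simulated annealing on a bumpy one-dimensional energy function; quantum annealing differs physically (superposition and tunneling replace thermal hops), but the traveler-in-a-landscape loop has the same shape:

    import numpy as np

    def energy(x):
        return 0.1 * x ** 2 + np.sin(3 * x)   # many local valleys, one global

    rng = np.random.default_rng(0)
    x = rng.uniform(-10, 10)
    for temperature in np.geomspace(5.0, 1e-3, 5000):
        candidate = x + rng.normal(scale=0.5)
        delta = energy(candidate) - energy(x)
        # Accept downhill moves always; uphill moves only while it is "hot".
        if delta < 0 or rng.random() < np.exp(-delta / temperature):
            x = candidate
    print(round(x, 3), round(energy(x), 3))   # lands near the global minimum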

The D-Wave system has a web API with client libraries available for C/C++, Python, and MATLAB. This allows users to access the computer easily as a cloud resource over a network.

To program the system, a user maps a problem into a search for the lowest point in a vast landscape, corresponding to the best possible outcome. The quantum processing unit considers all the possibilities simultaneously to determine the lowest energy required to form those relationships. The solutions are values that correspond to the optimal configurations of qubits found, or the lowest points in the energy landscape. These values are returned to the user program over the network.

Because a quantum computer is probabilistic rather than deterministic, the computer returns many very good answers in a short amount of time: thousands of samples in one second. This provides not only the best solution found but also other very good alternatives from which to choose.
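In code, the problem setup looks like the sketch below, which uses D-Wave's open source dimod package (an assumption made for illustration; the web API described above is a separate interface). The brute-force ExactSolver stands in for the quantum processor, and its lowest-energy sample is the deepest valley:

    import dimod

    # Minimize E = -a - b + 2ab over binary a, b: the two lowest-energy
    # configurations are a != b, each with energy -1.
    bqm = dimod.BinaryQuadraticModel(
        {"a": -1, "b": -1},     # linear biases
        {("a", "b"): 2},        # coupling between the variables
        0.0,                     # constant energy offset
        dimod.BINARY,
    )
    sampleset = dimod.ExactSolver().sample(bqm)
    print(sampleset.first)       # e.g. {'a': 0, 'b': 1} at energy -1.0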

D-Wave systems are intended to be used to complement classical computers. There are many examples of problems where a quantum computer can complement an HPC (high-performance computing) system. While the quantum computer is well suited to discrete optimization, for example, the HPC system is better at large-scale numerical simulations.

Download this whitepaper to learn more about programming a D-Wave quantum computer.

D-Wave's flagship product, the 2000-qubit D-Wave 2000Q quantum computer, is the most advanced quantum computer in the world. It is based on a novel type of superconducting processor that uses quantum mechanics to massively accelerate computation, and it is best suited to tackling complex optimization problems that exist across many domains.

Download the Technology Overview

Go here to read the rest:

Quantum Computing | D-Wave Systems

Scientists Say New Quantum Material Could “Download” Your Brain

A new type of quantum material can directly measure neural activity and translate it into electrical signals for a computer.

Computer Brain

Scientists say they’ve developed a new “quantum material” that could one day transfer information directly from human brains to a computer.

The research is in early stages, but it invokes ideas like uploading brains to the cloud or hooking people up to a computer to track deep health metrics — concepts that until now existed solely in science fiction.

Quantum Interface

The new quantum material, described in research published Wednesday in the journal Nature Communications, is a “nickelate lattice” that the scientists say could directly translate the brain’s electrochemical signals into electrical activity that could be interpreted by a computer.

“We can confidently say that this material is a potential pathway to building a computing device that would store and transfer memories,” Purdue University engineer Shriram Ramanathan told ScienceBlog.

Running Diagnostics

Right now, the new material can only detect the activity of some neurotransmitters — so we can’t yet upload a whole brain or anything like that. But if the tech progresses, the researchers hypothesize that it could be used to detect neurological diseases, or perhaps even store memories.

“Imagine putting an electronic device in the brain, so that when natural brain functions start deteriorating, a person could still retrieve memories from that device,” Ramanathan said.

READ MORE: New Quantum Material Could Warn Of Neurological Disease [ScienceBlog]

More on brain-computer interface: This Neural Implant Accesses Your Brain Through the Jugular Vein

The post Scientists Say New Quantum Material Could “Download” Your Brain appeared first on Futurism.

See more here:

Scientists Say New Quantum Material Could “Download” Your Brain

Russian Scientists Used a Quantum Computer to Turn Back Time

Russian physicists, armed with a quantum computer, managed to send a single electron back in time, resetting the computer to its state from a moment earlier.

Fall Back

Russian scientists have apparently reversed the flow of time in an experiment they conducted on a quantum computer.

The finding is unlikely to lead to a time machine that would work on people. But the team of physicists managed to restore IBM’s public quantum computer to the state it had been in just a moment earlier, according to research published Wednesday in the journal Scientific Reports — a nuanced result, but one that could have striking implications for the future of computing, quantum physics, and our understanding of time itself.

“We have artificially created a state that evolves in a direction opposite to that of the thermodynamic arrow of time,” Gordey Lesovik, a quantum physicist from the Moscow Institute of Physics and Technology who led the research project, said in a university-published press release.

Great Scott

Lesovik’s team worked with scientists at the Argonne National Laboratory in Illinois to run thousands of experiments on a quantum system programmed to reverse time’s arrow on a single electron.

After thousands of trials, the physicists managed to restore the quantum computer’s earlier state about 85 percent of the time, but only if they were working with a simplified, two-qubit system. A more complex quantum computer with three qubits was too chaotic, and the time reversal experiment only worked 49 percent of the time.

Just like research into quantum teleportation has nothing to do with transporting people, there’s no reason to link this study to the notion of a machine that could travel through time. Rather, the scientists hope that their work can help quantum computer scientists make sure their software is actually doing what it’s supposed to by kicking it back through time and double checking its work.

READ MORE: Physicists reverse time using quantum computer [Moscow Institute of Physics and Technology newsroom via EurekAlert]

More on quantum computers: Scientists Are Building a Quantum Computer That “Acts Like a Brain”

The post Russian Scientists Used a Quantum Computer to Turn Back Time appeared first on Futurism.

Read the original post:

Russian Scientists Used a Quantum Computer to Turn Back Time

Why IBM Thinks Quantum Computers Will Boost Machine Learning

IBM figured out how to use a quantum computer to make machine learning algorithms better than ever before. But no quantum computer is good enough yet.

Quantum Leap

When full-scale quantum computers finally arrive, they could give machine learning algorithms a major boost, letting the AI systems find hidden patterns in data that today’s best technology has no hope of spotting.

At least, that’s the gist of research by IBM scientists, first shared online last year and published in the journal Nature on Wednesday — findings that could help make much more powerful AI without requiring fundamentally new algorithms.

Connect The Dots

This quantum AI would excel at what’s called feature mapping — breaking down the data into its core components to figure out everything about it. For example, a machine learning algorithm trained to analyze images could analyze the color of every single pixel in the image, looking for patterns that might reveal what the picture depicts.

Modern machine learning systems running on a classical computer are already pretty good at that. But according to the new research and an IBM-published blog, a quantum computer could give the AI such a boost that it would be able to look for subtler patterns within huge datasets.
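Feature mapping is easiest to see classically. In the Python toy below, one-dimensional points cannot be separated by a single threshold, but lifting them through the map x -> (x, x²) makes a simple cut work; the IBM research explores quantum circuits as feature maps too rich for classical machines (the data here is invented for illustration):

    import numpy as np

    x = np.linspace(-1, 1, 9)
    labels = np.abs(x) > 0.5             # not separable by one threshold on x
    phi = np.stack([x, x ** 2], axis=1)  # feature map: x -> (x, x^2)
    # In the lifted space, thresholding the second coordinate separates them:
    print(all((phi[:, 1] > 0.25) == labels))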

That may mean finding new trends within troves of data from medical research or new insights into climate change. But Antonio Córcoles, an IBM quantum computing scientist, told Futurism that he has a hard time predicting how these systems may be used, since scientists aren’t aware of the patterns and discoveries that they don’t know to look for.

Stepping On Toes

But this quantum-boosted AI isn’t about to solve any scientific mysteries just yet. The IBM scientists concede that even their best quantum computer isn’t nearly sophisticated enough to outperform a classical computer at machine learning tasks.

Rather, the team figured out how such a computer could enhance machine learning, should the researchers working on the hardware side of the problem figure out how to catch up.

Córcoles told Futurism that he thinks quantum computer research will get there in about five years, but it will take a community effort as scientists in more fields find ways that the devices will be able to help them crack problems previously thought unsolvable.

READ MORE: Researchers Put Machine Learning on the Path to Quantum Advantage [IBM Newsroom]

More on quantum computers: Scientists Are Building a Quantum Computer That “Acts Like a Brain”

The post Why IBM Thinks Quantum Computers Will Boost Machine Learning appeared first on Futurism.

Read the original post:

Why IBM Thinks Quantum Computers Will Boost Machine Learning

Quantum technology – Wikipedia

Quantum technology is a new field of physics and engineering, which is about creating practical applications -- such as quantum computing, quantum sensors, quantum cryptography, quantum simulation, quantum metrology and quantum imaging -- based on properties of quantum mechanics, especially quantum entanglement, quantum superposition and quantum tunnelling.

Quantum superposition states can be very sensitive to a number of external effects, such as electric, magnetic, and gravitational fields; rotation, acceleration, and time; and therefore can be used to make very accurate sensors. There are many experimental demonstrations of quantum sensing devices, such as the experiments carried out by the Nobel laureate William D. Phillips on using cold atom interferometer systems to measure gravity, and the atomic clock, which is used by many national standards agencies around the world to define the second.

Recent efforts are being made to engineer quantum sensing devices, so that they are cheaper, easier to use, more portable, lighter and consume less power. It is believed that if these efforts are successful, it will lead to multiple commercial markets, such as for the monitoring of oil and gas deposits, or in construction.

Quantum secure communication refers to methods that are expected to be 'quantum safe' in the advent of quantum computing systems that could break current cryptography systems. One significant component of a quantum secure communication system is expected to be quantum key distribution, or 'QKD': a method of transmitting information using entangled light in a way that makes any interception of the transmission obvious to the user.

Quantum computers are the ultimate quantum network, combining 'quantum bits', or 'qubits', which are devices that can store and process quantum data (as opposed to binary data), with links that can transfer quantum information between qubits. In doing this, quantum computers are predicted to calculate certain algorithms significantly faster than even the largest classical computer available today.

Quantum computers are expected to have a number of significant uses in computing fields such as optimization and machine learning. They are famous for their expected ability to carry out 'Shor's Algorithm', which can be used to factorise large numbers which are mathematically important to secure data transmission.

There are many devices available today which are fundamentally reliant on the effects of quantum mechanics. These include laser systems, transistors and semiconductor devices, and other devices such as MRI imagers. These devices are often referred to as belonging to the 'first quantum revolution'; the UK Defence Science and Technology Laboratory (Dstl) grouped these devices as 'quantum 1.0',[1] that is, devices which rely on the effects of quantum mechanics. Quantum technologies are often described as the 'second quantum revolution' or 'quantum 2.0'. These are generally regarded as a class of device that actively create, manipulate, and read out quantum states of matter, often using the quantum effects of superposition and entanglement.

The field of quantum technology was first outlined in a 1997 book by Gerard J. Milburn,[2] which was then followed by a 2003 article by Jonathan P. Dowling and Gerard J. Milburn,[3][4] as well as a 2003 article by David Deutsch.[5] The field of quantum technology has benefited immensely from the influx of new ideas from the field of quantum information processing, particularly quantum computing. Disparate areas of quantum physics, such as quantum optics, atom optics, quantum electronics, and quantum nanomechanical devices, have been unified under the search for a quantum computer and given a common language, that of quantum information theory.

The Quantum Manifesto was signed by 3,400 scientists and officially released at the 2016 Quantum Europe Conference, calling for a quantum technology initiative to coordinate between academia and industry, to move quantum technologies from the laboratory to industry, and to educate quantum technology professionals in a combination of science, engineering, and business.[6][7][8][9][10]

The European Commission responded to that manifesto with the Quantum Technology Flagship,[11][12] a €1 billion, 10-year-long megaproject, similar in size to earlier European Future and Emerging Technologies Flagship projects such as the Graphene Flagship and Human Brain Project.[8][13] China is building the world's largest quantum research facility with a planned investment of 76 billion yuan (approx. €10 billion).[14] The USA is preparing a national initiative.[15][16]

From 2010 onwards, multiple governments have established programmes to explore quantum technologies,[17] such as the UK National Quantum Technologies Programme, which created four quantum 'hubs'; the Centre for Quantum Technologies in Singapore; and QuTech, a Dutch centre to develop a topological quantum computer.[18] On 22 December 2018, Donald Trump signed into law the US National Quantum Initiative Act, with a billion-dollar-a-year budget, which is widely viewed as a response to China's gains in quantum technology, particularly the recent launch of the Chinese quantum satellite.

In the private sector, there have been multiple investments into quantum technologies made by large companies. Examples include Google's partnership with the John Martinis group at UCSB,[19] multiple partnerships with the Canadian quantum computing company D-Wave Systems, and investment by many UK companies within the UK quantum technologies programme.

Continue reading here:

Quantum technology - Wikipedia

MIT Quantum Computing Online Courses for Professionals

Pioneering Quantum with IBM Q

MIT's quantum learning initiative is created in collaboration with IBM Q and the MIT-IBM Watson AI Lab. The MIT-IBM Watson AI Lab is focused on fundamental artificial intelligence (AI) research with the goal of propelling scientific breakthroughs that unlock the potential of AI. A key initiative of the lab is the intersection of quantum computing and machine learning.

IBM Q, which offers commercial universal quantum computing systems for businesses and sciences, has provided underwriting to produce this course. The Applications of Quantum Computing courses utilize IBM Q systems and technology, including the open source quantum software development kit Qiskit. MIT is solely responsible for all course content decisions.

Go here to see the original:

MIT Quantum Computing Online Courses for Professionals

IBM hits quantum computing milestone, may see ‘Quantum …

IBM is outlining another milestone in quantum computing -- its highest Quantum Volume to date -- and projects that practical uses, or so-called Quantum Advantage, may be a decade away.

Big Blue, which will outline the scientific milestone at the American Physical Society March Meeting, made a bit of a splash at CES 2019 with a display of its Q System One quantum computer and has been steadily showing progress on quantum computing.

In other words, that quantum computing buying guide for technology executives may take a while. Quantum Volume is a performance metric that indicates progress in the pursuit of Quantum Advantage. Quantum Advantage refers to the point where quantum applications deliver significant advantages over classical computers.

Also: Meet IBM's bleeding edge of quantum computing (CNET)

Quantum Volume is determined by the number of qubits, connectivity, and coherence time, plus accounting for gate and measurement errors, device cross talk, and circuit software compiler efficiency.

IBM said its Q System One, which has a 20-qubit processor, produced a Quantum Volume of 16, double the current IBM Q, which has a Quantum Volume of 8. IBM also said the Q System One has some of the lowest error rates IBM has measured.

That progress is notable, but practical broad use cases are still years away. IBM said Quantum Volume would need to double every year to reach Quantum Advantage within the next decade. Faster progress on Quantum Volume would speed up that timeline. IBM has doubled the power of its quantum computers annually since 2017.
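The projection is simple compounding, as the quick Python check below shows (reading "within the next decade" as 2029 is this article's framing, not an IBM-stated date):

    qv, year = 16, 2019   # the Q System One's Quantum Volume this year
    while year < 2029:
        qv, year = qv * 2, year + 1
    print(year, qv)       # doubling annually reaches a Quantum Volume of 16,384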

Once Quantum Advantage is hit, there would be new applications, more of an ecosystem and real business use cases. Consumption of quantum computing would still likely be delivered via cloud computing since the technology has some unique characteristics that make a traditional data center look easy. IBM made its quantum computing technology available in 2016 via a cloud service and is working with partners to find business and science use cases.

Here's how quantum computing and classical computing differ, via our recent primer on the subject.

Every classical electronic computer exploits the natural behavior of electrons to produce results in accordance with Boolean logic (for any two specific input states, one certain output state). Here, the basic unit of transaction is the binary digit ("bit"), whose state is either 0 or 1. In a conventional semiconductor, these two states are represented by low and high voltage levels within transistors.

In a quantum computer, the structure is radically different. Its basic unit of registering state is the qubit, which at one level also stores a 0 or 1 state (actually 0 and/or 1). Instead of transistors, a quantum computer obtains its qubits by bombarding atoms with electrical fields at perpendicular angles to one another, the result being to line up the ions but also keep them conveniently and equivalently separated. When these ions are separated by just enough space, their orbiting electrons become the home addresses, if you will, for qubits.

Link:

IBM hits quantum computing milestone, may see 'Quantum ...

Microsoft's quantum computing network takes a giant leap …

Microsoft is focusing on the development of quantum computers that take advantage of cryogenically cooled nanowires. (Microsoft Photo)

REDMOND, Wash. Quantum computing may still be in its infancy, but the Microsoft Quantum Network is all grown up, fostered by in-house developers, research affiliates, and future stars of the startup world.

The network made its official debut today here at Microsoft's Redmond campus, during a Startup Summit that laid out the company's vision for quantum computing and introduced network partners to Microsoft's tools of the quantum trade.

Quantum computing stands in contrast to the classical computer technologies that have held sway for more than a half-century. Classical computing is based on the ones and zeroes of bit-based processing, while quantum computing takes advantage of the weird effects of quantum physics. Quantum bits, or qubits, needn't represent a one or a zero, but can represent multiple states during computation.

The quantum approach should be able to solve computational problems that can't easily be solved using classical computers, such as modeling molecular interactions or optimizing large-scale systems. That could open the way to world-changing applications, said Todd Holmdahl, corporate vice president of Microsoft's Azure Hardware Systems Group.

"We're looking at problems like climate change," Holmdahl said. "We're looking at solving big food production problems. We think we have opportunities to solve problems around materials science, personal health care, machine learning. All of these things are possible and obtainable with a quantum computer. We have been talking around here that we're at the advent of the quantum economy."

Representatives from 16 startups were invited to this week's Startup Summit, which features talks from Holmdahl and other leaders of Microsoft's quantum team as well as demos and workshops focusing on Microsoft's programming tools. (The closest startup to Seattle is 1QBit, based in Vancouver, B.C.)

Over the past year and a half, Microsoft has released a new quantum-friendly programming language called Q# (Q-sharp) as part of its Quantum Development Kit, and has worked with researchers at Pacific Northwest National Laboratory and academic institutions around the world to lay the technical groundwork for the field.

A big part of that groundwork is the development of a universal quantum computer, based on a topological architecture that builds error-correcting mechanisms right into the cryogenically cooled, nanowire-based hardware. Cutting down on the error-producing noise in quantum systems will be key to producing a workable computer.

"We believe that our qubit equals about 1,000 of our competition's qubits," Holmdahl said.

There's lots of competition in the quantum computing field nowadays: IBM, Google and Intel are all working on similar technologies for a universal quantum computer, while Canada's D-Wave Systems is taking advantage of a more limited type of computing technology known as quantum annealing.

This week, D-Wave previewed its plans for a new type of computer topology that it said would reduce quantum noise and more than double the qubit count of its existing platform, from 2,000 linked qubits to 5,000.

But the power of quantum computing shouldn't be measured merely by counting qubits. The efficiency of computation and the ability to reduce errors can make a big difference, said Microsoft principal researcher Matthias Troyer.

For example, a standard approach to simulating the molecular mechanism behind nitrogen fixation for crops could require 30,000 years of processing time, he said. But if the task is structured to enable parallel processing and enhanced error correction, the required runtime can be shrunk to less than two days.

"Quantum software engineering is really as important as the hardware engineering," Troyer said.

Julie Love, director of Microsoft Quantum Business Development, said that Microsoft will start out offering quantum computing through Microsoft's Azure cloud-based services. Not all computational problems are amenable to the quantum approach: it's much more likely that an application will switch between classical and quantum processing, and therefore between classical tools such as the C# programming language and quantum tools such as Q#.

"When you work in chemistry and materials, all of these problems, you hit this known-to-be-unsolvable problem," Love said. "Quantum provides the possibility of a breakthrough."

Love shies away from giving a firm timetable for the emergence of specific applications, but last year Holmdahl predicted that commercial quantum computers would exist five years from now. (Check back in 2023 to see how the prediction panned out.)

The first applications could well focus on simulating molecular chemistry, with the aim of prototyping better pharmaceuticals, more efficient fertilizers, better batteries, more environmentally friendly chemicals for the oil and gas industry, and a new class of high-temperature superconductors. It might even be possible to address the climate change challenge by custom-designing materials that pull excess carbon dioxide out of the air.

Love said quantum computers would also be well-suited for addressing optimization problems, like figuring out how to make traffic flow better through Seattle's urban core, and for reducing the training time required for AI modeling.

"That list is going to continue to evolve," she said.

Whenever the subject of quantum computing comes up, cryptography has to be mentioned as well. It's theoretically possible for a quantum computer to break the codes that currently protect all sorts of secure transactions, ranging from email encryption to banking protocols.

Love said those code-breaking applications are farther out than other likely applications, due to the huge amount of computation resources that would be required even for a quantum computer. Nevertheless, it's not too early to be concerned. "We have a pretty significant research thrust in what's called post-quantum crypto," she said.

Next-generation data security is one of the hot topics addressed by the $1.2 billion National Quantum Initiative that was approved by Congress and the White House last December. Love said Microsoft's post-quantum crypto protocols have already gone through an initial round of vetting by the National Institute of Standards and Technology.

"We've been working at this in a really open way," she said.

Like every technology, quantum computing is sure to have a dark side as well as a bright side. But it's reassuring to know that developers are thinking ahead about both sides.

Excerpt from:

Microsoft's quantum computing network takes a giant leap ...

