
Category Archives: Quantum Computing

D-Wave partners with U of T to move quantum computing along – Financial Post

Posted: June 1, 2017 at 11:09 pm

Not even the greatest geniuses in the world could explain quantum computing.

In the early 1930s, in fact, Einstein called quantum entanglement, the basis for quantum computing, "spooky action at a distance."

Then there's a famous phrase from the late Nobel laureate in physics Richard Feynman: "If you think you understand quantum mechanics, then you don't understand quantum mechanics."

That may be so, but the mystery behind quantum has not stopped D-Wave Systems Inc. from making its mark in the field. "In the 1980s it was thought maybe quantum mechanics could be used to build a computer. So people started coming up with ideas on how to build one," says Bo Ewald, president of D-Wave in Burnaby, B.C.

Two of those people were UBC PhD physics grads Eric Ladizinsky and Geordie Rose, who happened to take an entrepreneurship course before founding D-Wave in 1999. Since there weren't a lot of businesses in the field, they created and collected patents around quantum, Ewald says.


While most who were exploring the concept were looking in the direction of what is called the universal gate model, D-Wave decided to work on a different architecture, called annealing. The two do not necessarily compete, but perform different functions.

In quantum annealing, algorithms quickly search over a space to find a minimum (or solution). The technology is best suited to tasks such as speeding up research, modelling, or traffic optimization.
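The annealing idea has a well-known classical analogue, simulated annealing, which makes the "search a space for a minimum" description concrete. Below is a minimal sketch in Python on a tiny, made-up QUBO (quadratic unconstrained binary optimization) problem; the matrix `Q` and all parameters are illustrative assumptions, not anything from D-Wave:

```python
import random, math

# Toy QUBO: minimize energy(x) = x^T Q x over binary vectors x.
# Q is an arbitrary, hypothetical example chosen for illustration.
Q = [[-1, 2, 0],
     [ 2, -1, 2],
     [ 0, 2, -1]]

def energy(x):
    n = len(x)
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

def anneal(steps=5000, t0=2.0):
    x = [random.randint(0, 1) for _ in range(len(Q))]
    best, best_e = x[:], energy(x)
    for k in range(steps):
        t = t0 * (1 - k / steps) + 1e-9   # slowly cool the "temperature"
        cand = x[:]
        cand[random.randrange(len(x))] ^= 1   # flip one bit at random
        delta = energy(cand) - energy(x)
        # Always accept improvements; accept uphill moves with
        # probability exp(-delta/t), so the search can escape local minima.
        if delta <= 0 or random.random() < math.exp(-delta / t):
            x = cand
            if energy(x) < best_e:
                best, best_e = x[:], energy(x)
    return best, best_e

random.seed(0)
sol, e = anneal()
print(sol, e)   # global minimum for this Q is x = [1, 0, 1] with energy -2
```

Quantum annealers attack the same class of minimization problems in hardware; the classical sketch above only illustrates the "settle into a low-energy state" intuition, not any quantum speedup.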

Universal gate quantum computing can put basic quantum circuit operations together to create any sequence to run increasingly complex algorithms. (There's a third model, called topological quantum computing, but it could be decades before it can be commercialized.)

When D-Wave sold its first commercial product to Lockheed Martin about six years ago, it marked the first commercial sale of a quantum computer, Ewald says. Google was the second to partner with D-Wave, for a system that is also being run by NASA Ames Research Center. "Each gets half of the machine," Ewald says. "They believed quantum computing had an important future in machine learning."

Most recently D-Wave has been working with Volkswagen to study traffic congestion in Beijing. They wanted to see if quantum computing would have applicability to their business, where there are lots of optimization problems. Another recent coup is a deal with the Los Alamos National Laboratory.

There's no question that any quantum computing investment is a long-term prospect, but that has not hindered the company's funding efforts. To date, it has raised more than 10 rounds of funding from the likes of PSP, Goldman Sachs, Bezos Expeditions, DFJ, In-Q-Tel, BDC Capital, GrowthWorks, Harris & Harris Group, International Investment and Underwriting, and Kensington Partners Ltd.

"What we have with D-Wave is the mother of all ships: that is, the hardware capability to unlock the future of AI," says Jérôme Nycz, executive vice-president, BDC Capital. "We believe D-Wave's quantum capabilities have put Canada on the map."

Now, Ewald says, the key for the company moving forward is getting more smart people working on apps and on software tools in the areas of AI, machine learning and deep learning.

To that end, D-Wave recently not only open-sourced its Qbsolv software tool, it launched an initiative with the Creative Destruction Lab at the University of Toronto's Rotman School of Management to create a new track focused on quantum machine learning. The intensive one-year program begins with an introductory boot camp led by Dr. Peter Wittek, author of Quantum Machine Learning: What Quantum Computing Means to Data Mining, with instruction and technical support from D-Wave experts, and access to D-Wave technology.

While it is still early days in terms of deployment for quantum computing, Ewald believes D-Wave's early start gives it a leg up if and when quantum hits the mainstream. So far, customers tend to be government- and/or research-related, with Google the notable exception. But once apps come along that are applicable to other industries, it will all make sense.

The early start has given D-Wave the experience to be able to adopt other architectures as they evolve. It may be a decade before a universal gate model machine becomes a marketable product. "If that turns out to be true, we will have a 10-year lead in getting actual machines into the field and having customers working on and developing apps."

Ewald is the first to admit that, as an early entrant, D-Wave faces criticism around its architecture. "There are a lot of spears and things that we tend to get in the chest. But we see them coming and can deal with it. If we can survive all that, we will have a better view of the market, real customers and relationships with accelerators like Creative Destruction Lab. At the end of the day we will have the ability to adapt when we need to."

Read the original post:

D-Wave partners with U of T to move quantum computing along - Financial Post

Posted in Quantum Computing | Comments Off on D-Wave partners with U of T to move quantum computing along – Financial Post

Tektronix AWG Pulls Test into Era of Quantum Computing – Electronic Design

Posted: at 11:09 pm

When a company calls and says they have the best widget ever, you have to be skeptical. However, you also can't help but be curious. When they talked about how it would advance the state of the art in radar, electronic warfare, and quantum-computing test, and make an engineer's workspace tidier, I was smitten.

I met up with the Tektronix team, led by Product Market Manager Kip Pettigrew, and wasn't disappointed: The new AWG5200 arbitrary waveform generator is a work of art and function. Physically, it's both commanding and imposing. It measures 18.13 × 6.05 in. from the front, but it's 23.76 in. deep, so while it'll sit nicely within a test stack and help reduce clutter, the stack had better have a deep shelf (Figs. 1 and 2).

It's what's within those dimensions, and what you have to pay to get it, though, that give the AWG5200 a certain level of gravitas. For sure, it's hard to ignore a price point of $82,000, but it's not surprising when you understand what you're getting in return.

1. The AWG5200 measures 18.13 × 6.05 in. and comes with a 6.5-inch touchscreen, a removable hard drive (upper right), and two, four, or eight channels (bottom right). (Source: Tektronix)

Aimed squarely at military/government and advanced research applications, the system emphasizes signal fidelity, scalability, and flexibility. It can accurately reproduce complex, real-world signals across an ever-expanding array of applications without having to physically expand a test area. It's also supported by Tektronix's SourceXpress software, which lets you create waveforms and control the AWGs remotely, and has a growing library of waveform-creation plugins.

2. The AWG5200 is designed to be compact so that it can stack easily with other equipment to reduce overall space requirements, though it is 23.76 inches deep. A synchronization feature allows it to scale up beyond eight channels by adding more AWG5200s. (Source: Tektronix)

Let the Specs Tell the Story

Digging into the specs uncovers what the AWG5200 is all about. Words like "powerful," "precision," and "solid engineering" come to mind. The system can sample at 5 Gsamples/s (10 Gsamples/s with interpolation) with 16-bit vertical resolution across two, four, or eight channels per unit. Channel-to-channel skew (typical) is <25 ps, with a range of ±2 ns and a resolution of 0.5 ps. The analog bandwidth is 2 GHz (at −3 dB) or 4 GHz (at −6 dB), and the amplitude range is 100 mV to 0.75 V p-p, with an accuracy of ±2% of setting.

The AWG5200's multi-unit synchronization feature helps scale up beyond eight channels. Note that each channel is independent, so the classic tradeoff of sample memory for bandwidth doesn't apply here. Each channel gets 2 Gsamples of waveform memory.

The precision is embodied in its ability to generate RF signals with a spurious-free dynamic range (SFDR) of 70 dBc. Combined with a software suite and support, this is critical as new waveforms and digital-modulation techniques are explored in a time of rapid wireless evolution in military and government applications, as well as 5G and even quantum-computer test. Signal fidelity isn't something you want to worry about, and the expanding library and customizable features help kickstart and then fine-tune your research and development waveforms.

How'd They Do That?

Achieving higher or improved specifications is almost always a labor of love: The test company's engineers' constant urge to make things better combines with customer feedback and an analysis of where to focus energy and development to have the most impact. However, at a fundamental level, the AWG5200's advances go back to the digital-to-analog converter (DAC) technology at the heart of the system.

Advances in DAC technologies, particularly with respect to signal processing and functional integration, allow them to directly generate detailed and complex RF and electronic-warfare (EW) signals. This is an area worth digging into in more detail, so Christopher Skach and Sahandi Noorizadeh developed a feature specially for Electronic Design on DAC technology advances and how they're changing signal generation for test. It's worth a look.

Rapidly Evolving Applications

Pettigrew also provided a quick run-through of the newer and more interesting applications, as well as the key market trends the system addresses. In general electronic test, "go wide" technologies like MIMO need test systems that can scale, as they require multiple, independent, wide-bandwidth RF streams (Fig. 3).

3. Rapid expansion in the use of techniques such as MIMO requires more advanced and flexible waveform generators to generate multiple high-fidelity RF signals with complex modulation schemes. (Source: Tektronix)

This translates over to mil/gov, too, where systems must be tested for their ability to detect and respond to adaptive threats. Signals of interest can be generated on two channels, while the others can be used to generate expected noise, Wi-Fi interferers, and other MIMO channels.

However, just being able to reproduce the signals isn't enough: The AWG must be capable of enabling stress and margin testing, as well as verification and characterization.1

On the research front, it turns out that quantum computing needs advanced AWGs, too, said Pettigrew, as existing solutions lack the necessary fidelity, latency, and scalability. In quantum computers, the qubits are often controlled using precision-pulsed microwave signals, each requiring multiple independent RF channels. This is only going to get more interesting and challenging as companies like IBM and Google, along with many independent physicists and engineers, work to scale up quantum-computing technology and applications.

For all three of these applications, cost remains a factor. So, instead of developing multiple custom solutions, the AWG5200 may be a good commercial off-the-shelf (COTS) option.

References:

1. How New DAC Technologies are Changing Signal Generation for Test

Continued here:

Tektronix AWG Pulls Test into Era of Quantum Computing - Electronic Design


Telstra just wants a quantum computer to offer as-a-service – ZDNet

Posted: at 11:09 pm

Due to the changes needed to algorithms and computational thinking, Telstra chief scientist Hugh Bradlow believes the first commercial users of quantum computers will need some help adjusting -- and the Australian incumbent telco will be there to offer that help at a price.

"I can assure you they are not going to walk in on day one and know how to use these things," Bradlow said on Wednesday evening.

"We want to be able to offer it as-a-service to them ... they will need a lot of hand holding, and they are not going to run the equipment themselves, it's complicated."

Telstra and the Commonwealth Bank of Australia (CBA) are two of the companies backing the work of a team at the University of New South Wales (UNSW) that is looking to develop quantum computing in silicon.

At the end of 2015, both companies contributed AU$10 million over five years to UNSW.

Despite racing against far better-funded rivals, the head of UNSW's quantum effort, professor Michelle Simmons, said she is happy with the funding the Centre of Excellence for Quantum Computation and Communication Technology has received.

"At the moment, you have to prove you have the best hardware of anything out there to know whether you are going to go further or not," Simmons said. "I guess one of the things we've been very much driven by is milestone-based research.

"Can we actually develop the qubits, qubit by qubit, and prove that they are better than other qubits that out there? And so if you have lots of money in the beginning, and you are not doing that systematic thorough approach, it's actually not that helpful to you. You have to do it, proving it along the way."

Simmons said her team is currently looking at producing a 10-qubit system by the end of the decade, and, if successful, will be looking to move up to 100 qubits.

In October last year, the UNSW team announced that they had created a new qubit that remains in a stable superposition for 10 times longer than previously achieved.

A year earlier, the team built the first 2-qubit logic gate in silicon.

"The prototype chip we want to make within five years is a pretty shrinkable manufacturing process, and it will be able to perform a variety of calculations; we hope it will be able to potentially solve the problem that currently can't be solved on an existing computer," Andrew Dzurak, scientia professor at the university, said at the time.

"That particular type of problem may not be the sort of problem that is going to excite many commercial people in the first instance, but it will be an important principal."

Even though UNSW is at the frontier of quantum computing, however, Bradlow said Telstra just wants to get its hands on one.

"We are agnostic at the end of the day; we just want a quantum computer," he said. "We do hope Michelle's team wins ... we've gone and put our money on it because we think it's got the best odds, so it's not just a random bet, but we are obviously keeping across anything that is out there.

"Over the last year and a half, I've probably visited every major group in the world, and they all have very different views and by seeing multiple views you get a much better perspective.

"So it's important to keep across everything."

For its part, CBA is preparing for a quantum future by using a quantum computing simulator from QxBranch.

"The difference between the emulator of a quantum computer and the real hardware is that we run the simulator on classical computers, so we don't get the benefit of the speed up that you get from quantum, but we can simulate its behaviour and some of the broad characteristics of what the eventual hardware will do," QxBranch CEO Michael Brett told ZDNet in April.

"What we provide is the ability for people to explore and validate the applications of quantum computing so that as soon as the hardware is ready, they'll be able to apply those applications and get the benefit immediately of the unique advantages of quantum computing."

View post:

Telstra just wants a quantum computer to offer as-a-service - ZDNet


Toward mass-producible quantum computers | MIT News – MIT News

Posted: at 11:09 pm

Quantum computers are experimental devices that offer large speedups on some computational problems. One promising approach to building them involves harnessing nanometer-scale atomic defects in diamond materials.

But practical, diamond-based quantum computing devices will require the ability to position those defects at precise locations in complex diamond structures, where the defects can function as qubits, the basic units of information in quantum computing. In today's issue of Nature Communications, a team of researchers from MIT, Harvard University, and Sandia National Laboratories reports a new technique for creating targeted defects, which is simpler and more precise than its predecessors.

In experiments, the defects produced by the technique were, on average, within 50 nanometers of their ideal locations.

"The dream scenario in quantum information processing is to make an optical circuit to shuttle photonic qubits and then position a quantum memory wherever you need it," says Dirk Englund, an associate professor of electrical engineering and computer science who led the MIT team. "We're almost there with this. These emitters are almost perfect."

The new paper has 15 co-authors. Seven are from MIT, including Englund and first author Tim Schröder, who was a postdoc in Englund's lab when the work was done and is now an assistant professor at the University of Copenhagen's Niels Bohr Institute. Edward Bielejec led the Sandia team, and physics professor Mikhail Lukin led the Harvard team.

Appealing defects

Quantum computers, which are still largely hypothetical, exploit the phenomenon of quantum superposition, or the counterintuitive ability of small particles to inhabit contradictory physical states at the same time. An electron, for instance, can be said to be in more than one location simultaneously, or to have both of two opposed magnetic orientations.

Where a bit in a conventional computer can represent zero or one, a qubit, or quantum bit, can represent zero, one, or both at the same time. Its the ability of strings of qubits to, in some sense, simultaneously explore multiple solutions to a problem that promises computational speedups.
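That "zero, one, or both" behaviour is easy to sketch numerically: a single qubit is just a two-component state vector, and a Hadamard gate puts it into an equal superposition. A minimal illustration in plain NumPy (no quantum hardware or vendor API assumed):

```python
import numpy as np

# A qubit state |psi> = alpha|0> + beta|1> is a length-2 complex vector.
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)   # Hadamard gate
zero = np.array([1.0, 0.0])            # the |0> state

psi = H @ zero                # equal superposition (1/sqrt(2), 1/sqrt(2))
probs = np.abs(psi) ** 2      # Born rule: probabilities of measuring 0 or 1
print(probs)                  # [0.5 0.5] -- "both at once" until measured
```

Measuring such a qubit collapses it to 0 or 1, each with probability 0.5 here; the computational power comes from interference between many such amplitudes, not from the superposition of one qubit alone.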

Diamond-defect qubits result from the combination of "vacancies," which are locations in the diamond's crystal lattice where there should be a carbon atom but there isn't one, and "dopants," which are atoms of materials other than carbon that have found their way into the lattice. Together, the dopant and the vacancy create a dopant-vacancy center, which has free electrons associated with it. The electrons' magnetic orientation, or spin, which can be in superposition, constitutes the qubit.

A perennial problem in the design of quantum computers is how to read information out of qubits. Diamond defects present a simple solution, because they are natural light emitters. In fact, the light particles emitted by diamond defects can preserve the superposition of the qubits, so they could move quantum information between quantum computing devices.

Silicon switch

The most-studied diamond defect is the nitrogen-vacancy center, which can maintain superposition longer than any other candidate qubit. But it emits light in a relatively broad spectrum of frequencies, which can lead to inaccuracies in the measurements on which quantum computing relies.

In their new paper, the MIT, Harvard, and Sandia researchers instead use silicon-vacancy centers, which emit light in a very narrow band of frequencies. They don't naturally maintain superposition as well, but theory suggests that cooling them down to temperatures in the millikelvin range (fractions of a degree above absolute zero) could solve that problem. (Nitrogen-vacancy-center qubits require cooling to a relatively balmy 4 kelvins.)

To be readable, however, the signals from light-emitting qubits have to be amplified, and it has to be possible to direct them and recombine them to perform computations. That's why the ability to precisely locate defects is important: It's easier to etch optical circuits into a diamond and then insert the defects in the right places than to create defects at random and then try to construct optical circuits around them.

In the process described in the new paper, the MIT and Harvard researchers first planed a synthetic diamond down until it was only 200 nanometers thick. Then they etched optical cavities into the diamonds surface. These increase the brightness of the light emitted by the defects (while shortening the emission times).

Then they sent the diamond to the Sandia team, who have customized a commercial device called the Nano-Implanter to eject streams of silicon ions. The Sandia researchers fired 20 to 30 silicon ions into each of the optical cavities in the diamond and sent it back to Cambridge.

Mobile vacancies

At this point, only about 2 percent of the cavities had associated silicon-vacancy centers. But the MIT and Harvard researchers have also developed processes for blasting the diamond with beams of electrons to produce more vacancies, and then heating the diamond to about 1,000 degrees Celsius, which causes the vacancies to move around the crystal lattice so they can bond with silicon atoms.

After the researchers had subjected the diamond to these two processes, the yield had increased tenfold, to 20 percent. In principle, repetitions of the processes should increase the yield of silicon-vacancy centers still further.

When the researchers analyzed the locations of the silicon-vacancy centers, they found that they were within about 50 nanometers of their optimal positions at the edge of the cavity. That translated to emitted light that was about 85 to 90 percent as bright as it could be, which is still very good.

"It's an excellent result," says Jelena Vučković, a professor of electrical engineering at Stanford University who studies nanophotonics and quantum optics. "I hope the technique can be improved beyond 50 nanometers, because 50-nanometer misalignment would degrade the strength of the light-matter interaction. But this is an important step in that direction. And 50-nanometer precision is certainly better than not controlling position at all, which is what we are normally doing in these experiments, where we start with randomly positioned emitters and then make resonators."

Originally posted here:

Toward mass-producible quantum computers | MIT News - MIT News


Purdue, Microsoft Partner On Quantum Computing Research | WBAA – WBAA

Posted: at 11:09 pm

Purdue researchers are partnering with Microsoft and scientists at three other universities around the globe to determine whether they've found a way to create a stable form of what's known as quantum computing.

A new five-year agreement aims to build a type of system that could perform computations that are currently impossible in a short timespan, even for supercomputers.

Purdue physics and astronomy professor Michael Manfra is heading up the West Lafayette team, which will work with Microsoft scientists and university colleagues in Australia, the Netherlands and Denmark to construct, manipulate and strengthen tiny building blocks of information called "topological qubits."

"The real win that topological quantum computing suggests is that if you devise your system in which you store your information cleverly enough, you can make the qubit insensitive, basically deaf, to the noise that's all around it in the environment," Manfra says.

He says that deafness is important because of what's held quantum computing back: the ease with which it's disturbed.

"It can interact with photons, electromagnetic fields. It can interact with vibrations of the lattice. And those interactions, what they can do is cause a decoherence of that qubit, basically cause it to lose the stored information."

Manfra says it's an open question whether quantum computing will ever overtake the current zeroes-and-ones system of information storing, but he says he's interested in either proving or disproving the concept.

Read the original here:

Purdue, Microsoft Partner On Quantum Computing Research | WBAA - WBAA


AI and Quantum Computers Are Our Best Weapons Against Cyber Criminals – Futurism

Posted: May 30, 2017 at 3:04 pm

In Brief: Major companies like IBM are turning to artificial intelligence and quantum computing to protect against cyber attacks. While these technologies aren't silver bullets, they are essential tools for cyber security in the age of the Internet of Things.

Weapons of Cyber Warfare

Cyber security has become a key issue in our national and international discussions. No longer do cyber attacks concern only email companies and individuals who are unwilling to update their tech. Now, cyber crime has had a major impact on both U.S. mainstream political parties, and almost any organization even hospitals should have some concern about the possibility of an attack through a computer network.

In their struggle to fight cyber crime, major companies like IBM are turning to two of the worlds most powerful technologies artificial intelligence (AI) and quantum computing.

IBM's AI, Watson, helps human analysts sift through the 200,000 or so security events the company has to deal with on a day-to-day basis. It helps determine which events don't require special attention, such as instances when an employee forgets their password, and which should receive more scrutiny.

"Before artificial intelligence, we'd have to assume that a lot of the data, say 90 percent, is fine. We only would have bandwidth to analyze this 10 percent," Daniel Driver from Chemring Technology Solutions, a provider of electronic warfare technology, said in an interview with the Financial Times. "The AI mimics what an analyst would do, how they look at data ... It's doing a huge amount of legwork upfront, which means we can focus our analysts' time."

Watson is about 60 times faster than its human counterparts, and speed is key for defending against cyber attacks. But even Watson's impressive rates pale in comparison to those that can be attained with quantum computers.

"The analogy we like to use is that of a needle in a haystack," Driver said in the interview. "A machine can be specially made to look for a needle in a haystack, but it still has to look under every piece of hay. Quantum computing means, 'I'm going to look under every piece of hay simultaneously and find the needle immediately.'"
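The haystack analogy corresponds to Grover's search algorithm, which finds a marked item among N in roughly √N steps rather than the ~N/2 a classical search needs (not literally "simultaneously," but a quadratic speedup). A small statevector simulation sketches the idea; the register size and marked index below are arbitrary choices for illustration:

```python
import numpy as np

# Grover search on N = 2**n items with one marked item: after about
# (pi/4) * sqrt(N) iterations, measurement finds the marked item with
# high probability. This is a direct classical simulation of the state.
n = 6                        # 6 qubits -> N = 64 "pieces of hay"
N = 2 ** n
marked = 42                  # index of the "needle" (arbitrary)

state = np.full(N, 1 / np.sqrt(N))        # uniform superposition
iters = int(round(np.pi / 4 * np.sqrt(N)))
for _ in range(iters):
    state[marked] *= -1                   # oracle: phase-flip the needle
    state = 2 * state.mean() - state      # diffusion: invert about the mean

print(iters, state[marked] ** 2)          # 6 iterations, success prob > 0.99
```

Six iterations versus an expected thirty-two classical probes at N = 64; the gap widens as √N versus N/2 for larger haystacks.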

While these technologies are not silver bullets against cyber attacks, they are becoming vital tools in the cyber security industry, which is projected to grow from $74 billion last year to $100 billion in 2020. Part of this growth may be attributed to society's increasing reliance on the Internet of Things (IoT). As everything from light bulbs to our jackets becomes digitally accessible, every person should be more concerned about cyber security.

As we continue to see advancements in both AI and quantum computing technologies, more businesses and households will have access to these protective tools. AIs are already finding their places in different sectors of our society including healthcare. Perhaps in addition to diagnosing medical images, AIs can also protect hospitals from future cyber attacks.

Read more here:

AI and Quantum Computers Are Our Best Weapons Against Cyber Criminals - Futurism


For more advanced computing, technology needs to make a … – CIO Dive

Posted: at 3:04 pm

Traditional binary data can be only a 0 or a 1 at any given time. But what if data could be both a 0 and a 1 at the same time?

It sounds simple, but such a change would enable exponentially faster computation. Add specialized hardware, and users of this new type of computing power could perform faster analytics and predictions, empowering advances in a broad range of areas including cybersecurity, fraud detection and early disease detection, to name a few.
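One way to make the "exponentially faster" claim concrete: describing n qubits classically takes 2**n complex amplitudes, which is why classical simulation (and hence classical imitation of quantum speedups) becomes infeasible quickly. A rough back-of-the-envelope sketch, assuming 16 bytes per double-precision complex amplitude:

```python
# Memory needed to store the full state of an n-qubit register classically.
# Assumes complex128 (16 bytes per amplitude) -- an illustrative estimate.
for n in (10, 20, 30):
    amps = 2 ** n                 # number of complex amplitudes
    mem = amps * 16 / 1e9         # gigabytes
    print(f"{n} qubits: {amps} amplitudes, {mem:.3f} GB")
```

At 30 qubits the state already needs roughly 17 GB; every additional qubit doubles it, which is the sense in which a quantum register explores an exponentially large space.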

It may sound like science fiction, but it's actually all quite possible with quantum computing. Quantum computing has the potential to tackle large mathematical problems supercomputers can't even touch. And while it's long been limited to science fiction, quantum computing is becoming reality, and quite quickly.

The primary hurdle scientists are working to overcome today has to do with the scalability of quantum, which is limited by the extreme cooling required to keep quantum bits (qubits) stable and to keep the equipment required to read and write quantum data working. Currently, qubits must be cooled to cryogenic temperatures to preserve quantum states.

But getting the cooling issue and other challenges around quantum computing figured out could come with huge rewards. Quantum computing has the potential to unleash promising, data-laden applications that include artificial intelligence, Big Data and the Internet of Things.

Though companies like Google, IBM and Intel are developing new semiconductor chips specifically to handle high-speed machine learning, quantum computing has the potential to take things to a whole new level.

Its still in its infancy, but quantum science is attracting big investments, and those investments have the potential to push quantum forward more quickly.

"A growing number of enterprises are already committing resources to exploring how to apply quantum computing,"said David Schatsky, managing director at Deloitte. "The stakes appear to be too high to ignore this still-nascent technology."

Quantum attracted $147 million in venture capital in the last three years alone, and $2.2 billion in government funding globally, according to a Deloitte analysis, based on CB Insights data. The advanced computing is no longer confined to academic research labs and start-up companies. Big tech companies are also placing bets that quantum will soon drive innovation across industries.

Google recently announced that it plans to produce a viable quantum computer in the next five years, and startups Rigetti and D-Wave as well as established players like Microsoft and Intel have been investing in quantum too. Intel recently invested $50 million to help develop the technology.

Until there is a breakthrough to make quantum computers viable onsite and allow companies to fully build quantum computing into their operations, some companies are working to offer Quantum as a Service.

Earlier this year, IBM announced it wants to start providing enterprise partners with access to quantum computing systems as soon as this year. Dubbed "IBM Q," the quantum systems and services will be delivered via the IBM Cloud platform and are "designed to tackle problems that are too complex and exponential in nature for classical computing systems to handle," according to the announcement.

IBM also announced the release of a new API that will allow developers and programmers to build interfaces between its cloud-based quantum computer and classical computers. Big Blue plans to release a quantum-related Software Development Kit later this year.

Once it becomes reality, there are many ways quantum computing or quantum services could be employed in the enterprise. Schatsky recently examined how quantum computing could impact a broad range of industries ranging from finance to life sciences to manufacturing as part of his "Signals for Strategists" blog on Deloitte University Press.

For example, Schatsky points out that financial institutions such as Barclays and Goldman Sachs are investigating the use of quantum computing in areas such as "portfolio optimization, asset pricing, capital project budgeting and data security." Other organizations are exploring applications in logistics, aerospace, industrial chemistry and energy.

"For instance, the standard process for manufacturing fertilizer uses some 2% to 5% of global natural gas production each year," wrote Schatsky. "Quantum simulation could lead to the discovery of a more efficient process that could save billions of dollars and trillions of cubic feet of natural gas annually."

Of course, quantum computing also presents risks, because all that power could potentially be used for malicious purposes. Imagine a cyberattack using quantum computing, for example.

"When quantum computing becomes a reality, many public-key algorithms will become obsolete," said Kevin Curran, an IEEE senior member and senior lecturer in Computer Science at the University of Ulster.

Fortunately, Curran says, even as scientists work to make quantum computing a reality, cryptographers are creating new algorithms to prepare for the day when quantum computing poses a genuine threat.

"Technologies such as quantum key distribution will provide us with a means to communicate securely, while post-quantum cryptography will ensure that our encrypted data remains safe, even during brute-force attacks by a quantum computer,"said Curran. "But the threat that quantum computing poses is to the security of public key algorithms. Most symmetric cryptographic algorithms (symmetric ciphers and hash functions) are believed to be relatively secure against attacks by quantum computers."

See the original post:

For more advanced computing, technology needs to make a ... - CIO Dive


Microsoft, Purdue Extend Quantum Computing Partnership To Create More Stable Qubits – Tom’s Hardware

Posted: at 3:04 pm

Purdue University announced that its partnership with Microsoft on quantum computing projects has been extended by several years. The collaboration is supposed to help both the college and the company bring quantum computers out of the laboratory and into the real world.

It's safe to say that quantum computing is something of an obsession for top universities and businesses alike. Just look at some of the stories from the last few months: Stanford University is researching materials that could make quantum computing more feasible, Google is trying to bring a quantum computer to market within the next few years, and IBM recently leapfrogged the competition by revealing 16- and 17-qubit computers. (A qubit is the quantum equivalent of a bit--the main difference is that qubits aren't binary; they can have a value of 0, 1, or a superposition of both.)
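That "0, 1, or both" idea can be made concrete with a little arithmetic. A minimal sketch, assuming the standard amplitude description of a qubit (the helper name is our own): a qubit's state is a pair of complex amplitudes, and measurement probabilities are their squared magnitudes.

```python
import math

def measure_probs(a: complex, b: complex):
    """Probabilities of reading 0 or 1 from a qubit with amplitudes (a, b),
    where |a|**2 + |b|**2 = 1."""
    return abs(a) ** 2, abs(b) ** 2

# A classical bit is always (1, 0) or (0, 1); a qubit can sit in between.
equal = (1 / math.sqrt(2), 1 / math.sqrt(2))  # "0 and 1 at once"
p0, p1 = measure_probs(*equal)
print(round(p0, 3), round(p1, 3))  # 0.5 0.5
```

The equal-amplitude state above reads as 0 half the time and 1 half the time, which is what "not binary" means in practice.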

Microsoft is (clearly) also interested in quantum computing. Besides the partnership with Purdue, the company recently told reporters gathered at its Redmond campus that its researchers are working hard to make a quantum computer. But the company doesn't want to go it alone, and with this extension of its partnership with Purdue, it's reaffirmed its desire to be part of the race to quantum computing. Here's what Microsoft researcher Michael Freedman said about quantum computing in today's announcement:

There is another computing planet out there, and we, collectively, are going to land on it. It really is like the old days of physical exploration and much more interesting than locking oneself in a bottle and traveling through space. We will find an amazing unseen world once we have general purpose programmable quantum computers.

Freedman also said that Purdue generally, and Michael Manfra specifically, will be "a key collaborator on this journey." Manfra's full title is as follows: Purdue University's Bill and Dee O'Brien Chair Professor of Physics and Astronomy, Professor of Materials Engineering and Professor of Electrical and Computer Engineering. (We're pretty sure that title alone is longer than some college students' papers.) Freedman said that Manfra and his team's work on materials science and transport physics will help Microsoft "build the systems we will use to do quantum computing."

The partnership will specifically focus on creating a "topological qubit." We'll let Manfra explain that bit:

"One of the challenges in quantum computing is that the qubits interact with their environment and lose their quantum information before computations can be completed," Manfra says. "Topological quantum computing utilizes qubits that store information 'non-locally' and the outside noise sources have less effect on the qubit, so we expect it to be more robust."

The idea, then, is to make quantum computing more stable. Microsoft actually said that was its plan in November 2016, shortly after the University of New South Wales revealed that it had created "dressed qubits" that were 10 times more stable than their predecessors. Chances are good that Purdue wants to best that figure--academia is a cutthroat world--and Microsoft can quickly benefit from those efforts. A stable qubit is a useful qubit; an unstable one isn't likely to help Microsoft lead us to quantum computing like a technological Christopher Columbus.

See the original post here:

Microsoft, Purdue Extend Quantum Computing Partnership To Create More Stable Qubits - Tom's Hardware


Doped Diamonds Push Practical Quantum Computing Closer to Reality – Motherboard

Posted: May 28, 2017 at 8:17 am

A large team of researchers from MIT, Harvard University, and Sandia National Laboratories has scored a major advance toward building practical quantum computers. The work, which is described in the current issue of Nature Communications, offers a new pathway toward using diamonds as the foundation for optical circuits--computer chips based on manipulating light rather than electric current, basically.

Pushing beyond the quantum computing hype and, perhaps, misinformation, we're still faced with a largely theoretical technology. Engineering a real quantum computer is hard because it should be hard. What we're attempting to do is harness a strange and exceedingly fragile property of the quantum world, which is the ability of particles to occupy seemingly contradictory physical states: up and down, left and right, is and isn't.

If we could just have that property in the same sense that we can have a basic electronic component like a transistor, we'd be set. But maintaining and manipulating qubits, the units of information consisting of simultaneous contradictory particle states, is really hard. Just looking at a quantum system means disrupting it, and if that system happens to be encoding information, the information is lost.

The almost-perfect lattice structure of atoms in a diamond offers a promising foundation for a quantum circuit. Here, a qubit is stored within a "defect" in the diamond lattice. Every so often within the neatly ordered confines of a diamond, a carbon atom will be missing. Into this vacancy, another atom might sneak in to replace the missing carbon. This diamond defect may in turn have some free electrons associated with it, and it's among these particles that information is stored (while information is transmitted around the diamond as photons, or light particles).

Crucially, this little swarm of electrons naturally emits light particles that are able to mirror the quantum superposition (the particle or particle system in multiple states). This is then a way of retrieving information from the qubit without disturbing it.

The challenge is in finding and implementing the ideal replacement for the carbon atom in the diamond lattice. This replacement is known as a dopant. This is where the new study comes in.

The most-studied dopant for diamond-defect optical circuits is nitrogen. It's stable enough to maintain the requisite quantum superposition, but is limited in the frequencies of light that it can emit. It's like having a perfect encryption system that can nonetheless only represent like a quarter of the alphabet.

The dopant explored in the new research is silicon. Silicon atoms embedded into a diamond lattice are able to emit much narrower wavelength bands. It's like they have a higher resolution. But the cost of being able to represent information with more precision is more precarious quantum states. Consequently, the diamonds have to be kept very near absolute-zero temperature. Nitrogen states, meanwhile, can withstand heat up to about four degrees above absolute zero. In either case, we're not exactly talking about quantum laptops.

The researchers were able to implant silicon defects into diamonds via a two-step process involving first blasting the diamond with a laser to create vacancies and then heating the diamond way up to the point that the vacancies start to move around the lattice and bond with silicon atoms. The result is a lattice with an impressively large number of embedded silicon atoms that are exactly where they should be within the structure.

The result is a promising pathway toward reliable fabrication of "efficient light-matter interfaces based on semiconductor defects coupled to nanophotonic devices." The stuff of a quantum computer, in other words.

See the original post here:

Doped Diamonds Push Practical Quantum Computing Closer to Reality - Motherboard


IBM to Sell Use of Its New 17-Qubit Quantum Computer over the Cloud – All About Circuits

Posted: at 8:17 am

IBM has created a 17-qubit quantum computer and is making plans to timeshare the machine with other companies via cloud computing. While this is an important step, it isn't quite enough to make quantum computers truly competitive compared to supercomputers. What will it take to bring quantum computing into the commercial realmand how long until we get there?

Classical computing has been around for many years and has completely transformed the human race. Near-instant communication between any two individuals used to be a dream. The idea of large calculations being done faster than you can blink was unimaginable. The concept of free information and education was too much for any university to handle.

But it comes as no surprise that, now that these concepts are a reality, we've become dependent on them. This dependence places pressure on the industry to produce more powerful devices with every passing year. This was not an issue in the past, since silicon devices were easy to scale down. But, with transistor gates as small as one atom thick, shrinking may no longer be possible. Silicon, the building block of modern semiconductors, is already being phased out by Intel, and future devices with feature sizes of 7nm and smaller will instead be made from materials such as indium gallium arsenide (InGaAs).

One solution for increasing computational power is the use of quantum computers (though their creation isn't likely to allow for faster consumer devices). Common applications rely on control flow, discrete mathematics, and IO handling. A quantum computer, however, is designed to solve statistical problems and scenarios which involve large amounts of data. The best way to understand it is to compare a classical processor (such as an i7) to an imaginary quantum processor (an iQ7, for example). The i7 could add 1000 numbers together much faster than the iQ7, but the iQ7 could solve a game (such as checkers) much faster than the i7 due to the sheer number of possible moves that the game possesses.

So why are quantum computers so good at parallel data crunching?

A classical computer is made up of transistors, which handle two possible states: on (1) and off (0). The number of states that can be represented is equal to 2^n, where n is the number of bits. For example, four bits can represent one of 16 possible states and eight bits can represent one of 256 possible states.

By comparison, a quantum bit, or qubit, can hold three kinds of states: on (1), off (0), and a superposition of the two. While the on and off states behave in an identical manner to classical bits, the superposition is what drives quantum computation. A superposition is not a definite value but a weighting of 0 and 1 at once, allowing four qubits to represent all 16 different states at the same time, where each of those 16 states has a complex amplitude reflecting its probability of being observed.
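The state-space arithmetic above can be sketched directly. This is a minimal illustration (the function name is our own): an n-qubit register in equal superposition is a vector of 2^n complex amplitudes whose squared magnitudes sum to 1.

```python
import math

def uniform_register(n: int):
    """State vector of n qubits in equal superposition: 2**n complex
    amplitudes, each observed with probability 1 / 2**n."""
    dim = 2 ** n
    amp = complex(1 / math.sqrt(dim), 0)
    return [amp] * dim

state = uniform_register(4)         # four qubits...
print(len(state))                   # ...span 16 basis states: 16
print(round(sum(abs(a) ** 2 for a in state), 6))  # probabilities sum to 1.0
```

Each additional qubit doubles the length of this vector, which is the exponential scaling the article is gesturing at.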


So it's pretty obvious that quantum computing provides many advantages over classical computers for complex, parallel data processing. While such tasks are not commonly found in everyday devices, they are commonplace across many industries, including financial data processing, insurance, scientific modelling, oil-reserve analysis, and research. Currently, supercomputers are used for such parallel data processing, but if a quantum option were available, it's a safe bet that each of these sectors would do anything to get one.

This has been one of the major drives in quantum computer technology, with many companies trying to produce such a machine. For example, D-Wave Systems has its series of specialized quantum annealing processors, while many other researchers and companies are trying to find methods of producing universal quantum gates.

However, IBM has just taken the lead with their 17-qubit quantum computer.

What makes the IBM quantum computer a game changer is that it is a universal quantum computer as opposed to being a highly specialized device. Many other quantum systems currently available are usually of the annealing persuasion, which is good for optimization problems but not for other quantum problems such as database searches. The IBM machine, however, can be configured to execute just about any quantum problem.

IBM has decided to sell time on the computers to business and researchers alike through their IBM Q program accessed via the internet (i.e., over the cloud). This will allow developers and researchers to create a quantum program anywhere around the world and then have it executed with the press of a button.

IBM made strides with its previous 5-qubit quantum computer, and this 17-qubit machine is obviously yet another milestone. However, many say that even a 17-qubit computer is not good enough, because classical computers can still process the same information in less time. In fact, it has been stated that classical computers can simulate quantum computers up to 50 qubits in size. This means that, for a quantum computer to become better at solving quantum-related problems than a classical computer, it has to contain at least 50 qubits. Of course, this assumes that such quantum-computer simulations on classical computers do not improve.
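The roughly-50-qubit crossover follows from simple memory arithmetic. A sketch, assuming the common convention of 16 bytes per double-precision complex amplitude: a brute-force simulation must store 2^n amplitudes, which crosses into petabyte territory around n = 50.

```python
def statevector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    """Memory needed to store the full state of n qubits: one complex
    amplitude per basis state, 2**n basis states in total."""
    return (2 ** n_qubits) * bytes_per_amplitude

# 17 qubits fit on a laptop; 50 qubits need petabytes -- hence the threshold.
print(statevector_bytes(17))  # 2097152 bytes (2 MiB)
print(statevector_bytes(30))  # 2**34 bytes (16 GiB)
print(statevector_bytes(50))  # 2**54 bytes (16 PiB)
```

Clever simulation techniques can beat this naive bound, which is exactly the caveat in the last sentence above.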

So Google is ambitiously planning to release a 49-qubit quantum computer by the end of this year. Considering the size difference between the IBM machine and the proposed Google machine, however, it's likely safe to assume that Google's machine may not be entirely universal.

It's safe to say that quantum computers, despite becoming increasingly more powerful, are still very far away from being commercially available. IBM's cloud-based scheme, however, does technically place quantum computing into the commercial realm.

Supercomputers are still very powerful compared to quantum computers and their cost-to-performance ratio makes them highly economical. But, unlike fusion power (which is always 20 years away), quantum computers really could make their debut when either IBM or Google release the world's first 50-qubit computer.

Here is the original post:

IBM to Sell Use of Its New 17-Qubit Quantum Computer over the Cloud - All About Circuits

