Can Quantum Computing Be the New Buzzword – Analytics Insight

Quantum mechanics wrote its chapter in the history of the early 20th century. With its classical binary computing twin going out of style, quantum mechanics has made quantum computing the new belle of the ball. While the memory used in a classical computer encodes binary bits, ones and zeros, quantum computers use qubits (quantum bits). A qubit is not confined to two states: it can also exist in superposition, i.e., it can represent 0, 1, or both 0 and 1 at the same time.

Hence a quantum computer can perform many calculations in parallel, pursuing simultaneous probabilities through superposition and manipulating them, for example with magnetic fields. A qubit's coefficients, the amplitudes that say how much "zero-ness" and "one-ness" it has, are complex numbers, carrying both a real and an imaginary part. This provides a huge technical edge over conventional computing. The beauty of this is that if you have n qubits, you can hold a superposition of 2^n states simultaneously.
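As a toy illustration of that 2^n scaling, here is a classical simulation in plain Python (the function name is ours; real quantum hardware is nothing like a list of floats, but the bookkeeping is the same):

```python
import math

def uniform_superposition(n):
    """State vector of n qubits in an equal superposition:
    2**n complex amplitudes, all equal in magnitude."""
    dim = 2 ** n
    amp = 1 / math.sqrt(dim)
    return [complex(amp, 0)] * dim

state = uniform_superposition(3)
print(len(state))  # 8: three qubits span 2**3 basis states at once
print(round(sum(abs(a) ** 2 for a in state), 6))  # 1.0: probabilities sum to one
```

Doubling the qubit count squares the number of amplitudes, which is why classical simulation of large quantum systems becomes intractable so quickly.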

Another magic up its sleeve is that qubits are capable of pairing, which is referred to as entanglement. Here, the state of one qubit cannot be described independently of the state of the others, so measurement outcomes on entangled qubits are perfectly correlated however far apart they are (though, contrary to a common misconception, this cannot be used for faster-than-light communication).
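A toy classical simulation of those correlations (illustrative only; `measure_bell_pair` is a hypothetical helper, and entangled correlations in general cannot be reproduced by a classical model, only this simplest case can):

```python
import random

def measure_bell_pair():
    """Sample a joint measurement of the Bell state (|00> + |11>)/sqrt(2):
    only |00> and |11> have nonzero amplitude, each with probability 1/2,
    so the two qubits always agree."""
    outcome = random.choice([0, 1])
    return outcome, outcome

results = [measure_bell_pair() for _ in range(1000)]
print(all(a == b for a, b in results))  # True: outcomes are perfectly correlated
```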

To quote American theoretical physicist John Wheeler, "If you are not completely confused by quantum mechanics, you do not understand it." So, without a doubt, it is safe to say that even quantum computing has a few pitfalls. First, qubits tend to lose the information they contain, and also lose their entanglement with one another; in other words, decoherence. Second, quantum rotations are imperfect. Together, these lead to a loss of information within a few microseconds.

Ultimately, quantum computing is the trump card: it promises to be a disruptive technology with dramatic speed improvements. This will enable systems to solve complex higher-order mathematical problems that once took months to compute, investigate material properties, design new materials, study superconductivity, aid drug discovery via simulation, and illuminate new chemical reactions.

This quantum shift in the history of computer science can also pave the way for encrypted communication (as quantum keys cannot be copied or hacked), far beyond blockchain technology; provide improved designs for solar panels; predict financial markets; accelerate big data mining; lift artificial intelligence to new heights; enhance meteorological forecasting; and usher in a much-anticipated age of the quantum internet. According to scientists, future advances could even help find a cure for Alzheimer's.

The ownership and effective employment of a quantum computer could change the political and technological dynamics of the world. Computing power, in the end, is power, whether personal, national, or globally strategic. In short, a quantum computer could be an existential threat to a nation that hasn't got one. At the moment Google, IBM, Intel, and D-Wave are pursuing this technology. While there are scientific minds who don't yet believe in the potential of quantum computing, unless you are a time traveler like Marty McFly in the Back to the Future series or one of the Doctors from Doctor Who, no one can say what the future holds.


D-Wave makes its quantum computers free to anyone working on the coronavirus crisis – VentureBeat

D-Wave today made its quantum computers available for free to researchers and developers working on responses to the coronavirus (COVID-19) crisis. D-Wave partners and customers Cineca, Denso, Forschungszentrum Jülich, Kyocera, MDR, Menten AI, NEC, OTI Lumionics, QAR Lab at LMU Munich, Sigma-i, Tohoku University, and Volkswagen are also offering to help. They will provide access to their engineering teams with expertise on how to use quantum computers, formulate problems, and develop solutions.

Quantum computing leverages qubits to perform computations that would be much more difficult, or simply not feasible, for a classical computer. Based in Burnaby, Canada, D-Wave was the first company to sell commercial quantum computers, which are built to use quantum annealing. D-Wave says the move to make access free is a response to a cross-industry request from the Canadian government for solutions to the COVID-19 pandemic. Free and unlimited commercial contract-level access to D-Wave's quantum computers is available in 35 countries across North America, Europe, and Asia via Leap, the company's quantum cloud service. Just last month, D-Wave debuted Leap 2, which includes a hybrid solver service and solves problems of up to 10,000 variables.

D-Wave and its partners are hoping the free access to quantum processing resources and quantum expertise will help uncover solutions to the COVID-19 crisis. We asked the company whether there were any specific use cases it expects to bear fruit. D-Wave listed analyzing new methods of diagnosis, modeling the spread of the virus, supply distribution, and pharmaceutical combinations. D-Wave CEO Alan Baratz added a few more to the list.

"The D-Wave system, by design, is particularly well-suited to solve a broad range of optimization problems, some of which could be relevant in the context of the COVID-19 pandemic," Baratz told VentureBeat. "Potential applications that could benefit from hybrid quantum/classical computing include drug discovery and interactions, epidemiological modeling, hospital logistics optimization, medical device and supply manufacturing optimization, and beyond."
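The optimization problems an annealer samples are typically cast as QUBOs: minimize a quadratic objective over binary variables. As a rough sketch of that problem class (a hypothetical 3-variable instance solved by classical brute force; a real workload would instead be submitted through D-Wave's Leap cloud service):

```python
from itertools import product

# Hypothetical QUBO: minimize sum of Q[(i, j)] * x[i] * x[j] over binary x.
Q = {
    (0, 0): -1, (1, 1): -1, (2, 2): -1,  # linear terms (diagonal)
    (0, 1): 2, (1, 2): 2,                # quadratic couplings (penalties)
}

def energy(x):
    """QUBO objective value for a binary assignment x."""
    return sum(coeff * x[i] * x[j] for (i, j), coeff in Q.items())

# Brute force is fine for 3 variables (8 assignments); annealers target
# instances far too large to enumerate.
best = min(product([0, 1], repeat=3), key=energy)
print(best, energy(best))  # (1, 0, 1) -2: set variables 0 and 2, skip the coupled pair
```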

Earlier this month, Murray Thom, D-Wave's VP of software and cloud services, told us quantum computing and machine learning are "extremely well matched." In today's press release, Prof. Dr. Kristel Michielsen from the Jülich Supercomputing Centre seemed to suggest a similar notion: "To make efficient use of D-Wave's optimization and AI capabilities, we are integrating the system into our modular HPC environment."


Q-CTRL to Host Live Demos of ‘Quantum Control’ Tools – Quantaneo, the Quantum Computing Source

Q-CTRL, a startup that applies the principles of control engineering to accelerate the development of the first useful quantum computers, will host a series of online demonstrations of new quantum control tools designed to enhance the efficiency and stability of quantum computing hardware.

Dr. Michael Hush, Head of Quantum Science and Engineering at Q-CTRL, will provide an overview of the company's cloud-based quantum control engineering software, called BOULDER OPAL. This software uses custom machine learning algorithms to create error-robust logical operations in quantum computers. The team will demonstrate, using real quantum computing hardware in real time, how they reduce susceptibility to error by 100X, improve hardware stability over time by 10X, and reduce time-to-solution by 10X against existing software.

Scheduled to accommodate the global quantum computing research base, the demonstrations will take place:

April 16 from 4-4:30 p.m. U.S. Eastern Time (ET)

April 21 from 10-10:30 a.m. Singapore Time (SGT)

April 23 from 10-10:30 a.m. Central European Summer Time (CEST)

To register, visit https://go.q-ctrl.com/l/791783/2020-03-19/dk83

Released in beta by Q-CTRL in March, BOULDER OPAL is an advanced Python-based toolkit for developers and R&D teams using quantum control in their hardware or theoretical research. Technology-agnostic, and with major computational grunt delivered seamlessly via the cloud, BOULDER OPAL enables a range of essential tasks that improve the performance of quantum computing and quantum sensing hardware. These include efficiently identifying sources of noise and error, calculating detailed error budgets in real lab environments, creating new error-robust logic operations for even the most complex quantum circuits, and integrating outputs directly into real hardware.

The result for users is greater performance from today's quantum computing hardware, without the need to become an expert in quantum control engineering.

Experimental validations and an overview of the software architecture, developed in collaboration with the University of Sydney, were recently released in an online technical manuscript titled "Software Tools for Quantum Control: Improving Quantum Computer Performance through Noise and Error Suppression."


We’re Getting Closer to the Quantum Internet, But What Is It? – HowStuffWorks


Back in February 2020, scientists from the U.S. Department of Energy's Argonne National Laboratory and the University of Chicago revealed that they had achieved quantum entanglement, in which the behavior of a pair of tiny particles becomes linked so that their states are identical, over a 52-mile (83.7-kilometer) quantum-loop network in the Chicago suburbs.

You may be wondering what all the fuss is about, if you're not a scientist familiar with quantum mechanics, that is, the behavior of matter and energy at the smallest scale of reality, which is peculiarly different from the world we can see around us.

But the researchers' feat could be an important step in the development of a new, vastly more powerful version of the internet in the next few decades. Instead of the bits that today's network uses, which can only express a value of either 0 or 1, the future quantum internet would utilize qubits of quantum information, which can take on an infinite number of values. (A qubit is the unit of information for a quantum computer; it's like a bit in an ordinary computer.)

That would give the quantum internet way more bandwidth, which would make it possible to connect super-powerful quantum computers and other devices and run massive applications that simply aren't possible with the internet we have now.

"A quantum internet will be the platform of a quantum ecosystem, where computers, networks, and sensors exchange information in a fundamentally new manner where sensing, communication, and computing literally work together as one entity," explains David Awschalom via email. He's a spintronics and quantum information professor in the Pritzker School of Molecular Engineering at the University of Chicago and a senior scientist at Argonne, who led the quantum-loop project.

So why do we need this, and what does it do? For starters, the quantum internet is not a replacement for the regular internet we now have. Rather, it would be a complement to it, or a branch of it. It would be able to take care of some of the problems that plague the current internet. For instance, a quantum internet would offer much greater protection from hackers and cybercriminals. Right now, if Alice in New York sends a message to Bob in California over the internet, that message travels in more or less a straight line from one coast to the other. Along the way, the signals that transmit the message degrade; repeaters read the signals, amplify them, and correct the errors. But this process allows hackers to "break in" and intercept the message.

However, a quantum message wouldn't have that problem. Quantum networks use particles of light, photons, to send messages, which are not vulnerable to cyberattacks. Instead of encrypting a message using mathematical complexity, says Ray Newell, a researcher at Los Alamos National Laboratory, we would rely upon the peculiar rules of quantum physics. With quantum information, "you can't copy it or cut it in half, and you can't even look at it without changing it." In fact, just trying to intercept a message destroys it, as Wired magazine noted. That would enable encryption vastly more secure than anything available today.
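The property Newell describes is the heart of quantum key distribution protocols such as BB84: an eavesdropper who measures a photon without knowing its encoding basis disturbs it, and the disturbance shows up as errors. A rough classical simulation of that intuition (the function and the 25% figure below follow the standard textbook model, not any specific deployment):

```python
import random

def bb84_round(eavesdrop=False):
    """One round of a BB84-style exchange, classically simulated."""
    alice_bit = random.randint(0, 1)
    alice_basis = random.choice("+x")
    received = alice_bit
    # An eavesdropper must guess the basis; a wrong guess scrambles the bit.
    if eavesdrop and random.choice("+x") != alice_basis:
        received = random.randint(0, 1)
    bob_basis = random.choice("+x")
    if bob_basis != alice_basis:
        return None  # round discarded when bases are compared publicly
    return alice_bit == received

rounds = [bb84_round(eavesdrop=True) for _ in range(4000)]
kept = [ok for ok in rounds if ok is not None]
error_rate = 1 - sum(kept) / len(kept)
print(round(error_rate, 2))  # roughly 0.25: the eavesdropper leaves a visible trace
```

With no eavesdropper the kept bits agree perfectly; with one, about a quarter of them disagree, which is how Alice and Bob detect the intrusion.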

"The easiest way to understand the concept of the quantum internet is through the concept of quantum teleportation," Sumeet Khatri, a researcher at Louisiana State University in Baton Rouge, says in an email. He and colleagues have written a paper about the feasibility of a space-based quantum internet, in which satellites would continually broadcast entangled photons down to Earth's surface, as this Technology Review article describes.

"Quantum teleportation is unlike what a non-scientist's mind might conjure up in terms of what they see in sci-fi movies," Khatri says. "In quantum teleportation, two people who want to communicate share a pair of quantum particles that are entangled. Then, through a sequence of operations, the sender can send any quantum information to the receiver (although it can't be done faster than light speed, a common misconception). This collection of shared entanglement between pairs of people all over the world essentially constitutes the quantum internet. The central research question is how best to distribute these entangled pairs to people distributed all over the world."

Once it's possible to do that on a large scale, the quantum internet would be so astonishingly fast that far-flung clocks could be synchronized about a thousand times more precisely than the best atomic clocks available today, as Cosmos magazine details. That would make GPS navigation vastly more precise than it is today, and map Earth's gravitational field in such detail that scientists could spot the ripple of gravitational waves. It also could make it possible to teleport photons from distant visible-light telescopes all over Earth and link them into a giant virtual observatory.

"You could potentially see planets around other stars," says Nicholas Peters, group leader of the Quantum Information Science Group at Oak Ridge National Laboratory.

It also would be possible for networks of super-powerful quantum computers across the globe to work together and create incredibly complex simulations. That might enable researchers to better understand the behavior of molecules and proteins, for example, and to develop and test new medications.

It also might help physicists to solve some of the longstanding mysteries of reality. "We don't have a complete picture of how the universe works," says Newell. "We have a very good understanding of how quantum mechanics works, but not a very clear picture of the implications. The picture is blurry where quantum mechanics intersects with our lived experience."

But before any of that can happen, researchers have to figure out how to build a quantum internet, and given the weirdness of quantum mechanics, that's not going to be easy. "In the classical world you can encode information and save it and it doesn't decay," Peters says. "In the quantum world, you encode information and it starts to decay almost immediately."

Another problem is that because the amount of energy that corresponds to quantum information is really low, it's difficult to keep it from interacting with the outside world. Today, "in many cases, quantum systems only work at very low temperatures," Newell says. "Another alternative is to work in a vacuum and pump all the air out."

In order to make a quantum internet function, Newell says, we'll need all sorts of hardware that hasn't been developed yet. So it's hard to say at this point exactly when a quantum internet would be up and running, though one Chinese scientist has envisioned that it could happen as soon as 2030.


Devs: Alex Garland on Tech Company Cults, Quantum Computing, and Determinism – Den of Geek UK

Yet that difference between the common things a company can sell and the uncommon things it quietly develops is profoundly important. In Devs, the friendly exterior of Amaya, with its enormous statue of a child (a literal monument to Forest's lost daughter), is a public face for the profound work his Devs team is doing in a separate, highly secretive facility. Seemingly based in part on the mysterious research and development wings of tech giants (think Google's moonshot organizations at X Development and DeepMind), Devs is using quantum computing to change the world, all while keeping Forest's Zen ambition as its shield.

"I think it helps, actually," Garland says about Forest not being a genius. "Because I think what happens is that these [CEO] guys present as a kind of front between what the company is doing and the rest of the world, including the kind of inspection that the rest of the world might want on the company if they knew what the company was doing. So our belief and enthusiasm in the leader stops us from looking too hard at what the people behind the scenes are doing. And from my point of view that's quite common."

A lifelong man of words, Garland describes himself as a writer with a layman's interest in science. Yet it's fair to say he studies almost obsessively whatever field of science he's writing about, which now pertains to quantum computing. A still largely unexplored frontier in the tech world, quantum computing is the use of technology to apply quantum-mechanical phenomena to data a traditional computer could never process. It's still so unknown that Google AI and NASA published a paper only six months ago in which they claimed to have achieved quantum supremacy (the creation of a quantum device that can actually solve problems a classical computer cannot).

"Whereas binary computers work with gates that are either a one or a zero, a quantum qubit [a basic unit of measurement] can deal with a one and a zero concurrently, and all points in between," says Garland. "So you get a staggering amount of exponential power as you start to run those qubits in tandem with each other." What the filmmaker is especially fascinated by is using a quantum system to model another quantum system. That is to say, using a quantum computer with true supremacy to solve other theoretical problems in quantum physics. "If we use a binary way of doing that, you're essentially using a filing system to model something that is emphatically not binary."

So in Devs, quantum computing is a gateway into a hell of a trippy concept: a quantum computer so powerful that it can analyze the theoretical data of everything that has occurred or will occur. In essence, Forest and his team are creating a time machine that can project, through a probabilistic system, how events happened in the past, will happen in the future, and are happening right now. It thus acts as an omnipotent surveillance system far beyond any neocon's dreams.


Who Will Mine Cryptocurrency in the Future – Quantum Computers or the Human Body? – Coin Idol

Apr 01, 2020 at 09:31 // News

Companies including Microsoft, IBM, and Google race to come up with cheap and effective mining solutions to improve cost and energy efficiency. A lot of fuss has been made around quantum computing and its potential for mining. Now the time has come for a new solution: mining with the help of human body activity.

While quantum computers are said to be able to hack bitcoin mining algorithms, using physical activity for the process is quite a new and extraordinary thing. The question is, which technology turns out to be more efficient?

Currently, with traditional cryptocurrency mining methods, the reward for mining a bitcoin block is around 12.5 bitcoins; at $4k per BTC, mining equipment should quickly pay for itself after mining a few blocks.

Consequently, the best mining method at present is to keep trying random numbers and observe which one hashes to a number that isn't more than the target difficulty. This is one of the reasons mining pools have arisen, where multiple PCs work in parallel to look for the proper solution to the problem; if one of the PCs finds the solution, the pool receives the reward, which is then shared among all the miners.
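That "keep trying numbers until the hash is below the target" loop is easy to sketch. A toy proof-of-work miner (illustrative only; real Bitcoin mining uses double SHA-256 over a binary block header, and the header string here is made up):

```python
import hashlib

def mine(block_header: str, difficulty_bits: int) -> int:
    """Try successive nonces until the SHA-256 hash falls below the target."""
    target = 2 ** (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_header}{nonce}".encode()).hexdigest()
        if int(digest, 16) < target:
            return nonce
        nonce += 1

nonce = mine("example block", 16)  # tiny difficulty so this finishes in a moment
digest = hashlib.sha256(f"example block{nonce}".encode()).hexdigest()
print(digest[:4])  # 0000: the first 16 bits are zero, meeting the target
```

Each extra difficulty bit doubles the expected number of attempts, which is why miners pool machines and why a large quantum speedup for this search would matter.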

Quantum computers possess more capacity and might potentially be able to significantly speed up mining while eliminating the need for numerous machines. Thus, it can improve both energy efficiency and the speed of mining.

In late 2019, Google released a quantum processor called Sycamore, many times faster than the existing supercomputer. There was even a post on Medium claiming that this new processor would be able to mine all remaining bitcoins in something like two seconds. Some time later the post was deleted due to an error in the calculations, according to the Bitcoinist news outlet.

Despite quantum computing having the potential to increase the efficiency of mining, its cost is close to stratospheric. It would probably take time before someone is able to afford it.

Meanwhile, another global tech giant, Microsoft, offers a completely new and extraordinary solution: mining cryptocurrencies using a person's brain waves or body temperature. As coinidol.com, a world blockchain news outlet, has reported, the company has filed a patent for a groundbreaking system that can mine digital currencies using data collected from human beings as they view ads or do exercises.

The IT giant disclosed that sensors could identify and diagnose any activity connected with the particular piece(s) of work, such as the time taken to read advertisements, and convert it into digital information readable by a computing device to perform computation, in the same manner as a conventional proof-of-work (PoW) system. Some tasks would either decrease or increase computational energy appropriately, based on the amount of information produced from the user's activity.

So far, there is no signal as to when Microsoft will start developing the system, and it is still uncertain whether this system will be built on its own blockchain network. Quantum computing also needs time to be fully developed and deployed.

However, both solutions bear significant potential for transforming the entire mining industry. While quantum computing could boost the existing mining mechanism, eliminating energy-hungry mining operations, Microsoft's new initiative could disrupt the industry into something that looks entirely different.

Which of these two solutions turns out to be more viable? We will see over time. What do you think about these mining solutions? Let us know in the comments below!


The Schizophrenic World Of Quantum Interpretations – Forbes

Quantum Interpretations

To the average person, most quantum theories sound strange, while others seem downright bizarre. There are many diverse theories that try to explain the intricacies of quantum systems and how our interactions affect them. And, not surprisingly, each approach is supported by its own group of well-qualified and well-respected scientists. Here, we'll take a look at the two most popular quantum interpretations.

Does it seem reasonable that you can alter a quantum system just by looking at it? What about creating multiple universes by merely making a decision? Or what if your mind split because you measured a quantum system?

You might be surprised that all or some of these things might routinely happen millions of times every day without you even realizing it.

But before your brain gets twisted into a knot, let's cover a little history and a few quantum basics.

The birth of quantum mechanics

Classical physics describes how large objects behave and how they interact with the physical world. On the other hand, quantum theory is all about the extraordinary and inexplicable interactions of small particles on the invisible scale of such things as atoms, electrons, and photons.

Max Planck, a German theoretical physicist, first introduced quantum theory in 1900, an innovation that won him the Nobel Prize in Physics in 1918. Between 1925 and 1930, several scientists worked to clarify and understand quantum theory. Among them were Werner Heisenberg and Erwin Schrödinger, both of whom mathematically expanded quantum mechanics to accommodate experimental findings that couldn't be explained by standard physics.

Heisenberg, along with Max Born and Pascual Jordan, created a formulation of quantum mechanics called matrix mechanics, which interpreted the physical properties of particles as matrices that evolve in time. A few months later, Erwin Schrödinger created his famous wave mechanics.

Although Heisenberg and Schrödinger worked independently of each other, and although their theories were very different in presentation, the two formulations were mathematically equivalent. Of the two, Schrödinger's was more popular than Heisenberg's because it boiled down to familiar differential equations.

While today's physicists still use these formulations, they continue to debate their actual meaning.

First weirdness

A good place to start is Schrödinger's equation.

Erwin Schrödinger's equation provides a mathematical description of all possible locations and characteristics of a quantum system as it changes over time. This description is called the system's wave function. According to the most common quantum theory, everything has a wave function. The quantum system could be a particle, such as an electron or a photon, or even something larger.

Schrödinger's equation won't tell you the exact location of a particle. It only reveals the probability of finding the particle at a given location. The condition of a particle being in many places, or in many states, at the same time is called superposition. Superposition is one of the elements of quantum computing that makes it so powerful.
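The rule connecting the wave function to those probabilities is the Born rule: the probability of each outcome is the squared magnitude of that outcome's complex amplitude. A minimal sketch, using a hypothetical single-qubit state chosen for illustration:

```python
import math

# Hypothetical state a|0> + b|1>, deliberately unnormalized.
a, b = complex(1, 1), complex(0, 1)
norm = math.sqrt(abs(a) ** 2 + abs(b) ** 2)
p0 = abs(a / norm) ** 2  # Born rule: probability of measuring 0
p1 = abs(b / norm) ** 2  # Born rule: probability of measuring 1
print(round(p0, 4), round(p1, 4))  # 0.6667 0.3333 - and p0 + p1 == 1
```

Note that both the real and imaginary parts of an amplitude contribute to its probability, which is why complex coefficients matter.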

Almost everyone has heard about Schrödinger's cat in a box. Simplistically, ignoring the radiation gadgets: while the cat is in the closed box, it is in a superposition of being both dead and alive at the same time. Opening the box causes the cat's wave function to collapse into one of two states, and you'll find the cat either alive or dead.

There is little dispute among the quantum community that Schrödinger's equation accurately reflects how a quantum wave function evolves. However, the wave function itself, as well as the cause and consequences of its collapse, are all subjects of debate.

David Deutsch is a brilliant British quantum physicist at the University of Oxford. In his book The Fabric of Reality, he said: "Being able to predict things or to describe them, however accurately, is not at all the same thing as understanding them. Facts cannot be understood just by being summarized in a formula, any more than by being listed on paper or committed to memory."

The Copenhagen interpretation

Quantum theories use the term "interpretation" for two reasons. One, it is not always obvious what a particular theory means without some form of translation. And, two, we are not sure we understand what goes on between a wave function's starting point and where it ends up.

There are many quantum interpretations. The most popular is the Copenhagen interpretation, a namesake of the city where Werner Heisenberg and Niels Bohr developed their quantum theory.

Werner Heisenberg (left) with Niels Bohr at a Conference in Copenhagen in 1934.

Bohr believed that the wave function of a quantum system contained all possible quantum states.However, when the system was observed or measured, its wave function collapsed into a single state.

What's unique about the Copenhagen interpretation is that it makes the outside observer responsible for the wave function's ultimate fate. Almost magically, a quantum system, with all its possible states and probabilities, has no connection to the physical world until an observer interacts with or measures the system. The measurement causes the wave function to collapse into one of its many states.

You might wonder what happens to all the other quantum states present in the wave function, as described by the Copenhagen interpretation, before it collapsed. There is no explanation of that mystery in the Copenhagen interpretation. However, there is a quantum interpretation that provides an answer to that question. It's called the Many-Worlds Interpretation, or MWI.

Billions of you?

Because the many-worlds interpretation is one of the strangest quantum theories, it has become central to the plot of many science fiction novels and movies. At one time, MWI was an outlier within the quantum community, but many leading physicists now believe it is the only theory that is consistent with quantum behavior.

The MWI originated in a Princeton doctoral thesis written by a young physicist named Hugh Everett in the late 1950s. Even though Everett derived his theory using sound quantum fundamentals, it was severely criticized and ridiculed by most of the quantum community. Even Everett's academic adviser at Princeton, John Wheeler, tried to distance himself from his student. Everett became despondent over the harsh criticism and eventually left quantum research to work for the government as a mathematician.

The theory proposes that the universe has a single, large wave function that follows Schrödinger's equation. Unlike in the Copenhagen interpretation, the MWI's universal wave function doesn't collapse.

Everything in the universe is quantum, including ourselves. As we interact with parts of the universe, we become entangled with them. As the universal wave function evolves, some of our superposition states decohere. When that happens, our reality becomes separated from the other possible outcomes associated with that event. Just to be clear, the universe doesn't split and create a new universe. The probability of all realities, or universes, already exists in the universal wave function, all occupying the same space-time.

Schrödinger's cat, many-worlds interpretation, with universe branching: a visualization of the separation of the universe due to two superposed and entangled quantum mechanical states.

In the Copenhagen interpretation, by opening the box containing Schrödinger's cat, you cause the wave function to collapse into one of its possible states, either alive or dead.

In the Many-Worlds interpretation, the wave function doesn't collapse. Instead, all probabilities are realized. In one universe you see the cat alive, and in another universe the cat is dead.

Right or wrong decisions become right and wrong decisions

Decisions are also events that trigger the separation of multiple universes. We make thousands of big and little choices every day. Have you ever wondered what your life would be like had you made different decisions over the years?

According to the Many-Worlds interpretation, you and all those unrealized decisions exist in different universes, because all possible outcomes exist in the universal wave function. For every decision you make, at least two versions of "you" evolve on the other side of that decision: one universe for the choice you made, and one for the choice you didn't make.

If the Many-Worlds interpretation is correct, then right now near-infinite versions of you are living different and independent lives in their own universes. Moreover, all of those universes overlay each other, occupying the same space and time.

It is also likely that you are currently living in a branch universe spun off from a decision made by a previous version of yourself, perhaps millions or billions of iterations ago. You have all the old memories of your pre-decision self, but as you move forward in your own universe, you live independently and create your own new memories.

A Reality Check

Which interpretation is correct? Copenhagen or Many-Worlds? Maybe neither. But because quantum mechanics is so strange, perhaps both are correct. It is also possible that a valid interpretation is yet to be expressed. In the end, correct or not, quantum interpretations are just plain fun to think about.

Note: Moor Insights & Strategy writers and editors may have contributed to this article.

Disclosure: Moor Insights & Strategy, like all research and analyst firms, provides or has provided paid research, analysis, advising, or consulting to many high-tech companies in the industry, including Amazon.com, Advanced Micro Devices, Apstra, ARM Holdings, Aruba Networks, AWS, A-10 Strategies, Bitfusion, Cisco Systems, Dell, Dell EMC, Dell Technologies, Diablo Technologies, Digital Optics, Dreamchain, Echelon, Ericsson, Foxconn, Frame, Fujitsu, Gen Z Consortium, Glue Networks, GlobalFoundries, Google, HP Inc., Hewlett Packard Enterprise, Huawei Technologies, IBM, Intel, Interdigital, Jabil Circuit, Konica Minolta, Lattice Semiconductor, Lenovo, Linux Foundation, MACOM (Applied Micro), MapBox, Mavenir, Mesosphere, Microsoft, National Instruments, NetApp, NOKIA, Nortek, NVIDIA, ON Semiconductor, ONUG, OpenStack Foundation, Panasas, Peraso, Pixelworks, Plume Design, Portworx, Pure Storage, Qualcomm, Rackspace, Rambus, Rayvolt E-Bikes, Red Hat, Samsung Electronics, Silver Peak, SONY, Springpath, Sprint, Stratus Technologies, Symantec, Synaptics, Syniverse, Tenstorrent, Tobii Technology, Twitter, Unity Technologies, Verizon Communications, Vidyo, Wave Computing, Wellsmith, Xilinx, Zebra, which may be cited in this article.

Originally posted here:

The Schizophrenic World Of Quantum Interpretations - Forbes

Disrupt The Datacenter With Orchestration – The Next Platform

Since 1965, the computer industry has relied on Moore's Law to accelerate innovation, pushing more transistors into integrated circuits to improve computation performance. Making transistors smaller helped lift all boats for the entire industry and enabled new applications. At some point, we will reach a physical limit, that is, a limit stemming from physics itself. Even with this setback looming, improvements have kept pace thanks to increased parallelism of computation and the consolidation of specialized functions into single chip packages (such as systems on chip).

In recent years, we have been nearing another peak. This article proposes to improve computation performance not only by building better hardware, but by changing how we use existing hardware, and more specifically how we use existing processor types. I call this approach Compute Orchestration: automatic optimization of machine code to best use modern datacenter hardware (again, with special emphasis on different processor types).

So what is compute orchestration? It is the embrace of hardware diversity to support software.

There are many types of processors: Microprocessors in small devices, general purpose CPUs in computers and servers, GPUs for graphics and compute, and programmable hardware like FPGAs. In recent years, specialized processors like TPUs and neuromorphic processors for machine learning are rapidly entering the datacenter.

There is potential in this variety: instead of statically assigning each processor pre-defined functions, we can use existing processors as a swarm, with each processor working on the workloads it suits best. In doing so, we can potentially deliver more computation bandwidth with less power, lower latency and lower total cost of ownership.
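To make the swarm idea concrete, here is a minimal sketch of score-based allocation. The processor types, workload classes, and suitability scores are all illustrative assumptions, not a real scheduler:

```python
# Each processor type advertises how well it suits a workload class
# (higher = better fit); the orchestrator assigns every workload to
# the best-scoring processor. Scores here are made-up examples.
SUITABILITY = {
    "cpu":  {"branchy_logic": 0.9, "matrix_math": 0.3, "bit_ops": 0.5},
    "gpu":  {"branchy_logic": 0.2, "matrix_math": 0.9, "bit_ops": 0.4},
    "fpga": {"branchy_logic": 0.4, "matrix_math": 0.5, "bit_ops": 0.9},
}

def allocate(workloads):
    """Map each workload to the processor type with the highest score."""
    plan = {}
    for name, kind in workloads:
        plan[name] = max(SUITABILITY, key=lambda proc: SUITABILITY[proc][kind])
    return plan

plan = allocate([
    ("dnn_training", "matrix_math"),   # best served by the GPU
    ("packet_filter", "bit_ops"),      # best served by the FPGA
    ("web_handler", "branchy_logic"),  # best served by the CPU
])
print(plan)
```

A real orchestrator would of course weigh current load, power and data locality as well, but the core idea is the same: the allocation decision moves out of the developer's hands and into a scoring policy.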

Non-standard utilization of existing processors is already happening: GPUs, for example, were already adapted from processors dedicated to graphics into a core enterprise component. Today, GPUs are used for machine learning and cryptocurrency mining, for example.

I call the technology that utilizes processors as a swarm Compute Orchestration. Its tenets can be described in four simple bullets:

Compute orchestration is, in short, automatic adaptation of binary code and automatic allocation to the most suitable processor types available. I split the evolution of compute orchestration into four generations:

Compute Orchestration Gen 1: Static Allocation To Specialized Co-Processors

This type of compute orchestration is everywhere. Most devices today include co-processors to offload some specialized work from the CPU. Usually, the toolchain or runtime environment takes care of assigning workloads to the co-processor. This is seamless to the developer, but also limited in functionality.

The best-known example is the use of cryptographic co-processors for relevant functions. Being liberal in our definition of co-processor, memory management units (MMUs), which handle virtual memory address translation, can also be considered an example.

Compute Orchestration Gen 2: Static Allocation, Heterogeneous Hardware

This is where we are now. In the second generation, the software relies on libraries, dedicated runtime environments and VMs to best use the available hardware. Let's call this collection of components that help better use the hardware "frameworks". Current frameworks implement specific code to better use specific processors. Most prevalent are frameworks that know how to utilize GPUs in the cloud. Usually, better allocation to bare metal hosts remains the responsibility of the developer. For example, the developer/DevOps engineer needs to make sure a machine with a GPU is available for the relevant microservice. This phenomenon is what brought me to think of Compute Orchestration in the first place, as it proves there is more slack in our current hardware.

Common frameworks like OpenCL allow programming compute kernels to run on different processors. TensorFlow allows assigning nodes in a computation graph to different processors (devices).

This better use of hardware through existing frameworks is great. However, I believe there is a bigger edge to be gained. Existing frameworks still require effort from the developer to be optimal; they rely on the developer. Also, no legacy code from 2016 (for example) is ever going to utilize a modern datacenter GPU cluster. My view is that by developing automated and dynamic frameworks that adapt to the hardware and workload, we can achieve another leap.

Compute Orchestration Gen 3: Dynamic Allocation To Heterogeneous Hardware

Computation can take an example from the storage industry: products for better utilization and reliability of storage hardware have been innovating for years. Storage startups develop abstraction layers and special filesystems that improve the efficiency and reliability of existing storage hardware. Computation, on the other hand, remains a stupid allocation of hardware resources. Smart allocation of computation workloads to hardware could result in better performance and efficiency for big data centers (for example, hyperscalers such as cloud providers). The infrastructure for such allocation is here, with current data center designs pushing toward more resource disaggregation, the introduction of diverse accelerators, and increased work on automatic acceleration (for example: Workload-aware Automatic Parallelization for Multi-GPU DNN Training).

For high-level resource management, we already have automatic allocation: for example, project Mesos, which focuses on fine-grained resource sharing; Slurm, for cluster management; and several extensions using Kubernetes operators.

To further advance from here would require two steps: automatic mapping of available processors (which we call the compute environment) and workload adaptation. Imagine a situation where the developer doesn't have to optimize her code to the hardware. Rather, the runtime environment identifies the available processing hardware and automatically optimizes the code. Cloud environments are heterogeneous and changing, and the code should change accordingly (in fact it's not the code, but the execution model in the runtime environment of the machine code).
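The two steps above can be sketched in a toy form. Everything here (the probed device counts, the workload kinds, the dispatch rules) is a simplifying assumption to illustrate the idea, not a real discovery or adaptation mechanism:

```python
import os

def discover_environment():
    """Step 1: map the available processors (the 'compute environment')."""
    env = {"cpu_cores": os.cpu_count()}
    # Real discovery would probe drivers and the PCIe bus for GPUs,
    # FPGAs and TPUs; here we just simulate what was found.
    env["gpus"] = 0    # assumption: no GPU detected on this host
    env["fpgas"] = 1   # assumption: one reconfigurable device present
    return env

def choose_target(workload_kind, env):
    """Step 2: adapt the workload to whatever the environment offers."""
    if workload_kind == "matrix_math" and env["gpus"] > 0:
        return "gpu"
    if workload_kind == "bit_ops" and env["fpgas"] > 0:
        return "fpga"
    return "cpu"  # portable fallback: every environment has a CPU

env = discover_environment()
print(choose_target("matrix_math", env))  # no GPU found, falls back to cpu
print(choose_target("bit_ops", env))      # fpga
```

The point of the sketch is that the same binary logic runs unchanged whether or not a GPU is present; the runtime, not the developer, absorbs the difference between environments.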

Compute Orchestration Gen 4: Automatic Allocation To Dynamic Hardware

"A thought, even a possibility, can shatter and transform us." (Friedrich Wilhelm Nietzsche)

The quote above is to say that we are far from a practical implementation of the concept described here (as far as I know). We can, however, imagine a technology that dynamically re-designs a data center to serve the needs of running applications. This change in the way whole data centers meet computation needs has already started. FPGAs are used more often and appear in new places (FPGAs in hosts, FPGA machines in AWS, SmartNICs), providing the framework for constant reconfiguration of hardware.

To illustrate the idea, I will use an example: Microsoft initiated project Catapult, augmenting CPUs with an interconnected and configurable compute layer composed of programmable silicon. The timeline on the project's website is fascinating. The project started in 2010, aiming to improve search queries by using FPGAs. Quickly, it proposed the use of FPGAs as "bumps in the wire", adding computation in new areas of the data path. Project Catapult also designed an architecture for using FPGAs as a distributed resource pool serving the entire data center. Then, the project spun off Project BrainWave, utilizing FPGAs to accelerate AI/ML workloads.

This was just one example of innovation in how we compute. A quick online search will bring up several academic works on the topic. All we need to reach the fourth generation is some idea synthesis, combining a few concepts together:

Low-effort HDL generation (for example, the Merlin compiler or BORPH)

In essence, what I am proposing is to optimize computation by adding an abstraction layer that:

Automatic allocation on agile hardware is the recipe for best utilizing existing resources: faster, greener, cheaper.

The trends and ideas mentioned in this article can lead to many places. It is unlikely that we are already working with existing hardware in the optimal way; it is my belief that we are in the midst of the improvement curve. In recent years, we have had increased innovation in basic hardware building blocks, new processors for example, but we still have room to improve in overall allocation and utilization. The more we deploy new processors in the field, the more slack we have in our hardware stack. New concepts, like edge computing and resource disaggregation, bring new opportunities for optimizing legacy code by smarter execution. To achieve that, legacy code can't be expected to be refactored. Developers and DevOps engineers can't be expected to optimize for the cloud configuration. We just need to execute code in a smarter way, and that is the essence of compute orchestration.

The conceptual framework described in this article should be further explored. We first need to find the killer app (what type of software we optimize for which type of hardware). From there, we can generalize. I was recently asked in a round table: what is the next generation of computation? Quantum computing? Tensor Processing Units? I responded: all of the above, but what we really need is better usage of the existing generation.

Guy Harpak is the head of technology at Mercedes-Benz Research & Development in its Tel Aviv, Israel facility. Please feel free to contact him with any thoughts on the topics above at harpakguy@gmail.com. Harpak notes that this contributed article reflects his personal opinion and is in no way related to people or companies that he works with or for.

Related Reading: If you find this article interesting, I would recommend researching the following topics:

Some interesting articles on similar topics:

Return Of The Runtimes: Rethinking The Language Runtime System For The Cloud 3.0 Era

The Deep Learning Revolution And Its Implications For Computer Architecture And Chip Design (by Jeffrey Dean from Google Research)

Beyond SmartNICs: Towards A Fully Programmable Cloud

Hyperscale Cloud: Reimagining Datacenters From Hardware To Applications

Read more:

Disrupt The Datacenter With Orchestration - The Next Platform

1000 Words or So About The New QuantumAI Scam – TechTheLead

To most of us, Elon Musk is the real-life embodiment of Tony Stark. He started from nothing, more or less, and is now a leader in some corners of the tech world. He started small, with an archive of newspapers and magazines, and in no time he was in space. SpaceX, that is.

So what is quantum computing? In a nutshell: if you networked all the PCs on the planet right now and put them to work, the resulting power would still not be sufficient to run the complex calculations that a quantum computer can perform.

Now, Elon has supposedly decided to withdraw from operating Tesla and SpaceX and move on to the next big chapter in his life: quantum computing, a venture that has seen investments of over 2 billion dollars in recent years.

On the other hand, Elon never announced anything on Twitter, and that made us wonder. He left SpaceX and Tesla for this? It must be a scam! And it was: one that wants your data and personal info.

On top of that, if some untrusted sources are to be believed, the project is LIVE right now, beating companies like Microsoft and IBM to the punch and delivering QuantumAI. Or, as "Elon" puts it: "A new way of redistributing the world's wealth."

The scammers strongly believe that the 1% that controls 90% of the world's financial capital can share and help normal people grow their wealth by using quantum computing. This theory has been thrown around even in the times of Moore, and it now supposedly seems to be a reality for everybody on Earth.

This time the greedy scammers have raised the bar. The scam group used real footage of Elon Musk talking about his companies, but they superimposed different audio, making sure to steer people to the fake QuantumAI investment platform and automated trading app.

The group responsible for masterminding this charade is part of a bigger affiliate network that specializes in social media advertising on platforms like Facebook and Twitter. These networks operate in cooperation with rogue offshore brokers who pay referral money for investing clients. You are the investing client in this case!

And the rabbit hole goes deeper. When you sign in, you are signed up with a broker, in this case Crypto Kartal, owned by Elmond Enterprise Ltd, a company located in St. Vincent and the Grenadines that also has an office in Estonia, where it is named Fukazawa Partnership OU. The QuantumAI scam is particularly shoddy because it combines two highly effective baiting systems: social media and video manipulation. And Facebook and Twitter are disseminating the message right now in some regions.

According to the scam, this iteration of QuantumAI hopes to make people two to three times wealthier, and no one, except the super-wealthy, will take a hit.

How do they do that? Well, the process is simpler than you can imagine. The wealthy keep their investments in bonds and stocks that they trade for a profit on the open market.

Here is the part where QuantumAI claims to make a power move that can affect the super-rich. The scam promises to beat Wall Street traders to the market, making winning trades before the brokers can react or intercept transactions. And with a quantum computer, you could do that. Well, as long as you have a working quantum computer, that is!

Sounds super interesting, right? But it's a scam! All you need to do is run a reverse image search on the pictures on the website and try to find out whether the brokers behind this enterprise have had any scam reports or alerts in the past. Doubt everything and verify everything before you input any of your data, and above all, NEVER use your main email and password. It's safer to create a fresh one, just to be safe.

Be careful in these times. The QuantumAI scam software, app, and fraudulent crypto trading platform attributed to Elon Musk is completely blacklisted. But Facebook runs the ads with no remorse, the scammers switching between fake Guardian and CNN articles. So be aware!

See the article here:

1000 Words or So About The New QuantumAI Scam - TechTheLead

Quantum Computing strikes technology partnership with Splunk – Proactive Investors USA & Canada

Initial efforts with San Franciscos Splunk will focus on three key challenges: network security, dynamic logistics and scheduling

Quantum Computing Inc (OTC:QUBT), an advanced technology company developing quantum-ready applications and tools, said Tuesday that it has struck a technology alliance partnership with Splunk.

San Francisco, California-based Splunk creates software for searching, monitoring, and analyzing machine-generated big data via a web-style interface.

Meanwhile, staffed by experts in mathematics, quantum physics, supercomputing, finance and cryptography, Leesburg, Virginia-based Quantum Computing is developing an array of applications to allow companies to exploit the power of quantum computing to their advantage. It is a leader in the development of quantum-ready software, with deep experience developing applications and tools for early quantum computers.

Splunk brings a leading big-data-analytics platform to the partnership, notably existing capabilities in its Machine/Deep Learning Toolkit in current use by Splunk customers, said the company.

Implementation of quantum computing applications will be significantly accelerated by tools that allow the development and execution of applications independent of any particular quantum computing architecture.

"We are excited about this partnership opportunity," said Quantum Computing CEO Robert Liscouski. "Splunk is a proven technology leader with over 17,500 customers worldwide that has the potential to provide great opportunities for QCI's quantum-ready software technologies."

The two companies will partner on fundamental and applied research and will develop analytics that exploit conventional large-data cybersecurity stores and data-analytics workflows, combined with quantum-ready graph and constrained-optimization algorithms.

The company explained that these algorithms will initially be developed using Quantum Computing's Mukai software platform, which enables quantum-ready algorithms to execute on classical hardware and to run without modification on quantum computing hardware when it is ready.

Once proofs of concept are completed, QCI and Splunk will develop new analytics with these algorithms in the Splunk data-analytics platform, to evaluate quantum analytics readiness on real-world data, noted the company.

The Splunk platform/toolkits help customers address challenging analytical problems via neural nets or custom algorithms, extensible to Deep Learning frameworks through an open source approach that incorporates existing and custom libraries.

The initial efforts of our partnership with Splunk will focus on three key challenges: network security, dynamic logistics and scheduling, said Quantum Computing.

Contact the author Uttara Choudhury at [emailprotected]

Follow her on Twitter: @UttaraProactive

Read more:

Quantum Computing strikes technology partnership with Splunk - Proactive Investors USA & Canada

Faster, better, stronger: The next stage of global communications networks – Siliconrepublic.com

Prof Bogdan Staszewski from UCDs IoE2 Lab looks at the future of global communications, from faster networks and more powerful computing to the challenges of energy and cybersecurity.

I am an engineer, and an engineer's job is to design new solutions for building and making things. We engineers concern ourselves with what goes on below the surface, with the building blocks that make up the world in which we live and work, a world which is constantly evolving.

As electrical and electronics engineers, my colleagues and I work in a microscopic world of integrated circuits the hardware at the deepest level of the networks with which we interact every day and on which we have come to rely.

Life today revolves around these networks. From global communications to the movement of money, we rely on the fast and secure transmission of quintillions of bits of data every day. And as technological and economic progress is made, there are ever more demands for capacity in these networks, and for ever greater speed, efficiency and security.

The possibilities created by increased connectedness have led to simple but profound challenges. In network terms, chief among them is how to send the greatest amount of data in the shortest time while reducing power requirements and cost.

Internet of things (IoT) networks are helping to address major societal challenges. Water regulation in agriculture in drought regions such as California, and dyke and canal infrastructure management in the Netherlands, are just two examples. These systems, underpinned by networks of sensors and microprocessors capable of wireless connectivity and energy scavenging, have vastly improved efficiency and delivered numerous benefits.

We are looking to even more advanced applications of these technologies, such as autonomously driven vehicles and robotic surgery. We are designing technology that could either completely replace humans or watch and take over when the driver or surgeon gets too tired or distracted.

We are envisaging vehicles that can communicate among themselves and a traffic coordinator to ensure smooth traffic flow with no need for traffic lights. We are preparing for autonomous operating rooms where robotic surgeons can be directed remotely by human surgeons in another country.

This is technology that could deliver superior and safer performance than error-prone human operation, but which is entirely dependent on unimpeachable network speed, efficiency and security that has not yet been achieved.

It is predicted that connected autonomously driven vehicles will eliminate traffic and accidents. We can imagine insurance premiums going down substantially. Of course, we can also imagine an utter disaster if a hacker was able to sneak into these networks, or if an uplink failed at the wrong moment while crucial information was being transmitted.

Hence, the network must be super fast, super secure and have enough bandwidth.

The view from the core of this technology offers a unique perspective on these challenges. Like physicists and geneticists, electrical and electronics engineers look for answers in ever smaller parts inside our networks.

Energy supply and consumption is at the heart of big societal challenges and so too is it one of the most critical considerations for IoT applications. My colleagues and I in the IoE2 Lab at University College Dublin are currently tackling this problem using the latest nanoscale CMOS (complementary metal oxide semiconductor) technologies, in pursuit of a common ultra-low-power system-on-chip hardware platform.

This means an integrated computer and electronics system containing a CPU, memory, and digital, analog, mixed-signal and radio frequency signal processing functions all on one microchip.

Prof Bogdan Staszewski. Image: UCD

As an aside, there's a lot of interest in radio frequency integrated circuit (RFIC) research now because it offers a huge cost benefit for system-on-chip solutions, and this will only grow along with the pervasiveness of wireless capabilities in electronics.

Success in this research knows no pinnacle; it is constantly evolving. We started with 1G and 2G wireless communication. Then came 3G and 4G. Nowadays the carriers are installing 5G networks, but researchers are already working on 6G, even though there is no agreement yet about what it will be. That's the journey that keeps us all excited.

The focus will remain on reducing power consumption and increasing performance, so that we can move towards IoT network applications that can perform more and more complex tasks. Power and capacity are key.

The need to economise power consumption is well understood, for a variety of practical, environmental and socio-economic reasons. Data, however, is a less familiar commodity in our world, in spite of the volume we generate on a daily basis, almost universally. And IoT is also greatly accelerating the demands for bandwidth in our networks, which in turn creates issues around equality of access and the enabling of future technology.

At IoE2, we're looking at the problem of so many wireless devices coexisting in extremely congested networks, and the solution is cooperative wireless.

Cooperative networks are at the foundation of IoT. At the system level, this means algorithms, components and software needed to make them energy and bandwidth-efficient. But at the physical layer beneath, we need hugely flexible nodes that can operate in an intelligent and cooperative manner.

To put this in context, a single ant cannot possibly do anything useful, but a whole colony of ants working in collaboration can physically lift an elephant. Even a simple IoT node can do wonders if connected to a large network.

For instance, Swarm's constellation of nanosatellites has helped harness the potential of IoT networks and their thousands of devices and billions of bits of data. Each nanosatellite is small and rather dumb but, in collaboration with others, they can execute quite sophisticated tasks at a fraction of the cost of existing networks linked to broadband internet satellites.

Of course, enhancing capacity and enabling technology also requires enhanced security, especially as our networks become capable of storing more and more data.

We have found ways to increase security at the sub-system level, by creating tamper-proof ROM (read-only memory) and microchips that cannot be reverse engineered. We make increasingly sophisticated chips and memory that are perfected to be error-free and operable throughout their lifetime without updates or patches.

But the journey to advance and secure our networks has passed beyond the world of microelectronics, into the quantum world a world of the sub-atomically small. It would be fair to say this is the next real game-changer for ICT and will even surpass the invention of the integrated circuit itself.

While quantum computing will probably remain aloof from most people, the technology arising from its development will have major implications for society and for the evolution of communications and future networks.

In simple terms, by exploiting quantum mechanics, a quantum computer takes mere seconds or minutes to crack an algorithm that classical computers would take lifetimes to crack. The power of this technology is transformational. It underpins the only form of communication that is provably unhackable and uninterceptable, heralding a new age of data security.

However, the development of quantum technologies will drive quantum communication and destabilise traditional networks. While only the military and proverbial Swiss banks need these super-secure communications for now, the eventual growing use of quantum computing will render normal encryption virtually useless, creating the need for a global rewrite of our networks' security.

This technology is only a few years away. And even though the major hype of research remains on quantum computing rather than its application in other fields such as communications, its arrival will profoundly change the world as we know it.

Until then, all the possibilities of our future networks will rely on us building upon current technologies to make the communication pipe bigger and cheaper making our networks better, faster, with less power.

By Prof Bogdan Staszewski

Prof Bogdan Staszewski is a professor of electronic circuits at the UCD School of Electrical and Electronic Engineering and Delft University of Technology in the Netherlands. He is part of the IoE2 Lab within the UCD Centre for Internet of Things Engineering and co-founder of Equal1 Labs, conducting research to build the world's first practical single-chip CMOS quantum computer.

View post:

Faster, better, stronger: The next stage of global communications networks - Siliconrepublic.com

Quantum Computing for Everyone – The Startup – Medium

Qubits are exponentially faster than bits in several computing problems, such as database searches and factoring (which, as we will discuss soon, may break your Internet encryption).

An important thing to realize is that qubits can hold much more information than bits can. A single bit holds the same amount of information as a single qubit in the sense that each holds one value. However, four bits are needed to store the same amount of information as two qubits: a two-qubit system in equal superposition holds values for four states, which on a classical computer would require at least four bits. Eight bits are needed to store the same amount of information as three qubits, since a three-qubit system can store eight states: 000, 001, 010, 011, 100, 101, 110, and 111. This pattern continues.
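The counting argument above can be made concrete: an n-qubit register in superposition carries an amplitude for each of 2^n basis states, so a classical description needs 2^n stored values. A short illustration (the helper name is ours):

```python
from itertools import product

def basis_states(n):
    """All 2**n basis states of an n-qubit register, as bit strings."""
    return ["".join(bits) for bits in product("01", repeat=n)]

three = basis_states(3)
print(len(three))  # 8: a classical copy needs 8 stored amplitudes
print(three)       # ['000', '001', '010', '011', '100', '101', '110', '111']
```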

The graph below provides a visual for the computing power of qubits. The x-axis represents the number of qubits used to hold a certain amount of information. The blue line's y-value represents the number of bits needed to hold the same amount of information as x qubits, i.e. 2 to the power of x. The red line's y-value represents the number of qubits needed to hold the same amount of information as x qubits, i.e. y = x.

Imagine the exponential speedup quantum computing can provide! A gigabyte (8×10^9 bits) worth of information can be represented with log2(8×10^9) = 33 qubits (rounded up from 32.9).
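The arithmetic in the previous sentence can be reproduced directly:

```python
import math

bits = 8_000_000_000                  # one gigabyte, expressed in bits
qubits = math.ceil(math.log2(bits))   # smallest n with 2**n >= bits
print(qubits)  # 33, since log2(8e9) is about 32.9 and we round up
```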

Quantum computers are also great at factoring numbers, which leads us to RSA encryption. The security protocol that secures Medium, and probably any other website you've been on, is known as RSA encryption. It relies on the fact that, with current computing resources, it would take a very, very long time to factor a 30+-digit number m that has only one factorization, namely p times q, where both p and q are large prime numbers. However, dividing m by p or q is computationally much easier, and since m divided by q returns p and vice versa, this provides a quick key-verification system.

A quantum algorithm called Shor's algorithm offers an exponential speedup in factoring numbers, which could one day break RSA encryption. But don't buy into the hype yet: as of this writing, the largest number factored by a quantum computer is 21 (into 3 and 7). The hardware has not yet been developed for quantum computers to factor 30-digit, or even 10-digit, numbers. And even if quantum computers do one day break RSA encryption, a new security protocol called BB84, which relies on quantum properties, is provably safe from quantum computers.

So will quantum computers ever completely replace the classical PC? Not in the foreseeable future.

Quantum computing, while developing very rapidly, is still in an infantile stage, with research conducted semi-competitively by large corporations like Google, Microsoft, and IBM. Much of the hardware needed to accelerate quantum computing is not yet available. There are several obstacles to a quantum future, a major one being correcting gate errors and maintaining the integrity of a qubit's state.

However, given the amount of innovation that has happened in the past few years, it seems inevitable during our lifetimes that quantum computing will make huge strides. In addition, complexity theory has shown that there are several cases where classical computers perform better than quantum computers. IBM quantum computer developers state that quantum computing will probably never completely eliminate classical computers. Instead, in the future we may see a hybrid chip that relies on quantum transistors for certain tasks and classical transistors for others, depending on which one is more appropriate.

Go here to read the rest:

Quantum Computing for Everyone - The Startup - Medium

UC Riverside to lead scalable quantum computing project using 3D printed ion traps – 3D Printing Industry

UC Riverside (UCR) is set to lead a project focused on enabling scalable quantum computing after winning a $3.75 million Multicampus-National Lab Collaborative Research and Training Award.

The collaborative effort will see contributions from UC Berkeley, UCLA and UC Santa Barbara, with UCR acting as project coordinator.

Scalable quantum computing

Quantum computing is currently in its infancy, but it is expected to stretch far beyond the capabilities of conventional computing in the coming years. Intensive tasks such as modeling complex processes, factoring large numbers, and designing new chemical compounds for medical use are what quantum computers are expected to excel at.

Quantum information is stored on quantum computers in the form of quantum bits, or qubits. Unlike conventional computing systems, whose bits exist in only one state at a time, a qubit can exist in two different states simultaneously. Current quantum computers are limited in their number of qubits, however, so for quantum computing to realize its true potential, new systems will have to be scalable to many more qubits.
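Concretely, an n-qubit state is described by 2^n amplitudes, which is why scaling up qubit counts matters so much. A toy state-vector sketch in pure Python (an illustration, not any system described in the article) that puts n qubits into an equal superposition and counts the basis states being tracked:

```python
def hadamard_all(n):
    """State vector after a Hadamard gate on each of n qubits:
    an equal superposition over all 2**n basis states."""
    amp = 2 ** (-n / 2)              # each amplitude is 1/sqrt(2**n)
    return [amp] * (2 ** n)

state = hadamard_all(10)
print(len(state))                    # 1024 amplitudes for just 10 qubits
# The measurement probabilities still sum to 1:
print(round(sum(a * a for a in state), 10))   # 1.0
```

Doubling the amplitude count with every added qubit is what makes classical simulation of large quantum systems intractable, and scalable hardware so valuable.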

"The goal of this collaborative project is to establish a novel platform for quantum computing that is truly scalable up to many qubits," said Boerge Hemmerling, an assistant professor of physics and astronomy at UC Riverside and the lead principal investigator of the three-year project. "Current quantum computing technology is far away from experimentally controlling the large number of qubits required for fault-tolerant computing. This stands in large contrast to what has been achieved in conventional computer chips in classical computing."

3D printed ion trap microstructures

The research team will use advanced 3D printing technology, available at Lawrence Livermore National Laboratory, to fabricate microstructure ion traps for the new quantum computers. Ions are used to store qubits and quantum information is transferred when these ions move in their traps. According to UCR, trapped ions have the best potential for realizing scalable quantum computing.

Alongside UCR, UC Berkeley will enable high-fidelity quantum gates with the ion traps. UCLA will integrate fiber optics with the ion traps, UC Santa Barbara will put the traps through trials in cryogenic environments and demonstrate shuttling of ion strings, and the Lawrence Berkeley National Laboratory will characterize and develop new materials. The project coordinator, UCR, will develop simplified cooling schemes and research the possibility of trapping electrons with the traps.

"We have a unique opportunity here to join various groups within the UC system and combine their expertise to make something bigger than a single group could achieve," Hemmerling stated. "We anticipate that the microstructure 3D printed ion traps will outperform ion traps that have been used to date in terms of the storage time of the ions and the ability to maintain and manipulate quantum information."

He adds, "Most importantly, our envisioned structures will be scalable, in that we plan to build arrays of interconnected traps, similar to the very successful conventional computer chip design. We hope to establish these novel 3D-printed traps as a standard laboratory tool for quantum computing, with major improvements over currently used technology."

Hemmerling's concluding remarks explain that many quantum computing approaches, while very promising, have fallen short of providing a scalable platform that is useful for processing complex tasks. If an applicable machine is to be built, new routes must be considered, starting with UCR's scalable computing project.

Early quantum technology work involving 3D printing has paved the way for UCR's future project. When cooled to near 0 K, the quantum characteristics of atomic particles start to become apparent. Just last year, additive manufacturing R&D company Added Scientific 3D printed the first vacuum chamber capable of trapping clouds of cold atoms. Elsewhere, two-photon AM system manufacturer Nanoscribe introduced a new machine, the Quantum X, with micro-optic capabilities. The company expects its system to be useful in advancing quantum technology to the industrial level.



Top AI Announcements Of The Week: TensorFlow Quantum And More – Analytics India Magazine

AI is one of the most happening domains in the world right now. It would take a lifetime to skim through all the machine learning research papers released to date. As AI keeps itself in the news through new releases of frameworks, regulations and breakthroughs, we can only hope to catch the best of the lot.

So, here we have compiled a list of the most exciting AI announcements from the past week:

Late last year, Google locked horns with IBM in their race for quantum supremacy. Though the news has centred on how good their quantum computers are, not much has been said about implementation. Now Google has brought two of its most powerful frameworks, TensorFlow and Cirq, together to release TensorFlow Quantum, an open-source library for the rapid prototyping of quantum ML models.

The Google AI team, which has joined hands with the University of Waterloo, X, and Volkswagen, announced the release of TensorFlow Quantum (TFQ).

TFQ is designed to provide developers with the tools necessary to help the quantum computing and machine learning research communities control and model quantum systems.

The team at Google has also released a TFQ white paper with a review of quantum applications, and each example can be run in-browser via Colab from the research repository.

A key feature of TensorFlow Quantum is the ability to simultaneously train and execute many quantum circuits. This is achieved by TensorFlow's ability to parallelise computation across a cluster of computers, and the ability to simulate relatively large quantum circuits on multi-core computers.

As the devastating news of COVID-19 keeps arriving at an alarming rate, AI researchers have given us something to smile about. DeepMind, one of the premier AI research labs in the world, announced last week that it is releasing structure predictions of several proteins that could aid ongoing research into COVID-19. It used the latest version of its AlphaFold system to find these structures. AlphaFold is one of the biggest innovations to have come from the labs of DeepMind, and after a couple of years, it is exhilarating to see it applied to something so critical.

As the pursuit of human-level intelligence in machines intensifies, language modeling will keep surfacing till the very end. One, human language is innately sophisticated; and two, training language models from scratch is exhausting.

The last couple of years have witnessed a flurry of mega releases from the likes of NVIDIA, Microsoft and especially Google. As BERT topped the charts through many of its variants, Google has now announced ELECTRA.

ELECTRA has the benefits of BERT but learns more efficiently. Google also claims that this novel pre-training method outperforms existing techniques given the same compute budget.

The gains are particularly strong for small models; for example, a model trained on one GPU for four days outperformed GPT (trained using 30x more compute) on the GLUE natural language understanding benchmark.

China has been the nation worst hit by COVID-19. However, two of the biggest AI breakthroughs have come from Chinese soil. Last month, Baidu announced how its toolkit brings down prediction time. Last week, another Chinese giant, Alibaba, announced that its new AI system detects the coronavirus from patients' CT scans with 96% accuracy. Alibaba's founder Jack Ma has fueled his team's vaccine development efforts with a $2.15M donation.

Facebook AI has released an in-house feature for converting a two-dimensional photo into a video byte that gives a more realistic view of the object in the picture. The system infers the 3D structure of any image, whether it is a new shot just taken on an Android or iOS device with a standard single camera, or a decades-old image recently uploaded to a phone or laptop.

The feature had previously been available only on high-end phones with a dual-lens portrait mode, but it will now be available on any mobile device, even one with a single, rear-facing camera. To bring this new visual format to more people, the researchers at Facebook used state-of-the-art ML techniques to produce 3D photos from virtually any standard 2D picture.

One significant implication of this feature can be an improved understanding of 3D scenes that can help robots navigate and interact with the physical world.

While the whole world focused on the race to quantum supremacy between Google and IBM, Honeywell has quietly been building what it claims is the most powerful quantum computer yet, and it plans to release it by the middle of 2020.

"Thanks to a breakthrough in technology, we're on track to release a quantum computer with a quantum volume of at least 64, twice that of the next alternative in the industry. There are a number of industries that will be profoundly impacted by the advancement and ultimate application of at-scale quantum computing," said Tony Uttley, President of Honeywell Quantum Solutions, in the official press release.

The outbreak of COVID-19 has created panic globally, and rightfully so. Many flagship conferences have either been cancelled or moved to a virtual environment.

Nvidia's flagship GPU Technology Conference (GTC), which was supposed to take place in San Francisco in the last week of March, was cancelled due to fears of the COVID-19 coronavirus.

Google Cloud has also cancelled its upcoming event, Google Cloud Next '20, which was slated to take place on April 6-8 at the Moscone Center in San Francisco. "Due to the growing concern around the coronavirus (COVID-19), and in alignment with the best practices laid out by the CDC, WHO and other relevant entities, Google Cloud has decided to reimagine Google Cloud Next '20," the company stated on its website.

ICLR 2020, one of the popular conferences for ML researchers, has also announced that it is cancelling its physical conference this year due to growing concerns about COVID-19 and shifting the event to a fully virtual conference.

ICLR authorities also issued a statement saying that all accepted papers at the virtual conference will be presented using a pre-recorded video.




Deltec Bank, Bahamas – Quantum Computing Will have Positive Impacts on Portfolio Optimization, Risk Analysis, Asset Pricing, and Trading Strategies -…

Quantum computing is expected to be the new technology, fully integrated with the financial sector within five to ten years. These machines, sometimes loosely called supercomputers, are capable of highly advanced processing, taking in massive amounts of data to solve a problem in a fraction of the time the best traditional computer on the market would need.

Traditional Computer vs. Quantum Computing

A typical computer today stores information in the form of bits, represented in the binary language (0s and 1s). In quantum computing, the bits are known as qubits. A qubit processes similar input, but rather than reducing the data to 0s and 1s, it can represent many states at once, so the potential gains in computational speed are almost immeasurable.

Quantum Computing in Banking

Let's examine personal encryption in banking, for example. Using a security format called RSA-2048, traditional computers would need about 10^34 steps to decrypt the security algorithm. Even with our best computers on the market, with a processor capable of performing a trillion calculations per second, those steps translate to roughly 317 trillion years to break the secure code. While it is possible, it is not practical enough for a cyber-criminal to make it worthwhile.

A quantum computer, on the other hand, would be able to resolve this problem in about 10^7 steps. With a basic quantum computer running at one million calculations per second, this translates to ten seconds to resolve the problem.
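A quick sanity check on the arithmetic, assuming the intended counts are 10^34 classical steps and 10^7 quantum steps (the superscripts were lost in formatting):

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600

classical_steps, classical_rate = 1e34, 1e12   # a trillion calculations per second
quantum_steps, quantum_rate = 1e7, 1e6         # a million calculations per second

print(classical_steps / classical_rate / SECONDS_PER_YEAR)  # ~3.2e14 years
print(quantum_steps / quantum_rate)                         # 10.0 seconds
```

The classical figure works out to a few hundred trillion years; the quantum figure is ten seconds, as the article states.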

While this example centered on breaking complex security, many other use cases can emerge from the use of quantum computing.

Trade Transaction Settlements

Researchers at Barclays have been working on a proof of concept for the transaction settlement process. As settlements can only be worked on a transaction-by-transaction basis, they easily queue up, only to be released in batches. When a processing window opens, as many trades as possible are settled.

Trades are complex by their very nature, and traders can end up tapping into funds before the transaction has cleared. Transactions will only settle if the funds are available or if a collateral credit facility has been arranged.

While you could probably handle a small number of trades in your head, you would need to rely on a computer after about 10-20 transactions. The same can be said of our current computational power: it is nearing the point where it will need more and more time to resolve hundreds of trades at a time.

With a seven-qubit system, quantum computing would be able to run a greater number of complex trades in the time it would take a traditional system to complete its trades. It would take the equivalent of about two hundred traditional computers to match the speed.

Simulating a Future Product Valuation

Researchers at JP Morgan have been working on a concept for simulating the future value of a financial product. The team is testing quantum computers on complex, intensive pricing calculations that normally take a traditional computer hours to complete. This is a problem because each year greater complexity is added via newer algorithms, to the point where practical calculation is becoming nearly impossible.

The research team discovered that using quantum computing resulted in a resolution to the problem in mere seconds.

Final Thoughts

Banks are running successful tests today with quantum computing to resolve extremely resource-intensive calculations for financial problem scenarios. From trading to fraud detection to AML, this is a technology not to be overlooked.

According to Deltec Bank, Bahamas: "Quantum computing will have positive impacts on portfolio optimization, risk analysis, asset pricing, and trading strategies, and that is just the tip of the iceberg of what this technology could provide."

Disclaimer: The author of this text, Robin Trehan, has an undergraduate degree in economics, a master's in international business and finance, and an MBA in electronic business. Trehan is Senior VP at Deltec International, http://www.deltecbank.com. The views, thoughts, and opinions expressed in this text are solely the views of the author, and do not necessarily reflect the views of Deltec International Group, its subsidiaries and/or employees.

About Deltec Bank

Headquartered in The Bahamas, Deltec is an independent financial services group that delivers bespoke solutions to meet clients' unique needs. The Deltec group of companies includes Deltec Bank & Trust Limited, Deltec Fund Services Limited, Deltec Investment Advisers Limited, Deltec Securities Ltd. and Long Cay Captive Management.




NIST Works on the Industries of the Future in Buildings from the Past – Nextgov

The president's budget request for fiscal 2021 proposed $738 million to fund the National Institute of Standards and Technology (NIST), a dramatic reduction from the more than $1 billion in enacted funds allocated for the agency this fiscal year.

The House Science, Space and Technology Committee's Research and Technology Subcommittee on Wednesday held a hearing to home in on NIST's reauthorization, but instead of focusing on relevant budget considerations, lawmakers had other plans.

"We're disappointed by the president's destructive budget request, which proposes over a 30% cut to NIST programs," Subcommittee Chairwoman Rep. Haley Stevens, D-Mich., said at the top of the hearing. "But today, I don't want to dwell on a proposal that we know Congress is going to reject ... today I would like this committee to focus on improving NIST and getting the agency the tools it needs to do better, to do its job."

Per Stevens' suggestion, Under Secretary of Commerce for Standards and Technology and NIST Director Walter Copan reflected on some of the agency's dire needs and offered updates and his views on a range of its ongoing programs and efforts.

NIST's Facilities Are in Bad Shape

President Trump's budget proposal for fiscal 2021 requests only $60 million in funds for facility construction, down from the $118 million enacted for fiscal 2020, and comes at a time when the agency's workspaces need upgrades.

"Indeed, the condition of NIST facilities is challenging," Copan explained. "Over 55% of NIST's facilities are considered in poor to critical condition per [Commerce Department] standards, and so it does provide some significant challenges for us."

Some of the agency's decades-old facilities and infrastructure are deteriorating, and Copan added that he'd recently heard NIST's deferred maintenance backlog has hit more than $775 million. If lawmakers or the public venture out to visit some of the agency's facilities, "you'll see the good, the bad, and the embarrassingly bad," he said. Those conditions are a testament to the resilience and commitment of NIST's people, who work in sometimes challenging, outdated environments, Copan said.

The director noted that some creative solutions have already been proposed to address the issue, including the development of a federal capital revolving fund. The agency is also looking creatively at combining maintenance with lease options for some of its facilities, in hopes that it can move more rapidly by having its officials cycle out of laboratories to launch rebuilding and renovation processes.

"It's one of my top priorities as the NIST director to have our NIST people work in 21st-century facilities that we can be proud of and that enable the important work of NIST for the nation," Copan said.

Advancing Efforts in Artificial Intelligence and Quantum Computing

The president's budget request placed a sharp focus on industries of the future, which will be powered by many emerging technologies, particularly quantum computing and AI.

During the hearing and in his written testimony, Copan highlighted some of NIST's work in both areas. The agency has helped shape an entire generation of quantum science, and a significant portion of quantum scientists from around the globe have trained at its facilities. Some of NIST's more recent quantum achievements include supporting the development of a quantum logic clock and helping steer advancements in quantum simulation. Following a recent mandate from the Trump administration, the agency is also in the midst of instituting the Quantum Economic Development Consortium, or QEDC, which aims to advance industry collaboration to expand the nation's leadership in quantum research and development.

"Looking forward, over the coming years NIST will focus a portion of its quantum research portfolio on the grand challenge of quantum networking," Copan's written testimony said. "Serving as the basis for secure and highly efficient quantum information transmission that links together multiple quantum devices and sensors, quantum networks will be a key element in the long-term evolution of quantum technologies."

Though there were cuts across many areas, the president's budget request also proposed a doubling of NIST's funding in artificial intelligence, and Copan said the technology is already broadly applied across all of the agency's laboratories to help improve productivity.

Going forward, and with increased funding, he laid out some of the agency's top priorities, noting that "there's much work to be done in developing tools to provide insights into artificial intelligence programs, and there is also important work to be done in standardization, so that the United States can lead the world in the application of [AI] in a trustworthy and ethical manner."

Standardization to Help the U.S. Lead in 5G

Rep. Frank Lucas, R-Okla., asked Copan to weigh in on the moves China is making across the fifth-generation wireless technology landscape, and the moves the U.S. needs to make to lead, not just compete, in that specific area.

"We have entered in the United States, as we know, a hyper-competitive environment, with China as a lead in activities related to standardization," Copan responded.

The director said officials see that, in some ways, the standardization process has been "weaponized," and that the free-market economy represented by the United States now needs to lead in more effective internal coordination and incentivize industry to participate in the standards process. Though U.S. officials have already seen those rules of fair play bent or indeed broken by other players, NIST and others need to help improve information sharing across American standards-focused stakeholders, which could, in turn, accelerate adoption of the emerging technology.

"We want the best technologies in the world to win, and we want the United States to continue to be the leader in not only delivering those technologies, but securing the intellectual properties behind them and translating those into market value," he said.


Archer Materials' patent application received by World Intellectual Property Organisation – Proactive Investors Australia

The company is developing materials in quantum computing, biotechnology, and lithium-ion batteries.

Archer Materials has confirmed that the patent application filed under the Patent Cooperation Treaty (PCT) to protect and commercialise its graphene biosensor technology intellectual property has been received by the World Intellectual Property Organisation (WIPO).

Acknowledgement of receipt by WIPO concludes the PCT application lodgement process and confirms the International Patent Application is formally compliant with the PCT prosecution procedure and has met the deadline to avoid abandonment of the application.

The company has continued to progress the development of its 12CQ technology and is on track, performing the quantum measurements required to build an operational room-temperature qubit processor (chip) prototype.

As part of this work, the company has joined the Sydney Knowledge Hub, a co-working space for research-based organisations that collaborate with the University of Sydney, to strategically engage with researchers in the Australian quantum computing economy.

A collaboration agreement with the University of NSW Sydney also now includes access to world-class infrastructure for quantum materials characterisation.


The view of quantum threats from the front lines – JAXenter

The future is here. Or just about. After a number of discoveries, researchers have proven that quantum computing is possible and on its way. The wider world did not pause long on this discovery: Goldman Sachs, Amazon, Google, and IBM have all announced their intentions to embark on their own quantum developments.

Now that it's within our reach, we have to start seriously considering what that means in the real world. Certainly, we all stand to gain from the massive benefits that quantum capabilities can bring, but so do cybercriminals.

Scalable quantum computing will defeat much of modern-day encryption, such as the RSA 2048-bit keys which secure computer networks everywhere. The U.S. National Institute of Standards and Technology says as much, projecting that quantum computers will be able to break the protocols on which the modern internet relies within this decade.

The security profession hasn't taken the news lying down, either. Preparations have begun in earnest. The DigiCert 2019 Post Quantum Cryptography (PQC) Survey aimed to examine exactly how companies are doing. Researchers surveyed 400 enterprises, each with 1,000 or more employees, across the US, Germany and Japan to get answers. They also conducted a focus group of nine IT managers to further probe those preparations.


An encouraging development is that 35 percent of respondents already have a PQC budget, and a further 56 percent are discussing one in their organisations. Yet many are still very early in the process of PQC planning. An IT manager at a manufacturing company said, "We have a budget for security overall. There's a segment allotted to this, but it's not at the level or expense that is appropriate and should be there yet."

The time to start preparing, including inquiring about your vendors' readiness for quantum computing threats, is now. One of the respondents, an IT security manager at a financial services company, told surveyors, "We're still in the early discussion phases because we're not the only ones who are affected. There are third-party partners and vendors that we're in early discussions with on how we can be proactive and beef up our security. And quantum cryptology is one of the topics that we are looking at."

Others expanded upon that, noting that their early preparations heavily involve discussing the matter with third parties and vendors. Another focus group member, an IT manager at an industrial construction company, told the group, "We have third-party security companies that are working with us to come up with solutions to be proactive. So obviously, knock on wood, nothing has happened yet. But we are definitely always proactive from a security standpoint and we're definitely trying to make sure that we're ready once a solution is available."

Talking to your vendors and third parties should be a key part of any organisation's planning process. To that end, organisations should be checking whether their partners will keep supporting and securing customers' operations into the age of quantum.

The data itself was still at the centre of respondents' minds when it came to protection from quantum threats; when asked what they were focusing on in their preparations, respondents said that above all they were monitoring their own data. One respondent told us, "The data is everything for anybody that's involved in protecting it. And so you just have to stay on top of it along with your vendors and continue to communicate."

One of the prime preparatory best practices that respondents called upon was monitoring. Knowing what kind of data flows within your environment, how it's used and how it's currently protected are all things an enterprise has to find out as it prepares.


To be sure, overhauling an enterprise's cryptographic infrastructure is no small feat, but respondents listed understanding their organisation's level of crypto agility as a priority. Quantum might be a few years off, but becoming crypto-agile may take just as long.

Organisations will have to plan for a system which can easily swap out, integrate and change cryptographic algorithms within an organisation. Moreover, it must be able to do so quickly, cheaply and without any significant changes to the broader system. Practically, this means installing automated platforms which follow your cryptographic deployments so that you can remediate, revoke, renew, reissue or otherwise control any and all of your certificates at scale.

Many organisations are still taking their first tentative steps, and others have yet to take any. Now is the time for organisations to assess their deployments of crypto and digital certificates so that they have proper crypto-agility and are ready to deploy quantum-resistant algorithms, rather than being caught lacking when the quantum era finally arrives.


Why Quantum Computing Gets Special Attention In The Trump Administration’s Budget Proposal – Texas Standard

The Trump administration's fiscal year 2021 budget proposal includes significant increases in funding for artificial intelligence and quantum computing, while cutting overall research and development spending. If Congress agrees to it, artificial intelligence, or AI, funding would nearly double, and quantum computing would receive a 50% boost over last year's budget, doubling in 2022 to $860 million. The administration says these two fields of research are important to U.S. national security, in part because China also invests heavily in them.

Quantum computing uses quantum mechanics to solve highly complex problems more quickly than standard, or classical, computers can. Though fully functional quantum computers don't yet exist, scientists at academic institutions, as well as at IBM, Google and other companies, are working to build such systems.

Scott Aaronson is a professor of computer science and the founding director of the Quantum Information Center at the University of Texas at Austin. He says applications for quantum computing include simulation of chemistry and physics problems. These simulations enable scientists to design new materials, drugs, superconductors and solar cells, among other applications.

Aaronson says the government's role is to support basic scientific research, the kind needed to build and perfect quantum computers.

"We do not yet know how to build a fully scalable quantum computer. The quantum version of the transistor, if you like, has not been invented yet," Aaronson says.

On the software front, researchers have not yet developed applications that take full advantage of quantum computings capabilities.

"That's often misrepresented in the popular press, where it's claimed that a quantum computer is just a black box that does everything," Aaronson says.

Competition between the U.S. and China in quantum computing revolves, in part, around the role such a system could play in breaking the encryption that makes things secure on the internet.

Truly useful quantum computing applications could be as much as a decade away, Aaronson says. Initially, these tools would be highly specialized.

"The way I put it is that we're now entering the very, very early, vacuum-tube era of quantum computers," he says.


Quantum internet: the next global network is already being laid – The Conversation UK

Google reported a remarkable breakthrough towards the end of 2019. The company claimed to have achieved something called quantum supremacy, using a new type of quantum computer to perform a benchmark test in 200 seconds. This was in stark contrast to the 10,000 years that would supposedly have been needed by a state-of-the-art conventional supercomputer to complete the same test.

Despite IBM's claim that its supercomputer, with a little optimisation, could solve the task in a matter of days, Google's announcement made it clear that we are entering a new era of incredible computational power.

Yet with much less fanfare, there has also been rapid progress in the development of quantum communication networks, and a master network to unite them all, called the quantum internet. Just as the internet as we know it followed the development of computers, we can expect the quantum computer to be accompanied by the safer, better-synchronised quantum internet.

Like quantum computing, quantum communication records information in what are known as qubits, similar to the way digital systems use bits and bytes. Whereas a bit can only take the value of zero or one, a qubit can also use the principles of quantum physics to take the value of zero and one at the same time. This is what allows quantum computers to perform certain computations very quickly. Instead of solving several variants of a problem one by one, the quantum computer can handle them all at the same time.

These qubits are central to the quantum internet because of a property called entanglement. If two entangled qubits are geographically separated (for instance, one qubit in Dublin and the other in New York), measurements of both would yield the same result. This would enable the ultimate in secret communications, a shared knowledge between two parties that cannot be discovered by a third. The resulting ability to code and decode messages would be one of the most powerful features of the quantum internet.

There will be no shortage of commercial applications for these advanced cryptographic mechanisms. The world of finance, in particular, looks set to benefit as the quantum internet will lead to enhanced privacy for online transactions and stronger proof of the funds used in the transaction.

Recently, at the CONNECT Centre in Trinity College Dublin, we successfully implemented an algorithm that could achieve this level of security. That this took place during a hackathon, a sort of competition for computer programmers, shows that even enthusiasts without detailed knowledge of quantum physics can create some of the building blocks that will be needed for the quantum internet. This technology won't be confined to specialist university departments, just as the original internet soon outgrew its origins as a way to connect academics around the world.

But how could this quantum internet be built anytime soon when we currently can only build very limited quantum computers? Well, the devices in the quantum internet don't have to be completely quantum in nature, and the network won't require massive quantum machines to handle the communication protocols.

One qubit here and there is all a quantum communication network needs to function. Instead of replacing the current infrastructure of optical fibres, data centres and base stations, the quantum internet will build on top of and make maximum use of the existing, classical internet.

With such rapid progress being made, quantum internet technology is set to shape the business plans of telecom companies in the near future. Financial institutions are already using quantum communication networks to make inter-bank transactions safer. And quantum communication satellites are up and running as the first step to extending these networks to a global scale.

The pipes of the quantum internet are effectively being laid as you read this. When a big quantum computer is finally built, it can be plugged into this network and accessed on the cloud, with all the privacy guarantees of quantum cryptography.

What will the ordinary user notice when the enhanced cryptography of the quantum internet becomes available? Very little, in all likelihood. Cryptography is like waste management: if everything works well, the customer doesn't even notice.

In the constant race of the codemakers and codebreakers, the quantum internet won't just prevent the codebreakers taking the lead. It will move the race track into another world altogether, with a significant head start for the codemakers. With data becoming the currency of our times, the quantum internet will provide stronger security for a new valuable commodity.

Read the original here:

Quantum internet: the next global network is already being laid - The Conversation UK