QC Ware Announces Q2B22 Tokyo To Be Held July 13-14 – HPCwire

PALO ALTO, Calif., June 28, 2022 – QC Ware, a leading quantum software and services company, today announced the inaugural Q2B22 Tokyo Practical Quantum Computing conference, to be held exclusively in person at The Westin Tokyo in Japan on July 13-14, 2022. Q2B is the world's largest gathering of the quantum computing community, focusing solely on quantum computing applications and driving the discourse on quantum advantage and commercialization. Registration and other information on Q2B22 Tokyo is available at http://q2b.jp.

Q2B22 Tokyo will feature top academics, industry end users, government representatives, and quantum computing vendors from around the world.

"Japan has led the way with ground-breaking research on quantum computing," said Matt Johnson, CEO of QC Ware. "In addition, the ecosystem includes some of Japan's largest enterprises, forward-thinking government organizations, and a thriving venture-backed startup community. I'm excited to be able to connect the Japanese and international quantum computing ecosystems at this unique event."

QC Ware has been operating in Japan since 2019 and recently opened up an office in Tokyo.

Q2B22 Tokyo will be co-hosted by QunaSys, a leading Japanese developer of innovative algorithms aimed at accelerating the applicability of quantum technology to chemistry, and sponsored by IBM Quantum.

"Japan's technology ecosystem is actively advancing quantum computing. QunaSys is a key player in boosting technology adoption, driving collaboration among business, government, and academia to enable the quantum chemistry ecosystem. We are pleased to work with QC Ware and co-host Q2B22 Tokyo, bringing Q2B to Japan," said Tennin Yan, CEO of QunaSys.

"IBM Quantum has strategically invested in Japan to accelerate an ecosystem of world-class academic, private sector and government partners, including installation of the IBM Quantum System One at the University of Tokyo and the co-development of the Quantum Innovation Initiative Consortium (QIIC)," said Aparna Prabhakar, Vice President, Partners and Alliances, IBM Quantum. "We are excited to work with QC Ware and QunaSys to bring experts from a wide variety of quantum computing fields to Q2B22 Tokyo."

Q2B22 Tokyo will feature keynotes from top academics such as:

Other keynotes include:

Japanese and international end users discussing active quantum initiatives, in areas such as:

Automotive:

Materials and Chemistry:

Finance and more:

In addition to IBM Quantum, Q2B22 Tokyo is sponsored by D-Wave Systems, Keysight Technologies, NVIDIA, Quantinuum Ltd., Quantum Machines, and Strangeworks, Inc. Other sponsors include:

Q2B has been run by QC Ware since 2017, with the annual flagship event held in Northern California's Silicon Valley. Q2B Silicon Valley is currently scheduled for December 6-8 at the Santa Clara Convention Center.

About QC Ware

QC Ware is a quantum software and services company focused on ensuring enterprises are prepared for the emerging quantum computing disruption. QC Ware specializes in the development of applications for near-term quantum computing hardware with a team composed of some of the industry's foremost experts in quantum computing. Its growing network of customers includes AFRL, Aisin Group, Airbus, BMW Group, Covestro, Equinor, Goldman Sachs, Itaú Unibanco, and Total. QC Ware Forge, the company's flagship quantum computing cloud service, is built for data scientists with no quantum computing background. It provides unique, performant, turnkey quantum computing algorithms. QC Ware is headquartered in Palo Alto, California, and supports its European customers through its subsidiary in Paris and its Asian customers from a Tokyo office. QC Ware also organizes Q2B, the largest annual gathering of the international quantum computing community.

Source: QC Ware


Quantum computing will revolutionize every large industry – CTech

Israeli venture group Team8 officially opened this year's Cyber Week with an event that took place in Tel Aviv on Sunday. The event, which included international guests and cybersecurity professionals, showcased the country and its industry as a Startup Nation powerhouse.

Opening remarks were made by Niv Sultan, star of Apple TV's "Tehran," who also moderated the event. She then welcomed Gili Drob-Heinstein, Executive Director at the Blavatnik Interdisciplinary Cyber Research Center (ICRC) at Tel Aviv University, and Nadav Zafrir, Co-founder of Team8 and Managing Partner of Team8 Platform, to the stage.

"I would like to thank the 100 CSOs who came to stay with us," Zafrir said on stage. Guests from around the world had flown into Israel and spent time connecting with one another ahead of the official start of Cyber Week on Monday. Team8 was also celebrating its 8th year as a VC, highlighting the work it has done in the cybersecurity arena.

The stage was then filled with Admiral Mike Rogers and Nir Minerbi, Co-founder and CEO of Classiq, who together discussed the quantum opportunity in computing. "Classical computers are great, but for some of the most complex challenges humanity is facing, they are not suitable," said Minerbi. "Quantum computing will revolutionize every large industry."

Classiq develops software for quantum algorithms. Founded in 2020, it has raised a total of $51 million and is funded by Team8, among other VC players in the space. Admiral Mike Rogers is the former director of the NSA and an Operating Partner at Team8.

"We are in a race," Rogers told the large crowd. "This is a technology believed to have advantages for our daily lives and national security. I told both presidents I worked under why they should invest billions into quantum," he said, citing the ability to look at multiple qubits simultaneously, thus speeding up the ability to process information. According to Rogers, governments have already publicly announced $29 billion of funding to help develop quantum computing.

Final remarks were made by Renee Wynn, former CIO at NASA, who discussed the potential of cyber in space. "Space may be the final frontier, and if we do not do anything else than what we are doing now, it will be chaos 100 miles above your head," she warned. On stage, she spoke to the audience about the threats in space and how satellites could be hijacked for nefarious reasons.

"Cybersecurity and satellites are so important," she concluded. "Let's bring the space teams together with the cybersecurity teams and help save lives."

After the remarks, the stage was transformed to host the evening's entertainment. Israeli-American puppet band Red Band performed a variety of songs and was then joined by Marina Maximilian, an Israeli singer-songwriter and actress, who shared the stage with the colorful puppets.

The event was sponsored by Meitar, Deloitte, LeumiTech, Valley, Palo Alto, FinSec Innovation Lab, and SentinelOne. It marked the beginning of Cyber Week, a three-day conference hosted by Tel Aviv University that will welcome a variety of cybersecurity professionals for workshops, networking opportunities, and panel discussions. It is understood that this year's event will have 9,000 attendees and 400 speakers, and will host people from 80 different countries.

[Photo: Red Band performing "Seven Nation Army". Credit: James Spiro]


Quantum Error Correction: Time to Make It Work – IEEE Spectrum

Dates chiseled into an ancient tombstone have more in common with the data in your phone or laptop than you may realize. They both involve conventional, classical information, carried by hardware that is relatively immune to errors. The situation inside a quantum computer is far different: The information itself has its own idiosyncratic properties, and compared with standard digital microelectronics, state-of-the-art quantum-computer hardware is more than a billion trillion times as likely to suffer a fault. This tremendous susceptibility to errors is the single biggest problem holding back quantum computing from realizing its great promise.

Fortunately, an approach known as quantum error correction (QEC) can remedy this problem, at least in principle. A mature body of theory built up over the past quarter century now provides a solid theoretical foundation, and experimentalists have demonstrated dozens of proof-of-principle examples of QEC. But these experiments still have not reached the level of quality and sophistication needed to reduce the overall error rate in a system.

The two of us, along with many other researchers involved in quantum computing, are trying to move definitively beyond these preliminary demos of QEC so that it can be employed to build useful, large-scale quantum computers. But before describing how we think such error correction can be made practical, we need to first review what makes a quantum computer tick.

"Information is physical." This was the mantra of the distinguished IBM researcher Rolf Landauer. Abstract though it may seem, information always involves a physical representation, and the physics matters.

Conventional digital information consists of bits, zeros and ones, which can be represented by classical states of matter, that is, states well described by classical physics. Quantum information, by contrast, involves qubits (quantum bits), whose properties follow the peculiar rules of quantum mechanics.

A classical bit has only two possible values: 0 or 1. A qubit, however, can occupy a superposition of these two information states, taking on characteristics of both. Polarized light provides intuitive examples of superpositions. You could use horizontally polarized light to represent 0 and vertically polarized light to represent 1, but light can also be polarized at an angle and then has both horizontal and vertical components at once. Indeed, one way to represent a qubit is by the polarization of a single photon of light.

These ideas generalize to groups of n bits or qubits: n bits can represent any one of 2^n possible values at any moment, while n qubits can include components corresponding to all 2^n classical states simultaneously in superposition. These superpositions provide a vast range of possible states for a quantum computer to work with, albeit with limitations on how they can be manipulated and accessed. Superposition of information is a central resource used in quantum processing and, along with other quantum rules, enables powerful new ways to compute.
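To make the exponential growth concrete, here is a minimal sketch (our addition, not from the article) of how an n-qubit state is represented on a classical machine: the amplitude vector doubles in length with every added qubit.

```python
import numpy as np

# A minimal illustration: an n-qubit pure state is a vector of 2**n complex
# amplitudes. The uniform superposition puts equal weight on all 2**n
# classical bit strings at once.
n = 3
amplitudes = np.full(2**n, 1 / np.sqrt(2**n), dtype=complex)

assert np.isclose(np.sum(np.abs(amplitudes) ** 2), 1.0)  # probabilities sum to 1
print(f"{n} qubits -> {len(amplitudes)} amplitudes")      # 3 qubits -> 8 amplitudes
```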

Researchers are experimenting with many different physical systems to hold and process quantum information, including light, trapped atoms and ions, and solid-state devices based on semiconductors or superconductors. For the purpose of realizing qubits, all these systems follow the same underlying mathematical rules of quantum physics, and all of them are highly sensitive to environmental fluctuations that introduce errors. By contrast, the transistors that handle classical information in modern digital electronics can reliably perform a billion operations per second for decades with a vanishingly small chance of a hardware fault.

Of particular concern is the fact that qubit states can roam over a continuous range of superpositions. Polarized light again provides a good analogy: The angle of linear polarization can take any value from 0 to 180 degrees.

Pictorially, a qubit's state can be thought of as an arrow pointing to a location on the surface of a sphere. Known as a Bloch sphere, its north and south poles represent the binary states 0 and 1, respectively, and all other locations on its surface represent possible quantum superpositions of those two states. Noise causes the Bloch arrow to drift around the sphere over time. A conventional computer represents 0 and 1 with physical quantities, such as capacitor voltages, that can be locked near the correct values to suppress this kind of continuous wandering and unwanted bit flips. There is no comparable way to lock the qubit's arrow to its correct location on the Bloch sphere.
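For readers who want the picture in formulas, this small sketch (ours, using the standard textbook parametrization) maps a qubit state cos(θ/2)|0⟩ + e^(iφ) sin(θ/2)|1⟩ to its arrow on the Bloch sphere:

```python
import math

# Standard parametrization: theta = 0 is the north pole (|0>), theta = pi the
# south pole (|1>); phi sweeps around the equator. Noise makes these angles
# drift continuously, which is exactly the wandering described above.
def bloch_vector(theta: float, phi: float) -> tuple[float, float, float]:
    return (math.sin(theta) * math.cos(phi),
            math.sin(theta) * math.sin(phi),
            math.cos(theta))

print(bloch_vector(0.0, 0.0))          # (0.0, 0.0, 1.0): the state |0>
print(bloch_vector(math.pi / 2, 0.0))  # ~(1, 0, 0): an equal superposition on the equator
```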

Early in the 1990s, Landauer and others argued that this difficulty presented a fundamental obstacle to building useful quantum computers. The issue is known as scalability: Although a simple quantum processor performing a few operations on a handful of qubits might be possible, could you scale up the technology to systems that could run lengthy computations on large arrays of qubits? A type of classical computation called analog computing also uses continuous quantities and is suitable for some tasks, but the problem of continuous errors prevents the complexity of such systems from being scaled up. Continuous errors with qubits seemed to doom quantum computers to the same fate.

We now know better. Theoreticians have successfully adapted the theory of error correction for classical digital data to quantum settings. QEC makes scalable quantum processing possible in a way that is impossible for analog computers. To get a sense of how it works, it's worthwhile to review how error correction is performed in classical settings.

Simple schemes can deal with errors in classical information. For instance, in the 19th century, ships routinely carried clocks for determining the ship's longitude during voyages. A good clock that could keep track of the time in Greenwich, in combination with the sun's position in the sky, provided the necessary data. A mistimed clock could lead to dangerous navigational errors, though, so ships often carried at least three of them. Two clocks reading different times could detect when one was at fault, but three were needed to identify which timepiece was faulty and correct it through a majority vote.

The use of multiple clocks is an example of a repetition code: Information is redundantly encoded in multiple physical devices such that a disturbance in one can be identified and corrected.
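A toy version of the ships' scheme in Python (our sketch; the clock readings are hypothetical) shows why three copies suffice to outvote one fault:

```python
from collections import Counter

# With three redundant readings, a single faulty clock is outvoted.
def majority(readings):
    value, count = Counter(readings).most_common(1)[0]
    if count < 2:
        raise ValueError("no majority: too many faults to correct")
    return value

print(majority(["10:04", "10:04", "10:07"]))  # -> "10:04"; the odd clock out is overruled
```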

As you might expect, quantum mechanics adds some major complications when dealing with errors. Two problems in particular might seem to dash any hopes of using a quantum repetition code. The first problem is that measurements fundamentally disturb quantum systems. So if you encoded information on three qubits, for instance, observing them directly to check for errors would ruin them. Like Schrödinger's cat when its box is opened, their quantum states would be irrevocably changed, spoiling the very quantum features your computer was intended to exploit.

The second issue is a fundamental result in quantum mechanics called the no-cloning theorem, which tells us it is impossible to make a perfect copy of an unknown quantum state. If you know the exact superposition state of your qubit, there is no problem producing any number of other qubits in the same state. But once a computation is running and you no longer know what state a qubit has evolved to, you cannot manufacture faithful copies of that qubit except by duplicating the entire process up to that point.

Fortunately, you can sidestep both of these obstacles. We'll first describe how to evade the measurement problem using the example of a classical three-bit repetition code. You don't actually need to know the state of every individual code bit to identify which one, if any, has flipped. Instead, you ask two questions: "Are bits 1 and 2 the same?" and "Are bits 2 and 3 the same?" These are called parity-check questions because two identical bits are said to have even parity, and two unequal bits have odd parity.

The two answers to those questions identify which single bit has flipped, and you can then counterflip that bit to correct the error. You can do all this without ever determining what value each code bit holds. A similar strategy works to correct errors in a quantum system.
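The two parity checks can be written out directly. This sketch (ours) decodes the syndrome for the classical three-bit repetition code exactly as described: it locates the flipped bit from the two answers alone, without ever reading out the individual bit values as data:

```python
# Syndrome decoding for the 3-bit repetition code.
def correct(bits):
    b = list(bits)
    p12 = b[0] ^ b[1]   # "Are bits 1 and 2 the same?" (0 = even parity, 1 = odd)
    p23 = b[1] ^ b[2]   # "Are bits 2 and 3 the same?"
    # The pair of answers pinpoints the single flipped bit, if any.
    flipped = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get((p12, p23))
    if flipped is not None:
        b[flipped] ^= 1  # counterflip to correct the error
    return b

print(correct([0, 1, 0]))  # middle bit flipped -> restored to [0, 0, 0]
print(correct([1, 1, 1]))  # no error detected -> unchanged
```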

Learning the values of the parity checks still requires quantum measurement, but importantly, it does not reveal the underlying quantum information. Additional qubits can be used as disposable resources to obtain the parity values without revealing (and thus without disturbing) the encoded information itself.

What about no-cloning? It turns out it is possible to take a qubit whose state is unknown and encode that hidden state in a superposition across multiple qubits in a way that does not clone the original information. This process allows you to record what amounts to a single logical qubit of information across three physical qubits, and you can perform parity checks and corrective steps to protect the logical qubit against noise.

Quantum errors consist of more than just bit-flip errors, though, making this simple three-qubit repetition code unsuitable for protecting against all possible quantum errors. True QEC requires something more. That came in the mid-1990s when Peter Shor (then at AT&T Bell Laboratories, in Murray Hill, N.J.) described an elegant scheme to encode one logical qubit into nine physical qubits by embedding a repetition code inside another code. Shors scheme protects against an arbitrary quantum error on any one of the physical qubits.

Since then, the QEC community has developed many improved encoding schemes, which use fewer physical qubits per logical qubit (the most compact use five) or enjoy other performance enhancements. Today, the workhorse of large-scale proposals for error correction in quantum computers is called the surface code, developed in the late 1990s by borrowing exotic mathematics from topology and high-energy physics.

It is convenient to think of a quantum computer as being made up of logical qubits and logical gates that sit atop an underlying foundation of physical devices. These physical devices are subject to noise, which creates physical errors that accumulate over time. Periodically, generalized parity measurements (called syndrome measurements) identify the physical errors, and corrections remove them before they cause damage at the logical level.

A quantum computation with QEC then consists of cycles of gates acting on qubits, syndrome measurements, error inference, and corrections. In terms more familiar to engineers, QEC is a form of feedback stabilization that uses indirect measurements to gain just the information needed to correct errors.

QEC is not foolproof, of course. The three-bit repetition code, for example, fails if more than one bit has been flipped. What's more, the resources and mechanisms that create the encoded quantum states and perform the syndrome measurements are themselves prone to errors. How, then, can a quantum computer perform QEC when all these processes are themselves faulty?

Remarkably, the error-correction cycle can be designed to tolerate errors and faults that occur at every stage, whether in the physical qubits, the physical gates, or even in the very measurements used to infer the existence of errors! Called a fault-tolerant architecture, such a design permits, in principle, error-robust quantum processing even when all the component parts are unreliable.

A long quantum computation will require many cycles of quantum error correction (QEC). Each cycle would consist of gates acting on encoded qubits (performing the computation), followed by syndrome measurements from which errors can be inferred, and corrections. The effectiveness of this QEC feedback loop can be greatly enhanced by including quantum-control techniques (represented by the thick blue outline) to stabilize and optimize each of these processes.

Even in a fault-tolerant architecture, the additional complexity introduces new avenues for failure. The effect of errors is therefore reduced at the logical level only if the underlying physical error rate is not too high. The maximum physical error rate that a specific fault-tolerant architecture can reliably handle is known as its break-even error threshold. If error rates are lower than this threshold, the QEC process tends to suppress errors over the entire cycle. But if error rates exceed the threshold, the added machinery just makes things worse overall.
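A rough numerical illustration (our addition, using a commonly quoted heuristic scaling form for surface-code-like schemes, with illustrative constants rather than measured values) shows why operating well below threshold matters so much:

```python
# Common heuristic: for a code of distance d, the logical error rate per
# cycle scales roughly as p_L ~ A * (p / p_th) ** ((d + 1) // 2).
# A, p_th, and d are illustrative values, not measured numbers.
def logical_error_rate(p, p_th=1e-2, d=9, A=0.1):
    return A * (p / p_th) ** ((d + 1) // 2)

for p in (5e-3, 1e-3):  # just below threshold vs. well below it
    print(f"physical {p:.0e} -> logical ~{logical_error_rate(p):.1e}")
# physical 5e-03 -> logical ~3.1e-03
# physical 1e-03 -> logical ~1.0e-06  (a 5x better qubit buys ~3,000x logically)
```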

The theory of fault-tolerant QEC is foundational to every effort to build useful quantum computers because it paves the way to building systems of any size. If QEC is implemented effectively on hardware exceeding certain performance requirements, the effect of errors can be reduced to arbitrarily low levels, enabling the execution of arbitrarily long computations.

At this point, you may be wondering how QEC has evaded the problem of continuous errors, which is fatal for scaling up analog computers. The answer lies in the nature of quantum measurements.

In a typical quantum measurement of a superposition, only a few discrete outcomes are possible, and the physical state changes to match the result that the measurement finds. With the parity-check measurements, this change helps.

Imagine you have a code block of three physical qubits, and one of these qubit states has wandered a little from its ideal state. If you perform a parity measurement, just two results are possible: Most often, the measurement will report the parity state that corresponds to no error, and after the measurement, all three qubits will be in the correct state, whatever it is. Occasionally the measurement will instead indicate the odd parity state, which means an errant qubit is now fully flipped. If so, you can flip that qubit back to restore the desired encoded logical state.

In other words, performing QEC transforms small, continuous errors into infrequent but discrete errors, similar to the errors that arise in digital computers.
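The digitization effect is easy to simulate. In this toy model (ours, not the authors' code), a qubit meant to be |0⟩ over-rotates by a small angle ε; a projective parity-style check then either snaps it back to the ideal state or produces a full, correctable flip with probability sin²(ε/2):

```python
import numpy as np

rng = np.random.default_rng(seed=1)

eps = 0.1                      # small continuous over-rotation, in radians
p_flip = np.sin(eps / 2) ** 2  # chance the check projects onto a full flip
outcomes = rng.random(100_000) < p_flip
print(f"flip probability {p_flip:.5f}, observed rate {outcomes.mean():.5f}")
# Each check yields either "no error" (state restored exactly) or a discrete
# flip the code can counterflip -- the drift itself never accumulates.
```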

Researchers have now demonstrated many of the principles of QEC in the laboratory, from the basics of the repetition code through to complex encodings, logical operations on code words, and repeated cycles of measurement and correction. Current estimates of the break-even threshold for quantum hardware place it at about 1 error in 1,000 operations. This level of performance hasn't yet been achieved across all the constituent parts of a QEC scheme, but researchers are getting ever closer, achieving multiqubit logic with rates of fewer than about 5 errors per 1,000 operations. Even so, passing that critical milestone will be the beginning of the story, not the end.

On a system with a physical error rate just below the threshold, QEC would require enormous redundancy to push the logical rate down very far. It becomes much less challenging with a physical rate further below the threshold. So just crossing the error threshold is not sufficient; we need to beat it by a wide margin. How can that be done?

If we take a step back, we can see that the challenge of dealing with errors in quantum computers is one of stabilizing a dynamic system against external disturbances. Although the mathematical rules differ for the quantum system, this is a familiar problem in the discipline of control engineering. And just as control theory can help engineers build robots capable of righting themselves when they stumble, quantum-control engineering can suggest the best ways to implement abstract QEC codes on real physical hardware. Quantum control can minimize the effects of noise and make QEC practical.

In essence, quantum control involves optimizing how you implement all the physical processes used in QEC, from individual logic operations to the way measurements are performed. For example, in a system based on superconducting qubits, a qubit is flipped by irradiating it with a microwave pulse. One approach uses a simple type of pulse to move the qubit's state from one pole of the Bloch sphere, along the Greenwich meridian, to precisely the other pole. Errors arise if the pulse is distorted by noise. It turns out that a more complicated pulse, one that takes the qubit on a well-chosen meandering route from pole to pole, can result in less error in the qubit's final state under the same noise conditions, even when the new pulse is imperfectly implemented.

One facet of quantum-control engineering involves careful analysis and design of the best pulses for such tasks in a particular imperfect instance of a given system. It is a form of open-loop (measurement-free) control, which complements the closed-loop feedback control used in QEC.

This kind of open-loop control can also change the statistics of the physical-layer errors to better comport with the assumptions of QEC. For example, QEC performance is limited by the worst-case error within a logical block, and individual devices can vary a lot. Reducing that variability is very beneficial. In an experiment our team performed using IBM's publicly accessible machines, we showed that careful pulse optimization reduced the difference between the best-case and worst-case error in a small group of qubits by more than a factor of 10.

Some error processes arise only while carrying out complex algorithms. For instance, crosstalk errors occur on qubits only when their neighbors are being manipulated. Our team has shown that embedding quantum-control techniques into an algorithm can improve its overall success by orders of magnitude. This technique makes QEC protocols much more likely to correctly identify an error in a physical qubit.

For 25 years, QEC researchers have largely focused on mathematical strategies for encoding qubits and efficiently detecting errors in the encoded sets. Only recently have investigators begun to address the thorny question of how best to implement the full QEC feedback loop in real hardware. And while many areas of QEC technology are ripe for improvement, there is also growing awareness in the community that radical new approaches might be possible by marrying QEC and control theory. One way or another, this approach will turn quantum computing into a reality, and you can carve that in stone.

This article appears in the July 2022 print issue as "Quantum Error Correction at the Threshold."


The Spooky Quantum Phenomenon You’ve Never Heard Of – Quanta Magazine

Perhaps the most famously weird feature of quantum mechanics is nonlocality: Measure one particle in an entangled pair whose partner is miles away, and the measurement seems to rip through the intervening space to instantaneously affect its partner. This "spooky action at a distance" (as Albert Einstein called it) has been the main focus of tests of quantum theory.

"Nonlocality is spectacular. I mean, it's like magic," said Adán Cabello, a physicist at the University of Seville in Spain.

But Cabello and others are interested in investigating a lesser-known but equally magical aspect of quantum mechanics: contextuality. Contextuality says that properties of particles, such as their position or polarization, exist only within the context of a measurement. Instead of thinking of particles' properties as having fixed values, consider them more like words in language, whose meanings can change depending on the context: "Time flies like an arrow. Fruit flies like bananas."

Although contextuality has lived in nonlocality's shadow for over 50 years, quantum physicists now consider it more of a hallmark feature of quantum systems than nonlocality is. A single particle, for instance, is a quantum system "in which you cannot even think about nonlocality," since the particle is only in one location, said Bárbara Amaral, a physicist at the University of São Paulo in Brazil. "So [contextuality] is more general in some sense, and I think this is important to really understand the power of quantum systems and to go deeper into why quantum theory is the way it is."

Researchers have also found tantalizing links between contextuality and problems that quantum computers can efficiently solve that ordinary computers cannot; investigating these links could help guide researchers in developing new quantum computing approaches and algorithms.

And with renewed theoretical interest comes a renewed experimental effort to prove that our world is indeed contextual. In February, Cabello, in collaboration with Kihwan Kim at Tsinghua University in Beijing, China, published a paper in which they claimed to have performed the first loophole-free experimental test of contextuality.

The Northern Irish physicist John Stewart Bell is widely credited with showing that quantum systems can be nonlocal. By comparing the outcomes of measurements of two entangled particles, he showed with his eponymous theorem of 1964 that the high degree of correlations between the particles can't possibly be explained in terms of local hidden variables defining each one's separate properties. The information contained in the entangled pair must be shared nonlocally between the particles.

Bell also proved a similar theorem about contextuality. He and, separately, Simon Kochen and Ernst Specker showed that it is impossible for a quantum system to have hidden variables that define the values of all their properties in all possible contexts.

In Kochen and Specker's version of the proof, they considered a single particle with a quantum property called spin, which has both a magnitude and a direction. Measuring the spin's magnitude along any direction always results in one of two outcomes: 1 or 0. The researchers then asked: Is it possible that the particle secretly "knows" what the result of every possible measurement will be before it is measured? In other words, could they assign a fixed value (a hidden variable) to all outcomes of all possible measurements at once?

Quantum theory says that the magnitudes of the spins along three perpendicular directions must obey the "101 rule": the outcomes of two of the measurements must be 1 and the other must be 0. Kochen and Specker used this rule to arrive at a contradiction. First, they assumed that each particle had a fixed, intrinsic value for each direction of spin. They then conducted a hypothetical spin measurement along some unique direction, assigning either 0 or 1 to the outcome. They then repeatedly rotated the direction of their hypothetical measurement and measured again, each time either freely assigning a value to the outcome or deducing what the value must be in order to satisfy the 101 rule together with directions they had previously considered.

They continued until, at the 117th direction, the contradiction cropped up. While they had previously assigned a value of 0 to the spin along this direction, the 101 rule now dictated that the spin must be 1. The outcome of a measurement could not possibly return both 0 and 1. So the physicists concluded that there is no way a particle can have fixed hidden variables that remain the same regardless of context.
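The bookkeeping in the argument can be phrased as a search problem. This sketch (ours, with a tiny hypothetical set of directions, not the 117 used in the actual proof) shows how one checks whether a fixed, context-independent 0/1 assignment satisfying the 101 rule exists:

```python
from itertools import product

# Directions are grouped into orthogonal triads; the 101 rule demands each
# triad receive exactly two 1s and one 0. A shared direction must keep the
# same value in every triad it appears in -- that is noncontextuality.
directions = ["x", "y", "z", "u", "v"]
triads = [("x", "y", "z"), ("x", "u", "v")]  # the triads share direction "x"

def noncontextual_assignment_exists(directions, triads):
    for values in product((0, 1), repeat=len(directions)):
        assignment = dict(zip(directions, values))
        if all(sum(assignment[d] for d in t) == 2 for t in triads):
            return True   # a fixed assignment works for this small set
    return False          # every assignment violates the 101 rule somewhere

# Kochen and Specker found 117 directions for which this search must fail.
print(noncontextual_assignment_exists(directions, triads))  # True for this tiny set
```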

While the proof indicated that quantum theory demands contextuality, there was no way to actually demonstrate this through 117 simultaneous measurements of a single particle. Physicists have since devised more practical, experimentally implementable versions of the original Bell-Kochen-Specker theorem involving multiple entangled particles, where a particular measurement on one particle defines a context for the others.

In 2009, contextuality, a seemingly esoteric aspect of the underlying fabric of reality, got a direct application: One of the simplified versions of the original Bell-Kochen-Specker theorem was shown to be equivalent to a basic quantum computation.

The proof, named Mermin's star after its originator, David Mermin, considered various combinations of contextual measurements that could be made on three entangled quantum bits, or qubits. The logic of how earlier measurements shape the outcomes of later measurements has become the basis for an approach called measurement-based quantum computing. The discovery suggested that contextuality might be key to why quantum computers can solve certain problems faster than classical computers, an advantage that researchers have struggled mightily to understand.

Robert Raussendorf, a physicist at the University of British Columbia and a pioneer of measurement-based quantum computing, showed that contextuality is necessary for a quantum computer to beat a classical computer at some tasks, but he doesn't think it's the whole story. "Whether contextuality powers quantum computers is probably not exactly the right question to ask," he said. "But we need to get there question by question. So we ask a question that we understand how to ask; we get an answer. We ask the next question."

Some researchers have suggested loopholes around Bell, Kochen and Specker's conclusion that the world is contextual. They argue that context-independent hidden variables haven't been conclusively ruled out.

In February, Cabello and Kim announced that they had closed every plausible loophole by performing a loophole-free Bell-Kochen-Specker experiment.

The experiment entailed measuring the spins of two entangled trapped ions in various directions, where the choice of measurement on one ion defined the context for the other ion. The physicists showed that, although making a measurement on one ion does not physically affect the other, it changes the context and hence the outcome of the second ion's measurement.

Skeptics would ask: How can you be certain that the context created by the first measurement is what changed the second measurement outcome, rather than other conditions that might vary from experiment to experiment? Cabello and Kim closed this "sharpness" loophole by performing thousands of sets of measurements and showing that the outcomes don't change if the context doesn't. After ruling out this and other loopholes, they concluded that the only reasonable explanation for their results is contextuality.

Cabello and others think that these experiments could be used in the future to test the level of contextuality, and hence the power, of quantum computing devices.

"If you want to really understand how the world is working," said Cabello, "you really need to go into the detail of quantum contextuality."


IonQ and GE Research Demonstrate High Potential of Quantum Computing for Risk Aggregation – Business Wire

COLLEGE PARK, Md.--(BUSINESS WIRE)--IonQ (NYSE: IONQ), an industry leader in quantum computing, today announced promising early results with its partner, GE Research, to explore the benefits of quantum computing for modeling multi-variable distributions in risk management.

Leveraging a Quantum Circuit Born Machine-based framework on standardized, historical indexes, IonQ and GE Research, the central innovation hub for the General Electric Company (NYSE: GE), were able to effectively train quantum circuits to learn correlations among three and four indexes. The predictions derived from the quantum framework outperformed those of classical modeling approaches in some cases, confirming that quantum copulas can potentially lead to smarter data-driven analysis and decision-making across commercial applications. A blog post further explaining the research methodology and results is available here.

"Together with GE Research, IonQ is pushing the boundaries of what is currently possible to achieve with quantum computing," said Peter Chapman, CEO and President, IonQ. "While classical techniques face inefficiencies when multiple variables have to be modeled together with high precision, our joint effort has identified a new training strategy that may optimize quantum computing results even as systems scale. Tested on our industry-leading IonQ Aria system, we're excited to apply these new methodologies when tackling real world scenarios that were once deemed too complex to solve."

While classical techniques to form copulas using mathematical approximations are a great way to build multi-variate risk models, they face limitations when scaling. IonQ and GE Research successfully trained quantum copula models with up to four variables on IonQ's trapped-ion systems, using data from four representative stock indexes with easily accessible and varied market environments.

By studying the historical dependence structure among the returns of the four indexes during this timeframe, the research group trained its model to understand the underlying dynamics. Additionally, the newly presented methodology includes optimization techniques that potentially allow models to scale by mitigating the local minima and vanishing gradient problems common in quantum machine learning practice. Such improvements demonstrate a promising way to perform multi-variable analysis faster and more accurately, which GE researchers hope will lead to new and better ways to assess risk in major manufacturing processes such as product design, factory operations, and supply chain management.

"As we have seen from recent global supply chain volatility, the world needs more effective methods and tools to manage risks where conditions can be so highly variable and interconnected to one another," said David Vernooy, a Senior Executive and Digital Technologies Leader at GE Research. "The early results we achieved in the financial use case with IonQ show the high potential of quantum computing to better understand and reduce the risks associated with these types of highly variable scenarios."

Today's results follow IonQ's recent announcement of the company's new IonQ Forte quantum computing system. The system features novel, cutting-edge optics technology that enables increased accuracy and further enhances IonQ's industry-leading system performance. Partnerships with the likes of GE Research and Hyundai Motors illustrate the growing interest in our industry-leading systems and feed into the continued success seen in Q1 2022.

About IonQ

IonQ, Inc. is a leader in quantum computing, with a proven track record of innovation and deployment. IonQ's current-generation quantum computer, IonQ Forte, is the latest in a line of cutting-edge systems, including IonQ Aria, a system that boasts an industry-leading 20 algorithmic qubits. Along with record performance, IonQ has defined what it believes is the best path forward to scale. IonQ is the only company with its quantum systems available through the cloud on Amazon Braket, Microsoft Azure, and Google Cloud, as well as through direct API access. IonQ was founded in 2015 by Christopher Monroe and Jungsang Kim based on 25 years of pioneering research. To learn more, visit http://www.ionq.com.

IonQ Forward-Looking Statements

This press release contains certain forward-looking statements within the meaning of Section 27A of the Securities Act of 1933, as amended, and Section 21E of the Securities Exchange Act of 1934, as amended. Some of the forward-looking statements can be identified by the use of forward-looking words. Statements that are not historical in nature, including the words "anticipate," "expect," "suggests," "plan," "believe," "intend," "estimates," "targets," "projects," "should," "could," "would," "may," "will," "forecast" and other similar expressions are intended to identify forward-looking statements. These statements include those related to IonQ's ability to further develop and advance its quantum computers and achieve scale; IonQ's ability to optimize quantum computing results even as systems scale; the expected launch of IonQ Forte for access by select developers, partners, and researchers in 2022 with broader customer access expected in 2023; IonQ's market opportunity and anticipated growth; and the commercial benefits to customers of using quantum computing solutions. Forward-looking statements are predictions, projections and other statements about future events that are based on current expectations and assumptions and, as a result, are subject to risks and uncertainties. Many factors could cause actual future events to differ materially from the forward-looking statements in this press release, including but not limited to: market adoption of quantum computing solutions and IonQ's products, services and solutions; the ability of IonQ to protect its intellectual property; changes in the competitive industries in which IonQ operates; changes in laws and regulations affecting IonQ's business; IonQ's ability to implement its business plans, forecasts and other expectations, and identify and realize additional partnerships and opportunities; and the risk of downturns in the market and the technology industry including, but not limited to, as a result of the COVID-19 pandemic. The foregoing list of factors is not exhaustive. You should carefully consider the foregoing factors and the other risks and uncertainties described in the "Risk Factors" section of IonQ's Quarterly Report on Form 10-Q for the quarter ended March 31, 2022 and other documents filed by IonQ from time to time with the Securities and Exchange Commission. These filings identify and address other important risks and uncertainties that could cause actual events and results to differ materially from those contained in the forward-looking statements. Forward-looking statements speak only as of the date they are made. Readers are cautioned not to put undue reliance on forward-looking statements, and IonQ assumes no obligation and does not intend to update or revise these forward-looking statements, whether as a result of new information, future events, or otherwise. IonQ does not give any assurance that it will achieve its expectations.


US Pursues Next-gen Exascale Systems with 5-10x the Performance of Frontier – HPCwire

With the Linpack exaflops milestone achieved by the Frontier supercomputer at Oak Ridge National Laboratory, the United States is turning its attention to the next crop of exascale machines, some 5-10x more performant than Frontier. At least one such system is being planned for the 2025-2030 timeline, and the DOE is soliciting input from the vendor community to inform the design and procurement process.

A request for information (RFI) was issued today by the Department of Energy, seeking feedback from computing hardware and software vendors, system integrators, and other entities to assist the DOE National Laboratories in planning for next-gen exascale systems. The RFI says responses will inform one or more DOE system acquisition RFPs, which will describe requirements for system deliveries in the 2025-2030 timeframe. This could include the successor to Frontier (aka OLCF-6), the successor to Aurora (aka ALCF-5), the successor to Crossroads (aka ATS-5), the successor to El Capitan (aka ATS-6), as well as a next-generation NERSC system (possibly NERSC-11). Note that of the predecessor systems, only Frontier has been installed so far.

Here's an excerpt from the RFI:

DOE is interested in the deployment of one or more supercomputers that can solve scientific problems 5 to 10 times faster or solve more complex problems, such as those with more physics or requirements for higher fidelity than the current state-of-the-art systems. These future systems will include associated networks and data hierarchies. A capable software stack will meet the requirements of a broad spectrum of applications and workloads, including large-scale computational science campaigns in modeling and simulation, machine intelligence, and integrated data analysis. We expect these systems to operate within a power envelope of 20-60 MW. These systems must be sufficiently resilient to hardware and software failures, in order to minimize requirements for user intervention. As the technologies evolve, we anticipate increased attention to resilience in other supercomputing system developments.

While the RFI states a desired overall performance increase of 5-10x, the notice sharpens the estimate to 10-20+ FP64 exaflops systems in the 2025+ timeframe and 100+ FP64 exaflops in the 2030+ timeframe, achieved through hardware and software acceleration mechanisms.

"This is roughly 8 times more than 2022 systems in 2026 and 64 times more in 2030," the RFI states. For lower-precision AI, there is an expected multiple of at least 8 to 16 times the FP64 rates.
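As a quick sanity check on those multipliers (our arithmetic, not the RFI's), both targets imply the same steady rate of improvement:

```python
# 8x by 2026 and 64x by 2030, both relative to 2022 systems, each work out
# to roughly 1.68x per year of sustained performance growth.
for factor, years in ((8, 2026 - 2022), (64, 2030 - 2022)):
    print(f"{factor}x over {years} years -> {factor ** (1 / years):.2f}x per year")
# 8x over 4 years -> 1.68x per year
# 64x over 8 years -> 1.68x per year
```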

A section on mission need stresses the importance of data-driven modeling and simulation to the nation's science, energy and security priorities. "[T]he United States must continue to push strategic advancements in HPC bringing about a grand convergence of modeling and simulation, data analytics, deep learning, artificial intelligence (AI), quantum computing, and other emerging capabilities across integrated infrastructures in computational ecosystems," the RFI states.

As such, these systems are expected to solve emerging data science, artificial intelligence, edge deployments at facilities, and science ecosystem problems, in addition to the traditional modeling and simulation applications.

The ideal future system will also be more agile, modular and extensible.

"We also wish to explore the development of an approach that moves away from monolithic acquisitions toward a model for enabling more rapid upgrade cycles of deployed systems, to enable faster innovation on hardware and software. One possible strategy would include increased reuse of existing infrastructure so that the upgrades are modular. A goal would be to reimagine systems architecture and an efficient acquisition process that allows continuous injection of technological advances to a facility (e.g., every 12-24 months rather than every 4-5 years)," asserts the RFI.

A key thrust of the DOE supercomputing strategy is the creation of an Advanced Computing Ecosystem (ACE) that enables integration with other DOE facilities, including light source, data, materials science, and advanced manufacturing.

"The next generation of supercomputers will need to be capable of being integrated into an ACE environment that supports automated workflows, combining one or more of these facilities to reduce the time from experiment and observation to scientific insight," the document states.

The information collected in response to the RFI will support next-generation system planning and decision-making at Oak Ridge National Laboratory, Lawrence Berkeley National Laboratory, Lawrence Livermore National Laboratory, Los Alamos National Laboratory, Sandia National Laboratories, and Argonne National Laboratory. The labs will use the information to update their advanced system roadmaps and to draft future RFPs (requests for proposals) for those systems.

The RFI published today has some similar hallmarks to the one issued by the Collaboration for Oak Ridge, Argonne and Livermore, aka CORAL, in 2012. However, a couple of people I spoke with at Oak Ridge National Laboratory last week said they don't expect there to be another CORAL program, partly because the cadence is off on account of the rewritten Aurora contract. Delays and reconceptualizations moved that system from the CORAL-1 to the CORAL-2 timeline.

The original CORAL contract called for three pre-exascale systems (~150-200 petaflops each) with at least two different architectures to manage risk. Only two systems, Summit at Oak Ridge and Sierra at Livermore, were completed in the intended timeframe, using nearly the same heterogeneous IBM-Nvidia architecture. CORAL-2 took a similar tack, calling for three exascale-class systems with at least two distinct architectures. Two of these systems, Frontier and El Capitan, are based on a very similar heterogeneous HPE AMD+AMD architecture. The redefined Aurora, which is based on the heterogeneous HPE Intel+Intel architecture, takes the place of the third system.


Alan Turing’s Everlasting Contributions to Computing, AI and Cryptography – NIST

An Enigma machine on display outside the Alan Turing Institute entrance inside the British Library, London.

Credit: Shutterstock/William Barton

Suppose someone asked you to devise the most powerful computer possible. Alan Turing, whose reputation as a central figure in computer science and artificial intelligence has only grown since his untimely death in 1954, applied his genius to problems such as this one in an age before computers as we know them existed. His theoretical work on this problem and others remains a foundation of computing, AI and modern cryptographic standards, including those NIST recommends.

The road from devising the most powerful computer possible to cryptographic standards has a few twists and turns, as does Turing's brief life.

Alan Turing

Credit: National Portrait Gallery, London

In Turing's time, mathematicians debated whether it was possible to build a single, all-purpose machine that could solve all problems that are computable. For example, we can compute a car's most energy-efficient route to a destination, and (in principle) the most likely way in which a string of amino acids will fold into a three-dimensional protein. Another example of a computable problem, important to modern encryption, is whether or not bigger numbers can be expressed as the product of two smaller numbers. For example, 6 can be expressed as the product of 2 and 3, but 7 cannot be factored into smaller integers and is therefore a prime number.

Some prominent mathematicians proposed elaborate designs for universal computers that would operate by following very complicated mathematical rules. It seemed overwhelmingly difficult to build such machines. It took the genius of Turing to show that a very simple machine could in fact compute all that is computable.

His hypothetical device is now known as a Turing machine. The centerpiece of the machine is a strip of tape, divided into individual boxes. Each box contains a symbol (such as A, C, T, G for the letters of genetic code) or a blank space. The strip of tape is analogous to today's hard drives that store bits of data. Initially, the string of symbols on the tape corresponds to the input, containing the data for the problem to be solved. The string also serves as the memory of the computer. The Turing machine writes onto the tape data that it needs to access later in the computation.

Credit: NIST

The device reads an individual symbol on the tape and follows instructions on whether to change the symbol or leave it alone before moving to another symbol. The instructions depend on the current state of the machine. For example, if the machine needs to decide whether the tape contains the text string "TC" it can scan the tape in the forward direction while switching among the states "previous letter was T" and "previous letter was not T." If while in state "previous letter was T" it reads a C, it goes to a state "found it" and halts. If it encounters the blank symbol at the end of the input, it goes to the state "did not find it" and halts. Nowadays we would recognize the set of instructions as the machine's program.
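The "TC" scanner is simple enough to write down in full. Here is a minimal rendering (ours) of that state machine; a real Turing machine would also be able to write to the tape and move in both directions, which this example does not need:

```python
def contains_tc(tape: str) -> bool:
    state = "previous letter was not T"
    for symbol in tape:
        if state == "previous letter was T" and symbol == "C":
            return True                       # state "found it": halt
        state = ("previous letter was T" if symbol == "T"
                 else "previous letter was not T")
    return False                              # reached the blank end: "did not find it"

print(contains_tc("GATTC"))  # True: a T immediately followed by a C
print(contains_tc("CAT"))    # False
```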

It took some time, but eventually it became clear to everyone that Turing was right: The Turing machine could indeed compute all that seemed computable. No number of additions or extensions to this machine could extend its computing capability.

To understand what can be computed it is helpful to identify what cannot be computed. In a previous life as a university professor I had to teach programming a few times. Students often encounter the following problem: "My program has been running for a long time; is it stuck?" This is called the Halting Problem, and students often wondered why we simply couldn't detect infinite loops without actually getting stuck in them. It turns out a program to do this is an impossibility. Turing showed that there does not exist a machine that detects whether or not another machine halts. From this seminal result followed many other impossibility results. For example, logicians and philosophers had to abandon the dream of an automated way of detecting whether an assertion (such as whether there are infinitely many prime numbers) is true or false, as that is uncomputable. If you could do this, then you could solve the Halting Problem simply by asking whether the statement "this machine halts" is true or false.
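Turing's argument can be compressed into a few lines of deliberately self-defeating code. This sketch (ours) assumes a hypothetical oracle halts(program, data) and derives the contradiction:

```python
def halts(program, data):
    """Hypothetical oracle that decides whether program(data) halts.
    Turing showed no such total procedure can exist."""
    raise NotImplementedError("no such procedure exists")

def paradox(program):
    if halts(program, program):  # if the oracle says "halts"...
        while True:              # ...loop forever instead
            pass
    # ...and if the oracle says "loops forever", halt immediately.

# Does paradox(paradox) halt? If halts() answers yes, paradox loops; if it
# answers no, paradox halts. Either answer is wrong, so halts() is impossible.
```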

Turing went on to make fundamental contributions to AI, theoretical biology and cryptography. His involvement with this last subject brought him honor and fame during World War II, when he played a very important role in adapting and extending cryptanalytic techniques invented by Polish mathematicians. This work broke the German Enigma machine encryption, making a significant contribution to the war effort.

Turing was gay. After the war, in 1952, the British government convicted him for having sex with a man. He stayed out of jail only by submitting to what is now called chemical castration. He died in 1954 at age 41 by cyanide poisoning, which was initially ruled a suicide but may have been an accident according to subsequent analysis. More than 50 years would pass before the British government apologized and pardoned him (after years of campaigning by scientists around the world). Today, the highest honor in computer sciences is called the Turing Award.

Turing's computability work provided the foundation for modern complexity theory. This theory tries to answer the question "Among those problems that can be solved by a computer, which ones can be solved efficiently?" Here, "efficiently" means not in billions of years but in milliseconds, seconds, hours or days, depending on the computational problem.

For example, much of the cryptography that currently safeguards our data and communications relies on the belief that certain problems, such as decomposing an integer number into its prime factors, cannot be solved before the Sun turns into a red giant and consumes the Earth (currently forecast for 4 billion to 5 billion years). NIST is responsible for cryptographic standards that are used throughout the world. We could not do this work without complexity theory.
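The factoring example from earlier makes the point concrete. Trial division (our sketch) finds the factors of small numbers instantly, but its running time grows with the square root of N, which for the 600-plus-digit numbers used in practice is far beyond any classical computer's reach:

```python
def trial_division(n: int):
    """Return a nontrivial factorization of n, or None if n is prime."""
    d = 2
    while d * d <= n:   # on the order of sqrt(n) steps in the worst case
        if n % d == 0:
            return d, n // d
        d += 1
    return None

print(trial_division(6))  # (2, 3), as in the article's example
print(trial_division(7))  # None: 7 is prime
```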

Technology sometimes throws us a curve, such as the discovery that if a sufficiently big and reliable quantum computer is built it would be able to factor integers, thus breaking some of our cryptography. In this situation, NIST scientists must rely on the world's experts (many of them in-house) in order to update our standards. There are deep reasons to believe that quantum computers will not be able to break the cryptography that NIST is about to roll out. Among these reasons is that Turing's machine can simulate quantum computers. This implies that complexity theory gives us limits on what a powerful quantum computer can do.

But that is a topic for another day. For now, we can celebrate how Turing provided the keys to much of today's computing technology and even gave us hints on how to solve looming technological problems.


The big money is here: The arms race to quantum computing – Haaretz

There's a major controversy raging in the field of quantum computing. One side consists of experts and researchers who are skeptical of quantum computers' ability to be beneficial in the foreseeable future, simply because the physical and technological challenges are too great. On the other side, if you ask the entrepreneurs and investors at firms banking on quantum computing, that hasn't been the issue for quite some time. From their standpoint, it's only a matter of time and concerted effort until the major breakthrough and the real revolution in the field is achieved. And they're prepared to gamble a lot of money on that.

For decades, most of the quantum research and development has been carried out by academic institutions and government research institutes, but in recent years, steps to make the transition from the academic lab to the industrial sector have increased. Researchers and scientists have been creating or joining companies developing quantum computing technology, and startups in the field have been cropping up at a dizzying pace. In 2021, $3.2 billion was invested in quantum firms around the world, according to The Quantum Insider, compared to $900 million in 2020.

And in the first quarter of this year, about $700 million was invested, a sum similar to the investments in the field between 2015 and 2019 combined. In addition to the surge in startup activity, tech giants such as IBM, Amazon, Google and Microsoft have been investing major resources in the field and recruiting experts as well.

"The quantum computing field was academic for a long time, and everything changed the moment that big money reached industry," said Ayal Itzkovitz, managing partner at the Pitango First fund, which has invested in several quantum companies in recent years. "Everything is moving forward more quickly. If three years ago we didn't know if it was altogether possible to build such a computer, now we already know that there will be quantum computers that will be able to do something different from classic computers."

Quantum computers, which are based on the principles of quantum theory, are aimed at providing vastly greater computing power than regular computers, with the capability to carry out a huge number of computations simultaneously. Theoretically, it should take them seconds, minutes or hours to do what would take today's regular supercomputers thousands of years to perform.

Quantum computers are based not on bits but on qubits produced by a quantum processing unit; a qubit is not limited to the binary 0 or 1 but can be a combination of the two. The idea is that a workable quantum computer, if and when there is such a thing, won't be suitable for every task but rather for a set of specific problems that require simultaneous computing, such as simulations. It would be relevant for fields such as chemistry, pharmaceuticals, finance, energy and encryption, among others.
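As a rough illustration of that combination of the two, here is a minimal sketch in plain Python (a pedagogical model, not any vendor's API): a qubit's state is a pair of complex amplitudes whose squared magnitudes give measurement probabilities, and the amplitude count doubles with every added qubit, which is where the intuition of simultaneous computation comes from.

```python
# Minimal state-vector picture of a qubit (illustrative, not a vendor API).
import numpy as np

ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# An equal superposition of |0> and |1>: "a combination of the two"
psi = (ket0 + ket1) / np.sqrt(2)

# Measurement probabilities are the squared magnitudes of the amplitudes
print(np.abs(psi) ** 2)  # [0.5 0.5]

# n qubits require 2**n amplitudes; a 300-qubit state would need more
# numbers than there are atoms in the observable universe.
print(2 ** 300)
```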

It's still all theoretical, and there has yet to be a working quantum computer capable of performing a task more effectively than a regular computer, but that doesn't bother those engaged in the arms race to develop a breakthrough quantum processor.

A million-qubit computer

IBM, which is one of the pioneers in the industry, recently unveiled a particularly large 127-qubit computer, and it's promising to produce a 1,000-qubit one within the next few years. In 2019, Google claimed quantum supremacy with a computer that managed in 3.5 minutes to perform a task that would have taken a regular computer 10,000 years to carry out. And in May of last year, it unveiled a new quantum center in Santa Barbara, California, and said it intends to build a million-qubit computer by 2029 at an investment of billions of dollars.

Amazon has gotten into the field, recruiting researchers and recently launching a new quantum center at the California Institute of Technology, and Intel and Microsoft have also gotten into the game. In addition to their own internal development efforts, Amazon, Microsoft and Google have been offering researchers access to active quantum computers via their cloud computing services.

At the same time, several firms in the market that specialize in quantum computing have already raised considerable sums or have even gone public. One of the most prominent is the American company IonQ (which in the past attracted investments from Google, Amazon and Samsung), which went public last year via a SPAC merger. Another is the Silicon Valley firm Rigetti Computing, which also went public via a SPAC merger. Then there's Quantinuum, the product of a merger between Honeywell Quantum Solutions and Cambridge Quantum.

All that's in addition to a growing startup ecosystem of smaller companies such as Atom Computing and QuEra, which have raised initial funding to develop their own versions of a quantum processor.

In Israel in recent months, the country's first two startups trying to create a quantum processor have been established. They're still in stealth mode. One is Rehovot-based Quantum Source, which has raised $15 million to develop photonic quantum computing solutions. Its technology is based on research at the Weizmann Institute of Science, and it's headed by leading people in the Israeli processor chip sector. The second is Quantum Art, whose executives came from the Israeli defense sector. Its technology is also based on work at the Weizmann Institute.

There are also other early-stage ventures seeking to develop a quantum processor, including one created by former Intel employees and another by people from a defense company. Then there is LightSolver, which is developing a laser-based computer that is not quantum technology but aims to provide similar performance.

Going for broke

But all of these are at an early stage technologically, and the prominent companies overseas have built, or are building, active but small quantum computers, usually of dozens of qubits, that are only for R&D use, demonstrating capabilities without actual practical application. That reflects a sense that developing an effective quantum computer with a real advantage requires millions of qubits, a major disparity that will be difficult to bridge from a technological standpoint.

The problem is that sometimes investing in the here and now comes at the expense of investing in the future. The quantum companies are still relatively small and have limited staff. If they have an active computer, they also need to maintain it and support its users in the community and among researchers. That requires major effort and a lot of money, which may come at the expense of next-generation research, and it is already delaying the work of a large number of quantum computer manufacturers, who are watching smaller startups that focus only on next-generation development get ahead of them.

As a result, there are also companies with an entirely different approach, which seek to skip over the current generation of quantum computers and go for broke to build an effective computer with millions of qubits capable of error detection and correction, even if it takes many years.

In 2016, it was on that basis that the Palo Alto, California, firm PsiQuantum was founded. Last year the company raised $450 million (in part from Microsoft and BlackRock) at a valuation of $3 billion, becoming one of the hot and promising names in the field.

Itzkovitz, from the Pitango fund, was one of its early investors. They said they wouldn't make a small computer with a few qubits because it would delay them, but would instead go straight for the real goal, he explained.

PsiQuantum is gambling on a fundamentally different paradigm: most of the companies building an active computer, including the tech giants, have chosen technology based on specific material platforms (for example, superconductors or trapped ions). In contrast, PsiQuantum is building a photonic quantum computer, based on light and optics, an approach that until recently was considered physically impossible.

Itzkovitz said that he has encountered a large number of startups building quantum processors despite the technological risk and the huge difficulty involved. In the past two weeks, I have spoken with 12 or 13 companies making qubits, from England, the Netherlands, Finland, the United States and Canada, as if this were the most popular thing in the high-tech industry around the world, he said.

As a result, venture capital funds in Israel and overseas that had not previously entered the field are now looking for such companies to invest in, out of concern about being left out of the race as well as a desire for exposure to the quantum field.

It's the Holy Grail

As in the regular computing industry, in quantum computing it's not enough to build a processor. A quantum processor is a highly complex system that requires a collection of additional hardware components, as well as software and supporting algorithms, all of which are designed to let its core function efficiently and to take advantage of the ability and potential of qubits in the real world. Therefore, alongside the quantum processor manufacturers, recent years have seen a growing industry of startups seeking to provide them and their clients with the layers of hardware and software in the stack that rests on the quantum computer's processor.

A good example of that is the Israeli firm Quantum Machines, which was established in 2018 and has so far raised $75 million. It has developed a monitoring and control system for quantum computers consisting of hardware and software. According to the company, the system constitutes the brain of the quantum processor and enables it to perform computations well and fulfill its potential. There are also other companies in the market supplying such systems and other components, including even the refrigerators necessary to build the computers.

Some companies develop software and algorithms in the hope that they will be needed to operate the computers effectively. One of them is Israel's Qedma Quantum Computing, which has developed what it describes as an operating system for quantum computers, designed to reduce errors and increase quantum computers' reliability.

Our goal is to provide hardware manufacturers with the tools that will enable them to do something efficient with the quantum computers and to help create a world in which quantum algorithmic advantages can actually be realized, said Asif Sinay, the company's founding partner and CEO. It's the Holy Grail of all of the quantum companies in the world.

The big challenge facing these companies is proving that their technology is genuine and that it provides real value to the companies developing quantum processors. That's, of course, in addition to providing a solution sufficiently unique that the tech giants won't be able to develop it on their own.

The big companies don't throw money around just like that, Sinay said. They want to create cooperation with companies that help them reach their goal and improve the quality of the quantum computer. Unlike the cyber field, for example, you can't come and scare a customer into buying your product. Here you're sitting with people at your level, really smart [people] who understand that you need to give them value that assists the company's performance and takes the computer to a higher level.

Two concurrent arms races

What the companies mentioned so far have in common is that they are building technology designed to create an efficient quantum computer, whether it's the processor or the technology surrounding it. At the same time, another type of company is gaining steam: those that develop the tools for creating quantum software, which in the future will make it possible for developers and firms to build applications for the quantum computer.

Classiq is an Israeli company that has developed tools that make it easier for programmers to write software for quantum computers. It raised $33 million at the beginning of the year and has raised $48 million all told. A competitor in Singapore, Horizon Quantum Computing, which just days ago announced that it had raised $12 million, offers a similar solution.

Another prominent player is the U.S. firm Zapata, in which Israel's Pitango fund has also invested, which provides services for building quantum applications for corporations.

There are two concurrent arms races happening now, says Nir Minerbi, co-founder and CEO of Classiq. One is to build the world's first fully functional quantum computer. Many startups and tech giants are working on that, and that market is now peaking. The second race is the one to create the applications and software that run on quantum computers and can serve these firms. This field is only now taking its first steps, and it's hard to know when it will reach its goal.

Originally posted here:

The big money is here: The arms race to quantum computing - Haaretz

Memristors: Quantum computing breakthrough could take us back to the multiverse – RedShark News

It could be right out of Back to the Future, but a device known as a quantum memristor has been invented, opening up the possibility of building a brain-like supercomputer. Let's call it Orac, Blake's 7 fans.

Detailing the creation of the first prototype of such a device in the journal Nature Photonics (Experimental photonic quantum memristor), scientists say the breakthrough could help combine quantum computing with artificial intelligence and drive the development of quantum neuromorphic computers.

A memristor, or memory resistor, is described as a kind of building block for electronic circuits that scientists predicted roughly 50 years ago but created for the first time only a little more than a decade ago.

These components are essentially electric switches that can remember whether they were toggled on or off after their power is turned off. As such, they resemble synapses, the links between neurons in the human brain, whose electrical conductivity strengthens or weakens depending on how much electrical charge has passed through them in the past.

In theory, memristors can act like artificial neurons capable of both computing and storing data. As such, researchers have suggested that a neuromorphic computer would perform well at running neural networks, which are machine-learning systems that use synthetic versions of synapses and neurons to mimic the process of learning in the human brain.
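For intuition, here is a toy classical memristor model (a textbook-style simplification assumed for illustration, not the photonic device in the study): conductance is a function of the charge that has already flowed through the component, so its response to a voltage depends on its history, much like a synapse.

```python
# Toy memristor: conductance depends on accumulated charge (its "memory").
class ToyMemristor:
    def __init__(self, g_min: float = 0.1, g_max: float = 1.0):
        self.charge = 0.0                  # history of current flow
        self.g_min, self.g_max = g_min, g_max

    def conductance(self) -> float:
        # Conductance rises with accumulated charge, saturating at g_max
        return min(self.g_max, self.g_min + 0.01 * abs(self.charge))

    def apply_voltage(self, v: float, dt: float = 1.0) -> float:
        i = self.conductance() * v         # Ohm's law, history-dependent
        self.charge += i * dt              # past current reshapes the "synapse"
        return i

m = ToyMemristor()
for _ in range(5):
    m.apply_voltage(1.0)
print(m.conductance())  # higher than the initial 0.1: the device remembers
```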

Using computer simulations, the researchers suggest quantum memristors could lead to an exponential growth in performance in a machine-learning approach known as reservoir computing, which excels at learning quickly.

Potentially, quantum reservoir computing may have a quantum advantage over classical reservoir computing, says study lead author Michele Spagnolo, a doctoral student in quantum physics at the University of Vienna.

The advantage of using a quantum memristor in quantum machine learning is the fact that the memristor, unlike any other quantum component, has memory, he adds.

Among the more profound benefits of quantum computers would be simulating quantum physical processes for much faster drug and material design, accelerating AI development, and providing new levels of security in information and communication. But they could also be used to break public-key encryption, to amplify current AI risks at a faster pace, or be misused in biotechnology to design bio-weapons, among other risks.

We now live in a Wright brothers moment in the history of quantum computing, Ibrahim Almosallam, a consultant for the Saudi Information Technology Company, writes at World Economic Review. When a commercial jet version arrives, it will deliver a new leap in information technology similar to what classical computation delivered in the 20th century. And, just as with any general-purpose technology such as the internet, electricity and, for that matter, fire, alongside great benefits come great risks.

Then there's more prosaic stuff, like a super-AI creating the latest Pixar feature. This is where quantum can turbo-charge machine learning, improving the ability of AI to derive useful information from photos and videos, according to a recent report in the Harvard Business Review, Quantum Computing for Business Leaders (hbr.org).

However, building and scaling a stable quantum computer is not easy. Photons and electrons are delicate; their behaviour defies our ingrained view of how the physical world operates, says HBR.

One of the most formidable obstacles to building functional quantum computers is that qubits don't stick around very long, the article elaborates. Vibration, temperature, and other environmental factors can cause them to lose their quantum-mechanical properties, resulting in errors. Today, the rate at which errors occur in qubits limits the duration of the algorithms that can be run.

Scientists are working to build environments in which many physical qubits act together to create error-protected logical qubits, which can survive for much longer periods of time, long enough to support commercially viable applications.

Still, the most advanced quantum computers today have 50 to 100 physical qubits; it will most likely take around ten times that to make a single error-protected logical qubit.
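A back-of-the-envelope calculation, using the article's ratios as assumptions rather than a rigorous error-correction model, makes the gap concrete:

```python
# If one error-protected logical qubit needs ~10x today's largest machines
# (roughly 1,000 physical qubits), even a million-qubit computer yields
# only about 1,000 logical qubits.
physical_per_logical = 1_000

for n in (100, 127, 1_000, 1_000_000):
    print(f"{n:>9,} physical qubits -> ~{n // physical_per_logical:,} logical qubits")
```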

It is the state of flux (known as superposition) in which photons exist that causes the inherent instability of quantum systems. Superposition means they can essentially be located in two or more places at once (or spin in two opposite directions at the same time).

The breakthrough quantum memristor in the new study, as outlined by IEEE Spectrum, relies on a stream of photons existing in superposition, where each single photon can travel down two separate paths laser-written onto glass. One of the channels in this single-qubit integrated photonic circuit is used to measure the flow of these photons, and this data, through a complex electronic feedback scheme, controls the transmission on the other path, resulting in the device behaving like a memristor.

In other words, while memristive behavior and quantum effects are not expected to coexist, the researchers appear to have overcome this apparent contradiction by engineering the interactions within their device to be strong enough to enable memristivity but weak enough to preserve quantum behaviour.

Taking another leap into the theoretical, this could also have implications for our understanding of what it means to live in the multiverse.

Stay with me here. Yes, the multiverse is currently in vogue among storytellers as a means to spin more canon fodder out of tired IP franchises. Looking at you, Marvel, and your upcoming Doctor Strange in the Multiverse of Madness. Even season 2 of the Netflix comedy Russian Doll loops its protagonists back to 1982 and riffs on Back to the Future.

The multiverse, as depicted in the movies, is a world full of endless potential: multiple parallel universes spinning in synchronicity, and the possibility of alternate, powerful, seemingly better versions of ourselves.

At Vox, a mathematical physicist at the California Institute of Technology says this is possible in theory.

Spyridon Michalakis is no random boffin: I'm the science consultant for Ant-Man and I introduced the quantum realm [to Marvel], he explains.

Having established his credentials, Michalakis then explains that basically the multiverse is grounded in quantum mechanics.

Space and time are one single, singular construct, he explains in a 101 of Einstein's theory. It's not like you have space and then time; it's space X time. Moreover, quantum spacetime is a superposition: a quantum superposition of an infinite number of spacetimes, all happening at the same time.

That word again: superposition.

This illusion, basic physical reality, stems from the fact that human beings have very specific points of view, particular ways of observing the superposition.

He makes this startling observation by mixing science with a cinematic metaphor.

The frame rate of the human mind is so low relative to the frame rate of the universe, he says. Let's say we only perceive 100 frames per second. We can be aware of our lives and the choices we make, but then the frame rate of the universe (where you could be flicking between different timelines) is 40 orders of magnitude above that.

We're all trying to figure out the plot of the universe by just watching the beginning and the end of the movie, the first and last frame. We're just reconstructing the in-between as best we can. That's where the multiverse hides; it hides there in between frames. Honestly, I think that the frame rate of the universe truly is infinite, not even finite and very, very large. And we're so far away from that.

So that means we're stuck observing just one reality, not the multiplicity of them. But we could, if only we had a brain the size of a planet.

If only we could build one

See the original post here:

Memristors: Quantum computing breakthrough could take us back to the multiverse - RedShark News

Global Quantum Computing Market Assessment 2022-2027: Growing Adoption in Aerospace and Defense, Growing investment of Governments, & Emergence of…

DUBLIN, April 27, 2022 /PRNewswire/ -- The "Quantum Computing Market by Technology, Infrastructure, Services, and Industry Verticals 2022 - 2027" report has been added to ResearchAndMarkets.com's offering.


This report assesses the technology, companies/organizations, R&D efforts, and potential solutions facilitated by quantum computing.

The report provides global and regional forecasts as well as the outlook for quantum computing impact on infrastructure including hardware, software, applications, and services from 2022 to 2027. This includes the quantum computing market across major industry verticals.

Quantum Computing Industry Impact

The implications for data processing, communications, digital commerce and security, and the internet as a whole cannot be overstated as quantum computing is poised to radically transform the ICT sector. In addition, quantum computing will disrupt entire industries ranging from government and defense to logistics and manufacturing. No industry vertical will be immune to the potential impact of quantum computing. Every industry must pay great attention to technology developments, implementation, integration, and market impacts.

Quantum Computing Technology Development

While there is great promise for quantum computing, it remains largely in the research and development (R&D) stage as companies, universities, and research organizations seek to solve some of the practical problems for commercialization such as how to keep a qubit stable. The stability problem is due to molecules always being in motion, even if that motion is merely a small vibration. When qubits are disturbed, a condition referred to as decoherence occurs, rendering computing results unpredictable or even useless. One of the potential solutions is to use super-cooling methods such as cryogenics.

Some say there is a need to reach absolute zero (the temperature at which all molecular motion ceases), but that is a theoretical temperature that is practically impossible to reach and maintain, requiring enormous amounts of energy. There are some room-temperature quantum computers in R&D using photonic qubits, but nothing is yet scalable. Some experts say that if the qubit energy level is high enough, cryogenic type cooling is not a requirement.

Alternatives include ion trap quantum computing and other methods to achieve super-cooled, small-scale, demonstration-level computing platforms. There are additional issues involved in implementing and operating quantum computing. In terms of maintenance, quantum systems must be kept at subzero temperatures to keep the qubits stable, which creates difficulties for the people working with them and requires expensive, energy-consuming support equipment.
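A simplified sketch shows why decoherence translates into a limit on how long an algorithm can run (assuming, purely for illustration, independent errors per operation):

```python
# Survival probability of a computation of depth d with per-operation
# error rate p is roughly (1 - p) ** d under the independence assumption.
p = 0.001  # an optimistic per-gate error rate for today's hardware
for depth in (100, 1_000, 10_000):
    print(f"depth {depth:>6}: success probability ~{(1 - p) ** depth:.3f}")
# ~0.905 at depth 100, ~0.368 at 1,000, ~0.000 at 10,000 -- hence the push
# for error-corrected logical qubits.
```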


Once these issues are overcome, we anticipate that quantum computing will become more mainstream for solving specific types of problems. However, there will remain general-purpose computing problems that must be solved with classical computing. In fact, we anticipate development of solutions that involve quantum and classical CPUs on the same computing platform, which will be capable of solving combined general purpose and use case-specific computation problems.

These next-generation computing systems will provide the best of both worlds, which will be high-speed, general-purpose computing combined with use case-specific ultra-performance for certain tasks that will remain outside the range of binary computation for the foreseeable future.

Select Report Findings:

The global market for QC hardware will exceed $8.3 billion by 2027

Leading application areas are simulation, optimization, and sampling

Managed services will reach $298 million by 2027, with a CAGR of 43.9%

Key professional services will be deployment, maintenance, and consulting

QC based on superconducting (cooling) loops tech will reach $3.7B by 2027

Fastest growing industry verticals will be government, energy, and transportation

Key Topics Covered:

1.0 Executive Summary

2.0 Introduction
2.1 Understanding Quantum Computing
2.2 Quantum Computer Types
2.2.1 Quantum Annealer
2.2.2 Analog Quantum
2.2.3 Universal Quantum
2.3 Quantum Computing vs. Classical Computing
2.3.1 Will Quantum replace Classical Computing?
2.3.2 Physical Qubits vs. Logical Qubits
2.4 Quantum Computing Development Timeline
2.5 Quantum Computing Market Factors
2.6 Quantum Computing Development Progress
2.6.1 Increasing the Number of Qubits
2.6.2 Developing New Types of Qubits
2.7 Quantum Computing Patent Analysis
2.8 Quantum Computing Regulatory Analysis
2.9 Quantum Computing Disruption and Company Readiness

3.0 Technology and Market Analysis
3.1 Quantum Computing State of the Industry
3.2 Quantum Computing Technology Stack
3.3 Quantum Computing and Artificial Intelligence
3.4 Quantum Neurons
3.5 Quantum Computing and Big Data
3.6 Linear Optical Quantum Computing
3.7 Quantum Computing Business Model
3.8 Quantum Software Platform
3.9 Application Areas
3.10 Emerging Revenue Sectors
3.11 Quantum Computing Investment Analysis
3.12 Quantum Computing Initiatives by Country

4.0 Quantum Computing Drivers and Challenges
4.1 Quantum Computing Market Dynamics
4.2 Quantum Computing Market Drivers
4.2.1 Growing Adoption in Aerospace and Defense Sectors
4.2.2 Growing investment of Governments
4.2.3 Emergence of Advance Applications
4.3 Quantum Computing Market Challenges

5.0 Quantum Computing Use Cases
5.1 Quantum Computing in Pharmaceuticals
5.2 Applying Quantum Technology to Financial Problems
5.3 Accelerate Autonomous Vehicles with Quantum AI
5.4 Car Manufacturers using Quantum Computing
5.5 Accelerating Advanced Computing for NASA Missions

6.0 Quantum Computing Value Chain Analysis
6.1 Quantum Computing Value Chain Structure
6.2 Quantum Computing Competitive Analysis
6.2.1 Leading Vendor Efforts
6.2.2 Start-up Companies
6.2.3 Government Initiatives
6.2.4 University Initiatives
6.2.5 Venture Capital Investments
6.3 Large Scale Computing Systems

7.0 Company Analysis
7.1 D-Wave Systems Inc.
7.2 Google Inc.
7.3 Microsoft Corporation
7.4 IBM Corporation
7.5 Intel Corporation
7.6 Nokia Corporation
7.7 Toshiba Corporation
7.8 Raytheon Company
7.9 Other Companies
7.9.1 1QB Information Technologies Inc.
7.9.2 Cambridge Quantum Computing Ltd.
7.9.3 QC Ware Corp.
7.9.4 MagiQ Technologies Inc.
7.9.5 Rigetti Computing
7.9.6 Anyon Systems Inc.
7.9.7 Quantum Circuits Inc.
7.9.8 Hewlett Packard Enterprise
7.9.9 Fujitsu Ltd.
7.9.10 NEC Corporation
7.9.11 SK Telecom
7.9.12 Lockheed Martin Corporation
7.9.13 NTT Docomo Inc.
7.9.14 Alibaba Group Holding Limited
7.9.15 Booz Allen Hamilton Inc.
7.9.16 Airbus Group
7.9.17 Amgen Inc.
7.9.18 Biogen Inc.
7.9.19 BT Group
7.9.20 Mitsubishi Electric Corp.
7.9.21 Volkswagen AG
7.9.22 KPN
7.10 Ecosystem Contributors
7.10.1 Agilent Technologies
7.10.2 Artiste-qb.net
7.10.3 Avago Technologies
7.10.4 Ciena Corporation
7.10.5 Eagle Power Technologies Inc
7.10.6 Emcore Corporation
7.10.7 Enablence Technologies
7.10.8 Entanglement Partners
7.10.9 Fathom Computing
7.10.10 Alpine Quantum Technologies GmbH
7.10.11 Atom Computing
7.10.12 Black Brane Systems
7.10.13 Delft Circuits
7.10.14 EeroQ
7.10.15 Everettian Technologies
7.10.16 EvolutionQ
7.10.17 H-Bar Consultants
7.10.18 Horizon Quantum Computing
7.10.19 ID Quantique
7.10.20 InfiniQuant
7.10.21 IonQ
7.10.22 ISARA
7.10.23 KETS Quantum Security
7.10.24 Magiq
7.10.25 MDR Corporation
7.10.26 Nordic Quantum Computing Group
7.10.27 Oxford Quantum Circuits
7.10.28 Post-Quantum (PQ Solutions)
7.10.29 ProteinQure
7.10.30 PsiQuantum
7.10.31 Q&I
7.10.32 Qasky
7.10.33 QbitLogic
7.10.34 Q-Ctrl
7.10.35 Qilimanjaro Quantum Hub
7.10.36 Qindom
7.10.37 Qnami
7.10.38 QSpice Labs
7.10.39 Qu & Co
7.10.40 Quandela
7.10.41 Quantika
7.10.42 Quantum Benchmark Inc.
7.10.43 Quantum Circuits Inc.
7.10.44 Quantum Factory GmbH
7.10.45 QuantumCTek
7.10.46 Quantum Motion Technologies
7.10.47 QuantumX
7.10.48 Qubitekk
7.10.49 Qubitera LLC
7.10.50 Quintessence Labs
7.10.51 Qulab
7.10.52 Qunnect
7.10.53 QuNu Labs
7.10.54 River Lane Research
7.10.55 SeeQC
7.10.56 Silicon Quantum Computing
7.10.57 Sparrow Quantum
7.10.58 Strangeworks
7.10.59 Tokyo Quantum Computing
7.10.60 TundraSystems Global Ltd.
7.10.61 Turing
7.10.62 Xanadu
7.10.63 Zapata Computing
7.10.64 Accenture
7.10.65 Atos Quantum
7.10.66 Baidu
7.10.67 Northrop Grumman
7.10.68 Quantum Computing Inc.
7.10.69 Keysight Technologies
7.10.70 Nano-Meta Technologies
7.10.71 Optalysys Ltd.

8.0 Quantum Computing Market Analysis and Forecasts 2022 - 2027
8.1.1 Quantum Computing Market by Infrastructure
8.1.2 Quantum Computing Market by Technology Segment
8.1.3 Quantum Computing Market by Industry Vertical
8.1.4 Quantum Computing Market by Region

9.0 Conclusions and Recommendations

10.0 Appendix: Quantum Computing and Classical HPC

For more information about this report visit https://www.researchandmarkets.com/r/6yf53

Media Contact:

Research and Markets
Laura Wood, Senior Manager
press@researchandmarkets.com

For E.S.T Office Hours Call +1-917-300-0470
For U.S./CAN Toll Free Call +1-800-526-8630
For GMT Office Hours Call +353-1-416-8900

U.S. Fax: 646-607-1904
Fax (outside U.S.): +353-1-481-1716


View original content:https://www.prnewswire.com/news-releases/global-quantum-computing-market-assessment-2022-2027-growing-adoption-in-aerospace-and-defense-growing-investment-of-governments--emergence-of-advance-applications-301534128.html

SOURCE Research and Markets

Excerpt from:

Global Quantum Computing Market Assessment 2022-2027: Growing Adoption in Aerospace and Defense, Growing investment of Governments, & Emergence of...

These ultra-pure diamonds could be the key to unleashing the power of quantum computing – TechRadar

By working together with academic researchers, a Japanese jewelry firm has developed a new production method to create 2-inch diamond wafers that could soon be used in quantum computers.

Adamant Namiki Precision Jewelry collaborated with Saga University in Kyushu to create its new Kenzan Diamond wafers, which are pure enough to be used in quantum computing. While diamond wafers with the required purity do exist, until now they were too small (no larger than a 4mm square) to be used in quantum computing applications.

According to a press release put out by Adamant Namiki, previous attempts to grow 2-inch diamond wafers failed because they had higher levels of nitrogen impurities. Fortunately, the Japanese jewelry firm has developed a new technique that makes it possible to grow large diamond wafers with fewer impurities.

Instead of using diamond micro-needle seeding, Adamant Namiki and Saga University's new technique grows diamond wafers on a sapphire substrate coated with an iridium film, using the principle of step-flow growth. The substrate and the stepped structure allow diamonds to be grown at high temperature and pressure without stress cracks forming during cool-down, while also minimizing the absorption of nitrogen.

While traditional computers use processors made from silicon chips, researchers have begun experimenting with diamonds as a substitute for silicon as they are the hardest material on Earth and also a good conductor of heat.

In this case though, Adamant Namiki's Kenzan Diamonds could be used for quantum storage applications due to their size and low-nitrogen nature. By using one of the firm's new diamond wafers for quantum storage, up to a billion Blu-ray discs' worth of data could be stored on an incredibly small 2-inch form factor.

Although Adamant Namiki has announced its plans to make its Kenzan Diamond wafers commercially available next year, the firm has already begun working on developing 4-inch diamond wafers that could hold even more data.

At a time when organizations are returning to tape storage to help fend off ransomware attacks, it's interesting to see new materials like diamonds being considered for the storage needs of the future.

Via Tom's Hardware

Originally posted here:

These ultra-pure diamonds could be the key to unleashing the power of quantum computing - TechRadar

Earth Day 2022: Quantum Computing has the Key to Protect Environment! – Analytics Insight

Can quantum computing hold the ultimate power to meet sustainable development?

Quantum computing has started gaining popularity with the integration of quantum mechanics into smart quantum computers. Yes, it can transform conventional computing despite its highly complex nature. Meanwhile, quantum computing is ready to hold the key to protecting the environment with technology. Let's celebrate Earth Day 2022 with sustainable development through quantum computing. Quantum computers hold substantial potential to save the environment using technology and the laws of physics. Thus, let's dig deeper into quantum computing to look at the ways it holds the key to protecting the environment.

Earth Day 2022 is celebrated across the world to raise awareness of environmental issues among human beings. It helps generate ideas to reduce the carbon footprint and energy consumption for effective sustainable development. Hence, quantum computing is determined to be a protector of the environment, using technology to pursue sustainable development efficiently and effectively.

Quantum computers are a form of supercomputer, with thousands of GPU and CPU cores, aimed at problems with a high degree of complexity. They are used for performing quantum calculations with qubits, simulating problems that human beings or classical computers cannot solve within a short period of time.

Now, in the 21st century, with advancements in technology, quantum computing can power sustainable development with smart functionalities. Quantum computers can protect the environment with technology by capturing carbon as well as fighting the climate change driving global warming.

Quantum computing can simulate large, complicated molecules, which can lead to the discovery of new catalysts for capturing sufficient carbon from the environment. Room-temperature superconductors hold the key to recovering the roughly 10% of energy production that is lost in transmission. This will help enable better processes to feed the increasing population, as well as more efficient batteries.

Quantum computing is set to address global challenges, raise awareness, generate solutions, and help meet sustainable development goals on Earth Day 2022. Quantum computers are transforming illusion into reality with better climate models to protect the environment with technology. They are ready to provide in-depth insights into how human activities are drastically affecting the environment and creating a barrier to sustainable development.

Multiple 200-qubit quantum computers could help find catalysts relevant to the 3-5% of the world's gas production, as well as the 1-2% of annual energy, consumed by multiple different tasks. They could be used to generate different catalysts for capturing the carbon footprint from the air and decreasing carbon emissions by 80%-90%. Thus, quantum computing can help control the rapid rise in temperature in the environment.

That being said, let's celebrate Earth Day 2022 with quantum computing helping the world ensure carbon dioxide recycling and reduce harmful emissions of carbon monoxide.



Read more:

Earth Day 2022: Quantum Computing has the Key to Protect Environment! - Analytics Insight

Keysight and Singapore's Quantum Engineering Programme to Accelerate Research, Development and Education in Quantum Technologies – Yahoo Finance

Joint effort will establish quantum innovation accelerator in Singapore

SANTA ROSA, Calif., April 27, 2022--(BUSINESS WIRE)--Keysight Technologies, Inc. (NYSE: KEYS), a leading provider of advanced design and validation solutions, and Singapore's Quantum Engineering Programme (QEP) have signed a Memorandum of Understanding (MOU) to collaborate in accelerating research, development and education in quantum technologies.

This press release features multimedia. View the full release here: https://www.businesswire.com/news/home/20220427005691/en/

National University of Singapore (NUS), Quantum Engineering Programme (QEP) and Keysight MOU signing ceremony. From left to right; Dr. Chen Guan Yow, Vice President and Head (New Businesses), Economic Development Board; Mr. Quek Gim Pew, Co-chair, QEP Steering Committee & Senior R&D Consultant for Ministry of Defence; Professor Chen Tsuhan, Deputy President (Research and Technology), NUS; Mr. Oh Sang Ho, Director of Keysight South Asia Pacific Regional Sales; Mr. Gooi Soon Chai, President of Keysight Order Fulfilment and Digital Operations & Keysight Senior Vice President; and Mr. Tan Boon Juan, Vice President & General Manager of Keysight General Electronics Measurement Solutions. (Photo: Business Wire)

The QEP was launched in 2018 by the National Research Foundation, Singapore (NRF) and hosted at the National University of Singapore (NUS), with the aim of supporting quantum technologies research and ecosystem building. The programme funds projects in quantum computing, quantum communication and security, quantum sensing, as well as a quantum foundry, that are expected to lead to practical uses.

Keysight is well positioned to provide modular and scalable quantum control systems, by leveraging the company's expertise in advanced measurement equipment, qubit control solutions and precise measurement instrumentation, which enable researchers to engineer and perhaps scale next-generation systems to harness the power of quantum computing and other quantum devices.


"Its going to take a team effort to deliver on the promise of quantum technologies, whether that is better computing performance or more secure communication. We are glad to have Keysight join the partners of the Quantum Engineering Programme to support this work in Singapore," said Alexander Ling, director of the QEP. He is also an associate professor in the NUS Department of Physics and Principal Investigator at the Centre for Quantum Technologies.

Under the MOU, QEP and Keysight will closely cooperate in the development of quantum instrument packages, as well as the technologies that enable quantum systems to be scalable and deployable. In addition, they will establish a programme named "Quantum Joint Innovation Accelerator" that makes it easy for researchers participating in QEP to access several of Keysight's software design tools and advanced test and measurement equipment. Researchers can apply to evaluate Keysight measurement tools in their laboratories and access equipment hosted at Keysight's premises in Singapore.

"We're pleased to support QEP with quantum test solutions based on our expertise in advanced measurement and quantum engineering technologies," said Sang Ho Oh, general director for South Asia-Pacific at Keysight Technologies. "As the quantum ecosystem continues to build, Keysight will contribute solutions that will enable the Singapore ecosystem to accelerate the research, development and education of quantum technologies."

"Keysight and QEP will establish a collaborative framework to accelerate research and development in the emerging quantum technology ecosystem," said BJ Tan, vice president and general manager of Keysights general electronics measurement solutions. "Having this leading research partnership upstream will open up new frontiers and developments, which will propel industry innovations for years to come."

About the Quantum Engineering Programme (QEP)

The Quantum Engineering Programme (QEP) in Singapore will apply quantum technologies for solving user-defined problems, by funding research and supporting ecosystem building. Its work is focused on four pillars: quantum sensing, quantum communication and security, quantum computing and the establishment of a National Quantum Fabless Foundry. The programme was launched in 2018 by the National Research Foundation, Singapore, and is hosted by the National University of Singapore (NUS). More information is available at qepsg.org.

About National University of Singapore (NUS)

The National University of Singapore (NUS) is Singapore's flagship university, which offers a global approach to education, research and entrepreneurship, with a focus on Asian perspectives and expertise. We have 17 faculties across three campuses in Singapore, with more than 40,000 students from 100 countries enriching our vibrant and diverse campus community. We have also established our NUS Overseas Colleges programme in more than 15 cities around the world.

Our multidisciplinary and real-world approach to education, research and entrepreneurship enables us to work closely with industry, governments and academia to address crucial and complex issues relevant to Asia and the world. Researchers in our faculties, 30 university-level research institutes, research centres of excellence and corporate labs focus on themes that include energy; environmental and urban sustainability; treatment and prevention of diseases; active ageing; advanced materials; risk management and resilience of financial systems; Asian studies; and Smart Nation capabilities such as artificial intelligence, data science, operations research and cybersecurity.

For more information on NUS, please visit https://www.nus.edu.sg/

About Keysight Technologies

Keysight delivers advanced design and validation solutions that help accelerate innovation to connect and secure the world. Keysight's dedication to speed and precision extends to software-driven insights and analytics that bring tomorrow's technology products to market faster across the development lifecycle, in design simulation, prototype validation, automated software testing, manufacturing analysis, and network performance optimization and visibility in enterprise, service provider and cloud environments. Our customers span the worldwide communications and industrial ecosystems, aerospace and defense, automotive, energy, semiconductor and general electronics markets. Keysight generated revenues of $4.9B in fiscal year 2021. For more information about Keysight Technologies (NYSE: KEYS), visit us at http://www.keysight.com.

More information about Keysight's involvement in the emerging technologies of quantum computing can be found at https://www.keysight.com/us/en/solutions/emerging-technologies/quantum-solutions.html.

Additional information about Keysight Technologies is available in the newsroom at https://www.keysight.com/go/news and on Facebook, LinkedIn, Twitter and YouTube.

View source version on businesswire.com: https://www.businesswire.com/news/home/20220427005691/en/

Contacts

QEP CONTACT:
Jenny Hogan
+65 65164302
jenny.hogan@nus.edu.sg

KEYSIGHT TECHNOLOGIES CONTACTS:
Geri Lynne LaCombe, Americas/Europe
+1 303 662 4748
geri_lacombe@keysight.com

Fusako Dohi, Asia
+81 42 660-2162
fusako_dohi@keysight.com

Read the rest here:

Keysight and Singapore's Quantum Engineering Programme to Accelerate Research, Development and Education in Quantum Technologies - Yahoo Finance

Members of Netherland's Delft Quantum Ecosystem Receive €550,000 ($594K USD) in Two R&D Grants – Quantum Computing Report

Members of Netherlands' Delft Quantum Ecosystem Receive €550,000 ($594K USD) in Two R&D Grants

The first grant, in the amount of €350,000, was provided by the Province of South Holland. It was given to a research collaboration between Orange Quantum Systems, Delft Circuits, and Leiden Cryogenics, which is researching the practical application of quantum technology. The second grant, in the amount of €200,000, was provided to the ImpaQT initiative by the Metropolitan Region Rotterdam The Hague and the Province of South Holland. The ImpaQT initiative is working to provide a value chain of components and related services for organizations wishing to build their own quantum computer using components provided by the members of the initiative. Members of the ImpaQT consortium include QuantWare, Demcon, Qu&Co, Orange Quantum Systems, Qblox, and Delft Circuits. Additional information about these grants and the associated programs can be seen in a news release provided by Quantum Delft available here.

April 25, 2022


Read more from the original source:

Members of Netherland's Delft Quantum Ecosystem Receive 550000 ($594K USD) in Two R&D Grants - Quantum Computing Report

Quantum Isn't Armageddon; But Your Horse Has Already Left the Barn – PaymentsJournal

It is true that adversaries are collecting our encrypted data today so they can decrypt it later. In essence, anything sent using PKI (Public Key Infrastructure) today may very well be decrypted when quantum computing becomes available. Our recent report identifies the risk to account numbers and other long-tail data (data that still has high value five years or more into the future). Data you send today using traditional PKI is the horse that has already left the barn.

But this article describes a scary scenario where an adversary's quantum computer hacks the US military's communications and utilizes that advantage to sink the US fleet. That is highly unlikely as long as government agencies follow orders. The US government specifies that AES-128 be used for secret (unclassified) information and AES-256 for top secret (classified) information. While AES-128 can be cracked using quantum computers, one estimate suggests that would take six months of computing time, which would be very expensive. Most estimates indicate that cracking AES-256 would take hundreds of years, but the military is already planning an even safer alternative; it just isn't yet in production (that I am aware of):

Arthur Herman conducted two formidable studies on what a single, successful quantum computing attack would do to both our banking systems and a major cryptocurrency. A single attack on the banking system by a quantum computer would take down Fedwire and cause $2 trillion of damage in a very short period of time. A similar attack on a cryptocurrency like bitcoin would cause a 90 percent drop in price and would start a three-year recession in the United States. Both studies were backed up by econometric models using over 18,000 data points to predict these cascading failures.

Another disastrous effect could be that an attacker with a CRQC could take control of any systems that rely on standard PKI. So, by hacking communications, they would be able to disrupt data flows so that the attacker could take control of a device, crashing it into the ground or even using it against an enemy. Think of the number of autonomous vehicles that we are using both from a civilian and military standpoint. Any autonomous devices such as passenger cars, military drones, ships, planes, and robots could be hacked by a CRQC and shut down or controlled to perform activities not originally intended by the current users or owners.
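For context on the AES figures cited above, the standard back-of-the-envelope argument (a sketch of the usual reasoning, not a security proof) is that Grover's algorithm only halves the effective key length of a symmetric cipher, whereas Shor's algorithm breaks RSA-style public-key cryptography outright:

```python
# Grover's search needs ~2**(k/2) iterations against a k-bit symmetric key.
for key_bits in (128, 256):
    grover_ops = 2 ** (key_bits // 2)
    print(f"AES-{key_bits}: ~2^{key_bits // 2} = {grover_ops:.2e} Grover iterations")

# AES-128: ~2^64 iterations -- costly but conceivable, consistent with the
#          multi-month cracking estimate above
# AES-256: ~2^128 iterations -- far beyond any foreseeable machine
```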

Overview by Tim Sloane, VP, Payments Innovation at Mercator Advisory Group

The rest is here:

Quantum Isn't Armageddon; But Your Horse Has Already Left the Barn - PaymentsJournal

Chip startups using light instead of wires gaining speed and investments – Reuters

April 26 (Reuters) - Computers using light rather than electric currents for processing, only years ago seen as research projects, are gaining traction and startups that have solved the engineering challenge of using photons in chips are getting big funding.

In the latest example, Ayar Labs, a startup developing this technology called silicon photonics, said on Tuesday it had raised $130 million from investors including chip giant Nvidia Corp (NVDA.O).

While the transistor-based silicon chip has increased computing power exponentially over past decades as transistors have reached the width of several atoms, shrinking them further is challenging. Not only is it hard to make something so minuscule, but as they get smaller, signals can bleed between them.


So, Moore's law, which said every two years the density of the transistors on a chip would double and bring down costs, is slowing, pushing the industry to seek new solutions to handle increasingly heavy artificial intelligence computing needs.
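The compounding behind that is simple arithmetic (illustrative only): doubling density every two years means growth of about 2^(years/2), so even a modest slip in cadence opens a large gap within a decade.

```python
# Transistor-density growth under a doubling cadence.
def density_growth(years: float, doubling_period: float = 2.0) -> float:
    return 2 ** (years / doubling_period)

print(density_growth(10))        # 32x in a decade at the classic two-year cadence
print(density_growth(10, 3.0))   # ~10x if doubling slips to every three years
```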

According to data firm PitchBook, last year silicon photonics startups raised over $750 million, doubling from 2020. In 2016 that was about $18 million.

"A.I. is growing like crazy and taking over large parts of the data center," Ayar Labs CEO Charles Wuischpard told Reuters in an interview. "The data movement challenge and the energy consumption in that data movement is a big, big issue."

The challenge is that many large machine-learning algorithms can use hundreds or thousands of chips for computing, and there is a bottleneck on the speed of data transmission between chips or servers using current electrical methods.

Light has been used to transmit data through fiber-optic cables, including undersea cables, for decades, but bringing it to the chip level was hard as devices used for creating light or controlling it have not been as easy to shrink as transistors.

PitchBook's senior emerging technology analyst Brendan Burke expects silicon photonics to become common hardware in data centers by 2025 and estimates the market will reach $3 billion by then, similar to the size of the A.I. graphics chip market in 2020.

Beyond connecting transistor chips, startups using silicon photonics for building quantum computers, supercomputers, and chips for self-driving vehicles are also raising big funds.

PsiQuantum has raised about $665 million so far, although the promise of quantum computers changing the world is still years out.

Lightmatter, which builds processors using light to speed up AI workloads in the datacenter, raised a total of $113 million and will release its chips later this year and test with customers soon after.

Luminous Computing, a startup building an AI supercomputer using silicon photonics backed by Bill Gates, raised a total of $115 million.

It is not just the startups pushing this technology forward. Semiconductor manufacturers are also gearing up to use their silicon chip-making technology for photonics.

GlobalFoundries Head of Computing and Wired Infrastructure Amir Faintuch said collaboration with PsiQuantum, Ayar, and Lightmatter has helped build up a silicon photonics manufacturing platform for others to use. The platform was launched in March.

Peter Barrett, founder of venture capital firm Playground Global, an investor in Ayar Labs and PsiQuantum, believes in the long-term prospects for silicon photonics for speeding up computing, but says it is a long road ahead.

"What the Ayar Labs guys do so well ... is they solved the data interconnect problem for traditional high-performance (computing)," he said. "But it's going to be a while before we have pure digital photonic compute for non-quantum systems."


Reporting by Jane Lanhee Lee; Editing by Stephen Coates


More:

Chip startups using light instead of wires gaining speed and investments - Reuters

Neural’s best quantum computing and physics stories from 2021 – The Next Web

2021 will be remembered for a lot of things, but when it's all said and done, we think it'll eventually be called the year quantum computing finally came into focus.

That's not to say useful quantum computers have actually arrived yet. They're still somewhere between a couple of years and a couple of centuries away. Sorry for being so vague, but when you're dealing with quantum physics there aren't yet many guarantees.

This is because physics is an incredibly complex and challenging field of study. And the difficulty gets cranked up exponentially when you start adding theoretical and quantum to the research.

We're talking about physics at the very edge of reason. Like, for example, imagining a quantum-powered artificial intelligence capable of taking on the Four Horsemen of the Apocalypse.

That might sound pretty wacky, but this story explains why it's not quite as out there as you might think.

But let's go even further. Let's go past the edge of reason and into the realm of speculative science. Earlier this year we wondered what would happen if physicists could actually prove that reality as we know it isn't real.

Per that article:

Theoretically, if we could zoom in past the muons and leptons and keep going deeper and deeper, we could reach a point where all objects in the universe are indistinguishable from each other because, at the quantum level, everything that exists is just a sea of nearly-identical subparticulate entities.

This version of reality would render the concepts of space and time pointless. Time would only exist as a construct by which we give meaning to our own observations. And those observations would merely be the classical side-effects of existing in a quantum universe.

So, in the grand scheme of things, it's possible that our reality is little more than a fleeting, purposeless arrangement of molecules. Everything that encompasses our entire universe may be nothing more than a brief hallucination caused by a quantum vibration.

Nothing makes you feel special like trying to conceive of yourself as a few seasoning particles in an infinite soup of gooey submolecules.

If having an existential quantum identity crisis isn't your thing, we also covered a lot of cool stuff that doesn't require you to stop seeing yourself as an individual stack of materials.

Does anyone remember the time China said it had built a quantum computer a million times more powerful than Google's? We don't believe it. But that's the claim the researchers made. You can read more about that here.

Oh, and that Google quantum system the Chinese researchers referenced? Yeah, it turns out it wasn't exactly the massive upgrade over classical supercomputers it was chalked up to be either.

But, of course, we forgive Google for its marketing faux pas. And that's because, hands down, the biggest story of the year for quantum computers was the time crystal breakthrough.

As we wrote at the time:

If Google's actually created time crystals, it could accelerate the timeline for quantum computing breakthroughs from maybe never to maybe within a few decades.

At the far-fetched, super-optimistic end of things, we could see the creation of a working warp drive in our lifetimes. Imagine taking a trip to Mars or the edge of our solar system and being back home on Earth in time to catch the evening news.

And, even on the conservative end with more realistic expectations, it's not hard to imagine quantum computing-based chemical and drug discovery leading to universally effective cancer treatments.

Talk about a eureka moment!

But there were even bigger things in the world of quantum physics than just advancing computer technology.

Scientists from the University of Sussex determined that black holes emanate a specific kind of quantum pressure that could lend some credence to multiple universe theories.

Basically, we can't explain where the pressure comes from. Could this be blowback from white holes swallowing up energy and matter in a dark, doppelganger universe that exists parallel to our own? Nobody knows! You can read more here though.

Still, there were even bigger philosophical questions in play over the course of 2021 when it came to interpreting physics research.

Are we incapable of finding evidence for God because we're actually gods in our own right? That might sound like philosophy, but there are some pretty radical physics interpretations behind that assertion.

And, if we are gods, can we stop time? Turns out, whether we're just squishy mortal meatbags or actual deities, we actually can!

Alright. If none of those stories impress you, we've saved this one for last. If being a god, inventing time crystals, or even stopping time doesn't float your boat, how about immortality? And not just regular boring immortality, but quantum immortality.

It's probably not probable, and adding the word quantum to something doesn't necessarily make it cooler, but anything's possible in an infinite universe. Plus, the underlying theories involving massive-scale entanglement are incredible; read more here.

Seldom a day goes by where something incredible isn't happening in the world of physics research. But that's nothing compared to the magic we've yet to uncover out there in this fabulous universe we live in.

Luckily for you, Neural will be back in 2022 to help make sense of it all. Stick with us for the most compelling, wild, and deep reporting on the quantum world this side of the non-fiction realm.

The rest is here:

Neural's best quantum computing and physics stories from 2021 - The Next Web

Breaking Up Tech Is a Gift to China – The Wall Street Journal

Few issues unite both sides of the political divide more than anger at U.S. tech companies, whether for censorship of conservative viewpoints or for failing to counter misinformation online. In response to these concerns, legislation introduced in Congress would weaken the U.S. tech industry, ostensibly in the name of breaking up monopolies. Unfortunately, the various bills would hurt the U.S. and strengthen the hand of our greatest geopolitical rival, the People's Republic of China.

As of 2018, nine of the top 20 global technology firms by valuation were based in China. President Xi Jinping has stated his intention to spend $1.4 trillion by 2025 to surpass the U.S. in key technology areas, and the Chinese government aggressively subsidizes national champion firms. Beginning with the Made in China 2025 initiative, Beijing has made clear that it won't stop until it dominates technologies such as quantum computing, artificial intelligence, autonomous systems and more. Last month the National Counterintelligence and Security Center warned that these are technologies where the stakes are potentially greatest for U.S. economic and national security.

Follow this link:

Breaking Up Tech Is a Gift to China - The Wall Street Journal

10 technology trends that could prove to be real game-changers – Mint

Smarter algorithms and machine learning: AI has been the driving force behind most of the products, applications and even devices we use today. On 22 November, Gartner predicted that total revenue in the AI software market would hit $62.5 billion in 2022, an increase of 21.3% from 2021. "The AI software market is picking up speed, but its long-term trajectory will depend on enterprises advancing their AI maturity," said Alys Woodward, senior research director at Gartner.

AI deployment in 2022 will be in knowledge management, virtual assistants, autonomous vehicles, digital workplaces and crowdsourced data, Gartner said. In addition, companies like Google are developing newer language models such as LaMDA (Language Model for Dialogue Applications), which, the company claims, can hold their own in natural conversations.

Faster networks with bigger bandwidth: 5G has been in the works for what seems like years now, but 2022 may finally be the year these next-generation networks roll out. India has already approved trial spectrum for telcos such as Bharti Airtel and Reliance Jio, and the 5G spectrum auctions are expected in the first half of next year. If all goes well, 5G networks will start reaching the public next year. In short, 5G means lower latency, which is what users perceive as speed. The new networks will also enable new use cases for enterprises, smart city implementations and more.

Intelligent cloud and edge computing: The new use cases enabled by 5G networks are heavily dependent on the cloud. For instance, in September, Airtel tested India's first cloud-gaming session in a 5G environment at its Manesar facility. The company's chief technology officer, Randip Sekhon, said cloud gaming would be "among the biggest use cases" for 5G networks. The dependency on the cloud will only increase among enterprises.

Moreover, edge computing is finally set to flourish. It helps enterprises bring data and computing closer to the user's device, a trend that will help make products like driverless or autonomous vehicles more efficient.

More interconnected devices that talk to each other: Earlier this month, Airtel, Invest India and the National Investment Promotion and Facilitation Agency announced a Startup Innovation Challenge, which asks early-stage startups to create new use cases involving IoT. As data flows faster and computing power comes from large server farms in the cloud, more devices can start connecting and working as one. A June report by Gartner said the IoT endpoint electronics and communications market will touch $21.3 billion in 2022, increasing its forecast by 22% over its 2021 predictions. This is driven by governments using IoT for surveillance, and by enterprises using connected devices for everything from banking and communication to delivering new products.

Privacy gaining ground: After about two years of deliberation, the joint parliamentary committee (JPC) on the Data Protection Bill finally tabled its report on the bill during the ongoing winter session of Parliament. The JPC recommended that India have one bill to regulate both personal and non-personal data, and that companies be stopped from profiling children's accounts and targeting ads at them. The bill also gives consumers rights over their data. But India isn't the only country looking into such data regulations. India's bill borrows heavily from the European General Data Protection Regulation (GDPR), and governments worldwide are considering similar rules. Big Tech firms are fighting lawsuits brought by government bodies, competition regulators and more. The outcome of all these cases will shape how our data is used in the future.

Mixing and blending realities: In 1964, the animated science-fiction franchise Jonny Quest imagined a virtual world called QuestWorld, where the protagonists would put on futuristic virtual reality (VR) headsets and fight battles. It was futuristic then, but VR and augmented reality (AR) headsets are all too familiar now; in fact, they have been for almost a decade. In 2021, Facebook launched Ray-Ban Stories, partnering with eyeglass maker Ray-Ban on a pair of smart glasses that look and feel almost exactly like regular spectacles. Tech firms aim to make these devices ubiquitous and reach the economies of scale that come from selling millions of devices worldwide.

Immutable and interconnected ledgers: If AI was the key change-maker of the past decade, blockchain may well enable the next step. By many estimates, India has become one of the top countries for cryptocurrency adoption worldwide, but what is seen as a trading asset today has far bigger implications. Cryptocurrencies are powered by blockchain technology, and in April, International Data Corp. said organizations would spend as much as $6.6 billion on blockchain solutions in 2021 alone, a 50% increase from 2020. The market researcher also predicted an average annual growth rate of 48% between 2020 and 2024. India's second crypto unicorn, CoinSwitch Kuber, has said it aims to support other blockchain firms in India. Industry stakeholders and experts expect blockchains to power cross-border payments, banking and much more in the future. Even the Reserve Bank of India's upcoming Central Bank Digital Currency, or digital rupee, will be powered by blockchain technologies.
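
What makes such a ledger "immutable" is a simple data structure: every block carries a cryptographic hash of the block before it, so changing any historical entry breaks every link that follows. Here is a minimal sketch in Python; it is illustrative only, and real blockchains layer consensus, digital signatures and proof-of-work (or proof-of-stake) on top of this basic chaining:

import hashlib
import json
import time

def block_hash(block: dict) -> str:
    # Hash the block's contents deterministically (sorted keys).
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def make_block(data: str, prev_hash: str) -> dict:
    # Each block records its data plus the hash of its predecessor.
    return {"timestamp": time.time(), "data": data, "prev_hash": prev_hash}

def chain_is_valid(chain: list) -> bool:
    # Recompute every link; any tampering breaks the chain downstream.
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

genesis = make_block("genesis", prev_hash="0" * 64)
b1 = make_block("Alice pays Bob 5", prev_hash=block_hash(genesis))
b2 = make_block("Bob pays Carol 2", prev_hash=block_hash(b1))
ledger = [genesis, b1, b2]

print(chain_is_valid(ledger))       # True
b1["data"] = "Alice pays Bob 500"   # try to rewrite history...
print(chain_is_valid(ledger))       # False: b2 no longer matches b1

Distributing many copies of such a chain across a network is what lets participants detect and reject tampering without trusting a central bookkeeper.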

The third generation of the internet: The hit HBO show Silicon Valley imagined a new internet free of domination by Big Tech firms, governments and more. The idea may sound utopian, but that's exactly what companies building apps for the third generation of the internet (web3) are working toward today. Companies like Google, Apple and Facebook benefit greatly from the fact that most of the world's data flows through their servers. With web3, the power is handed back to the users, in a way: it runs without central servers, depends on a network of phones, computers and other devices, and bars any one person or entity on the network from wielding control over data. In a word, decentralization. For instance, Noida-based Ayush Ranjan has built the world's first decentralized video chat app. Unlike Google Meet or Zoom, the Huddle 01 app doesn't require users to create an account, and the company doesn't have its own data centres to store your data or record calls. Instead, it stores all data in a decentralized manner and uses computing power from users' devices to power the calls.
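
A building block behind such serverless apps is content addressing: rather than asking a company's server for a file at a location, peers ask the network for data identified by the hash of its contents, and anyone holding a matching copy can serve it verifiably. A toy sketch in Python, with an ordinary dictionary standing in for a swarm of peers (real systems such as IPFS add peer discovery and routing on top of this idea):

import hashlib

# Stand-in for data scattered across many peers' devices:
# content is published and fetched by the hash of its bytes.
network = {}

def publish(content: bytes) -> str:
    # The address is the content's fingerprint, not a server URL.
    address = hashlib.sha256(content).hexdigest()
    network[address] = content
    return address

def fetch(address: str) -> bytes:
    content = network[address]
    # The receiver can verify the data without trusting the sender.
    assert hashlib.sha256(content).hexdigest() == address
    return content

addr = publish(b"decentralized hello")
print(addr[:16], fetch(addr))

Because the address commits to the content itself, no single operator can silently swap out or censor the data once peers hold copies.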

Rise of the metaverse: 5G, cloud computing, IoT and web3 are all tools in a larger vision that technologists and technology leaders hold right now, and that vision is called the metaverse. Facebook's Mark Zuckerberg is so confident the metaverse is coming that he rebranded his company, one of the most valuable in the world, to Meta in an effort to show where his focus lies today. Author Neal Stephenson is often credited with coining the term in his 1992 novel Snow Crash, and the idea has also been explored in contemporary movies like Ready Player One. The metaverse is not a technology; it is a concept. Zuckerberg and others expect that we will do everything from conducting meetings to hosting parties in a virtual space, through realistic-looking avatars. Instead of shopping on an e-commerce store, your avatar will walk into a virtual store, try on a product and have the physical item delivered to your home. However, hardware veterans like Intel's Raja Koduri have warned that the computing power we have today is nowhere close to sufficient for the metaverse Zuckerberg imagines.

Quantum computing: That brings us to what could be the most transformational trend in technology: quantum computing. Any country with aspirations to lead in technology has its sights set on it. While web3 is a new internet, quantum computing is a whole new kind of computer. Traditional computers encode information as 0s and 1s, and their computations are limited by that binary representation. Quantum computers, on the other hand, exploit the principles of quantum physics to vastly expand the computing power available to us. A practical quantum computer is still far from reality, but it could supply the kind of computing power Koduri says we need for the metaverse. In the 2020 Budget, the government allocated ₹8,000 crore over the next five years for developing quantum computing technology. It has also launched a Quantum Simulator, which allows researchers to build quantum applications without access to a real quantum computer.
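
The bit-versus-qubit difference can be made concrete with a little linear algebra: a qubit is a unit vector over the basis states 0 and 1, gates are matrices, and measurement returns 0 or 1 with probabilities given by the squared amplitudes. Below is a minimal NumPy simulation of this textbook model (a sketch of the math, not of any real quantum hardware):

import numpy as np

# Basis states |0> and |1> as vectors.
ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

# The Hadamard gate puts a qubit into an equal superposition.
H = np.array([[1.0,  1.0],
              [1.0, -1.0]]) / np.sqrt(2)

state = H @ ket0              # amplitudes [0.707..., 0.707...]

# Measurement: outcome probabilities are the squared amplitudes.
probs = np.abs(state) ** 2
rng = np.random.default_rng(0)
samples = rng.choice([0, 1], size=10, p=probs)
print(probs)    # [0.5 0.5]
print(samples)  # a roughly even mix of 0s and 1s

The catch is that an n-qubit state needs 2**n amplitudes, so this kind of classical simulation blows up quickly; that exponential gap is precisely what quantum hardware aims to exploit.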

Read the original:

10 technology trends that could prove to be real game-changers - Mint

Senator reflects on a year in service to NM – Albuquerque Journal

As I took my oath of office in January, I was struck by the absence of friends and family, who, if not for the pandemic, would have been at each senator's side. The emptiness of the Senate Chamber brought into focus the year of challenges that lay ahead. Though I stood alone as I swore my oath, I stand with New Mexico always and am proud of what we accomplished during this difficult year.

To address the pandemic's effects on our lives, my colleagues and I passed landmark legislation that addresses New Mexico's specific needs and combats the consequences of COVID, both economically and in terms of public health.

In March, Congress passed the American Rescue Plan, which put money in pockets, children back in schools and parents back to work. Following through on the work I did in the House, this law provided more than $360 billion in emergency funding for state, local and tribal governments to keep front-line workers on the job.

One of my first Senate achievements was authorizing $17 billion in the United States Innovation and Competition Act for our national laboratories. These investments will empower our Sandia and Los Alamos labs to further research and develop critical technologies such as semiconductors, carbon capture and quantum computing. Once passed and signed into law, this funding will allow New Mexico to continue its innovative leadership.

To further strengthen New Mexico's economy, my colleagues and I helped send the Infrastructure Investment and Jobs Act to President Biden's desk. This bipartisan legislation included my REGROW Act, which employs skilled energy workers to clean up tens of thousands of orphaned oil and gas wells across the country, creating an estimated 13,500 good-paying jobs.

This law contains my RIDE Act, which will make our roads safer by helping to end drunken and impaired driving. Additionally, I fought to provide billions for Indian Health Service (IHS) water and wastewater infrastructure, a long overdue investment. As chair of the Subcommittee on Communications, Media and Broadband, I also advocated for expanded access to broadband, especially in rural communities, making internet access more affordable for nearly 800,000 New Mexicans with an estimated $750 million going to our state to support broadband buildout.

Throughout the year, I used my committee positions to advocate for the well-being of all New Mexicans. I introduced the bipartisan Native American Voting Rights Act to protect the sacred right to vote for tribal nations and voters living on tribal lands. In various committee hearings, I challenged the power of Big Tech by grilling CEOs on harmful algorithms that prioritize profit margins at the expense of our health and our democracy. My committee assignments allow me to serve New Mexico's working families.

Though I enjoy policymaking in Washington, connecting with my constituents remains the highlight of my job. While COVID made it difficult to visit constituents in person, traveling across 28 counties in the past year and hearing the needs of families firsthand made it all the more gratifying to share how this Democratic Congress has delivered. From meeting with local officials across southern New Mexico, to holding meaningful dialogues with tribal and pueblo leaders, to opening a new constituent services office in Las Vegas, I cherish opportunities to hear from you and underscore how I am working in Washington to lift up our state's communities. As this year concludes and the pandemic rages on, I remain committed to bringing New Mexican values to the policies we craft, defending our state's priorities, and ensuring economic opportunity for you and your family.

Excerpt from:

Senator reflects on a year in service to NM - Albuquerque Journal