
Category Archives: Quantum Computing

Quantum computing venture backed by Jeff Bezos will leap into public trading with $1.2B valuation – GeekWire

Posted: February 9, 2022 at 1:38 am

A team member at D-Wave Systems, based in Burnaby, B.C., works on the dilution refrigerator system that cools the processors in the company's quantum computer. (D-Wave Systems Photo / Larry Goldstein)

Burnaby, B.C.-based D-Wave Systems, the quantum computing company that counts Jeff Bezos among its investors and NASA among its customers, has struck a deal to go public with a $1.2 billion valuation.

The deal involves a combination with DPCM Capital, a publicly traded special-purpose acquisition company, or SPAC. It's expected to bring in $300 million in gross proceeds from DPCM's trust account, plus $40 million in gross proceeds from investors participating in a PIPE arrangement. (PIPE stands for private investment in public equity.)

Quantum computing takes advantage of phenomena at the quantum level, processing qubits that can represent multiple values simultaneously as opposed to the one-or-zero paradigm of classical computing. The approach is theoretically capable of solving some types of problems much faster than classical computers.

Founded in 1999, D-Wave has focused on a type of technology called quantum annealing, which uses quantum computing principles and hardware to tackle tasks relating to network optimization and probabilistic sampling.

Physicists have debated whether D-Wave's Advantage system should be considered an honest-to-goodness quantum computer, but the company says that question has been settled by research that, among other things, turned up signatures of quantum entanglement. D-Wave is included among the quantum resources offered by Amazon and Microsoft, and it also has its own cloud-based platform, known as Leap.

The SPAC deal has already been cleared by the boards of directors for D-Wave and DPCM Capital. If the transaction proceeds as expected, with approval by DPCM's stockholders, it should close by midyear. The result would be a combined company called D-Wave Quantum Inc. that would remain headquartered in Burnaby, a suburb of Vancouver, B.C., and trade on the New York Stock Exchange under the QBTS stock symbol.

"Today marks an inflection point signaling that quantum computing has moved beyond just theory and government-funded research to deliver commercial quantum solutions for business," D-Wave CEO Alan Baratz said in a news release.

Among the investors involved in the PIPE transaction are PSP Investments, NEC Corp., Goldman Sachs, Yorkville Advisors and Aegis Group Partners. Other longtime D-Wave investors include Bezos Expeditions as well as In-Q-Tel, a venture capital fund backed by the CIA and other intelligence agencies.

In what was described as an innovative move, the SPAC deal sets aside a bonus pool of 5 million shares for DPCM's non-redeeming public stockholders.

D-Wave says it will use the fresh funding to accelerate its delivery of in-production quantum applications for its customers, and to build on a foundation of more than 200 U.S. patents. The company is aiming to widen its offerings beyond quantum annealing by developing more versatile gate-model quantum computers.

Emil Michael, DPCM Capital's CEO, said the total addressable market for quantum computing services could amount to more than $1 billion in the near term, and rise to $150 billion as applications mature.

"While quantum computing is complex, its value and benefits are quite simple: finding solutions to problems that couldn't be previously solved, or solving problems faster with more optimal results," Michael said. "D-Wave is at the forefront of developing this market, already delivering the significant benefits of quantum computing to major companies across the globe."

Read the original here:

Quantum computing venture backed by Jeff Bezos will leap into public trading with $1.2B valuation - GeekWire

Posted in Quantum Computing | Comments Off on Quantum computing venture backed by Jeff Bezos will leap into public trading with $1.2B valuation – GeekWire

Breaking the noise barrier: The startups developing quantum computers – ComputerWeekly.com

Posted: at 1:38 am

Today is the era of noisy intermediate-scale quantum (NISQ) computers. These can solve difficult problems, but they are said to be noisy, which means many physical qubits are required for every logical qubit that can be applied to problem-solving. This makes it hard for the industry to demonstrate a truly practical advantage that quantum computers have over classical high-performance computing (HPC) architectures.

Algorithmiq recently received $4m in seed funding to enable it to deliver what it claims are truly noise-resilient quantum algorithms. The company is targeting one specific application area, drug discovery, and hopes to work with major pharmaceutical firms to develop molecular simulations that are accurate at the quantum level.

Algorithmiq says it has a unique strategy of using standard computers to "un-noise" quantum computers. The algorithms it is developing offer researchers the ability to boost the speed of chemical simulations on quantum computers by a factor of 100 compared with current industry benchmarks.

Sabrina Maniscalco, co-founder and CEO at Algorithmiq and a professor of quantum information, computing and logic at the University of Helsinki, has been studying noise in quantum computers for 20 years. "My main field of research is about extracting noise," she said. "Quantum information is very fragile."

In Maniscalco's experience, full fault tolerance requires technological advances in manufacturing and may even require fundamental principles to be discovered, because the science does not exist yet. But she said: "We can work with noisy devices. There is a lot we can do, but you have to get your hands dirty."

Algorithmiq's approach is about making a mindset shift. Rather than waiting for the emergence of universal fault-tolerant quantum computing, Maniscalco said: "We look for what types of algorithms we can develop with noisy [quantum] devices."

To work with noisy devices, algorithms need to take account of quantum physics in order to model and understand what is going on in the quantum computer system.

The target application area for Algorithmiq is drug discovery. Quantum computing offers researchers the possibility to simulate molecules accurately at the quantum level, something that is not possible in classical computing, as each qubit can map onto an electron.

According to a quantum computing background paper by Microsoft, if an electron had 40 possible states, modelling every state would require 2^40 configurations, as each position can either have or not have an electron. To store the quantum state of the electrons in a conventional computer memory would require more than 130GB of memory. As the number of states increases, the memory required grows exponentially.

This is one of the limitations of using a classical computing architecture for quantum chemistry simulations. According to Scientific American, quantum computers are now at the point where they can begin to model the energetics and properties of small molecules, such as lithium hydride.
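As a rough back-of-the-envelope check of those figures (an illustrative sketch added here, not a calculation taken from the Microsoft paper), the short Python snippet below reproduces the arithmetic: 2^40 configurations at just one bit each already come to more than 130GB, and a practical simulator needs a multi-byte amplitude per configuration.

configurations = 2 ** 40            # each of 40 positions either has or lacks an electron
bits_needed = configurations        # storing just one bit per configuration
gigabytes = bits_needed / 8 / 1e9   # bits -> bytes -> gigabytes
print(f"{configurations:,} configurations ~= {gigabytes:.0f} GB")  # roughly 137 GB

# A practical simulator keeps a complex amplitude (8-16 bytes) per configuration,
# so the real requirement runs to terabytes, and it doubles with every extra
# two-level system added to the model.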

In November 2021, a consortium led by Universal Quantum, a University of Sussex spin-out company, was awarded a £7.5m grant from Innovate UK's Industrial Strategy Challenge Fund to build a scalable quantum computer. Its goal is to achieve a million-qubit system.

Many of today's quantum computing systems rely on supercooling to just a few degrees above absolute zero to achieve superconducting qubits, which are encoded in a circuit. The circuit only exhibits quantum effects when supercooled; otherwise it behaves like a normal electrical circuit.

Significantly, Universal Quantum's technology, based on the principle of a trapped ion quantum computer, can operate at much more normal temperatures. Explaining why its technology does not require supercooling, co-founder and chief scientist Winfried Hensinger said: "It's the nature of the hardware platform. The qubit is the atom that exhibits quantum effects. The ions levitate above the surface of the chip, so there is no requirement on cooling the chip in order to make a better qubit."

Just as a microprocessor may run at 150W and operate at room temperature, the quantum computer that Universal Quantum is building should not require anything more than is needed in an existing server room for cooling.

The design is also more resilient to noise, which introduces errors in quantum computing. Hensinger added: "In a superconducting qubit, the circuit is on the chip, so it is much harder to isolate from the environment and so is prone to much more noise. The ion is naturally much better isolated from the environment as it just levitates above a chip."

The key reason why Hensinger and the Universal Quantum team believe they are better placed to further the scalability of quantum computers is down to the cooling power of a fridge. According to Hensinger, the cooling needed for superconducting qubits is very difficult to scale to large numbers of qubits.

Another startup, Quantum Motion, a spin-out from University College London (UCL), is looking at a way to achieve quantum computing that can be industrialised. The company is leading a three-year project, Altnaharra, funded by UK Research and Innovation's National Quantum Technologies Programme (NQTP), which combines expertise in qubits based on superconducting circuits, trapped ions and silicon spins.

The company says it is developing fault-tolerant quantum computing architectures. John Morton, co-founder of Quantum Motion and professor of nanoelectronics at UCL, said: "To build a universal quantum computer, you need to scale to millions of qubits."

But because companies like IBM are currently running only 127-qubit systems, the idea of universal quantum computing comprising millions of physical qubits, built using existing processes, is seen by some as a pipe dream. Instead, said Morton: "We are looking at how to take a silicon chip and make it exhibit quantum properties."

Last April, Quantum Motion and researchers at UCL were able to isolate and measure the quantum state of a single electron (the qubit) in a silicon transistor manufactured using a CMOS (complementary metal-oxide-semiconductor) technology similar to that used to make chips in computer processors.

Rather than being at a high-tech campus or university, the company has just opened its new laboratory just off London's Caledonian Road, surrounded by a housing estate, a community park and a gym. But in this lab, it is able to lower the temperature of components to a shade above absolute zero.

James Palles-Dimmock, COO of Quantum Motion, said: "We're working with technology that is colder than deep space and pushing the boundaries of our knowledge to turn quantum theory into reality. Our approach is to take the building blocks of computing, the silicon chip, and demonstrate that it is the most stable, reliable and scalable way of mass manufacturing quantum silicon chips."

The discussion Computer Weekly had with these startups shows just how much effort is going into giving quantum computing a clear advantage over HPC. What is clear from these conversations is that these companies are all very different. Unlike classical computing, which settled on the stored-program architecture described by mathematician John von Neumann in the 1940s, there is unlikely to be one de facto standard architecture for quantum computing.

Continued here:

Breaking the noise barrier: The startups developing quantum computers - ComputerWeekly.com

Posted in Quantum Computing | Comments Off on Breaking the noise barrier: The startups developing quantum computers – ComputerWeekly.com

Quest for Quantum: Could the key to the secrets of the universe be discovered in Colorado? BizWest – BizWest

Posted: at 1:38 am

What if I told you that the laws of traditional physics aren't enough to unlock the secrets of the universe?

What if I said that there are people devoted to a fundamentally new way of understanding science that could revolutionize the way we discover new drug therapies, map the cosmos, protect sensitive data, combat climate change and maybe even discover new forms of life?

What if I told you that you're living smack dab in the middle of the place where this new movement, a technological and theoretical system ripped from the pages of a science-fiction novel, is being puzzled out?

"What if Boulder and the Front Range [was] the Silicon Valley of quantum computing? Maybe it is," ColdQuanta Inc. president Paul Lipman prognosticates.

"Quantum has the opportunity to be as transformative for the world, or more so, than the internet," he told BizWest.

Quantum researchers are trying to get at the big questions in hopes of discovering and better understanding new types of physics that are beyond our current framework and theoretical understanding of the universe, Colorado State University physics professor Sam Brewer said.

The Boulder Valley, with the world-class University of Colorado physics department, the National Institute of Standards and Technology, and some of the most prominent quantum computing companies in the world, has become, over the last three decades or so, the epicenter of the field.

"You wouldn't know it unless you're in the industry or are really into this technology, but we really have a special thing going in Colorado," said Philip Makotyn, executive director of CUbit Quantum Initiative, a CU organization dedicated to catalyzing quantum research across the university community.

What is quantum?

Quantum theory attempts to explain the behavior of matter at atomic and subatomic levels.

"At its most fundamental level, the universe is governed by the laws of quantum mechanics," said Lipman, leader of the Boulder firm developing quantum technology to freeze individual atoms to near-absolute zero, a point at which they produce minimal vibration. In this state, those atoms can be used to create sensors with extremely granular accuracy for use in satellite navigation, scientific research and other cutting-edge technological pursuits.

Quantum computing uses principles of quantum theory to build machinery with capabilities that far exceed traditional computers.

Classical computers use bits that can hold the value of either 1 or 0, which severely limits their processing ability.

Quantum computers are built with qubits, which harness the quantum property of superposition to hold the value of both 1 and 0 simultaneously.

If a bit is a coin sitting on a table, with the property of heads or tails, a qubit is a coin that's been flipped into the air and has yet to land: it simultaneously has the properties of heads and tails.

Computing power is increased exponentially with each additional qubit.
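To make that exponential claim concrete, here is a small illustrative Python loop (an editorial sketch, not from the original article): describing the state of n qubits takes 2^n amplitudes, so every added qubit doubles what a classical simulator has to track.

for n_qubits in range(1, 11):
    amplitudes = 2 ** n_qubits          # size of the state vector for n qubits
    print(f"{n_qubits:2d} qubits -> {amplitudes:5d} amplitudes")

# 10 qubits already need 1,024 amplitudes; 20 need about a million; 50 need
# roughly a quadrillion, which is why each extra qubit matters so much.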

Simply put, or as simply put as can be given a field of science that's so new and potentially groundbreaking, quantum computers are not going to give you faster video games or better accounting software, Lipman said. But it will enable computers to solve important, world-changing problems that are simply impossible to solve with conventional computing capacities.

Born at NIST

Quantum computing's birth can be traced directly back to NIST and the 14th International Conference on Atomic Physics held there in 1994.

"It was at this conference that certain scientific breakthroughs were made and it was agreed that quantum computers will be a thing," NIST physicist and CU lecturer Dietrich Leibfried said.

"This conference happened here for a reason and that's because these scientists were here in Boulder," he said.

But why NIST? Why was the government agency tasked with standardizing measurements a hotbed for quantum scholars?

The answer: clocks.

The institute is well known for its super-precise atomic clocks, and it turns out that the ideal setting for building quantum computers is much like the environment needed to build atomic clocks.

"The first place you can describe quantum advantage as being achieved is in timekeeping," Lipman said. "We can create clocks based upon quantum that are the most sensitive instruments ever created by humankind."

The importance of ultra-precise timekeeping is apparent in applications from investing in financial markets to developing autonomous vehicles and precision navigation to the measurement of the warping of time and space.

NIST furthered its involvement in the quantum space with research conducted by JILA, a partnership with CU formed in 1962 that was formerly known as the Joint Institute for Laboratory Astrophysics.

"It really is a special place," Makotyn said. "It attracted the best talent that existed at the time in this precision measurement and spectroscopy area that eventually evolved into the quantum field."

The Boulder region's status in the quantum world has only grown in recent years after the passage of the National Quantum Initiative Act of 2018, in which the National Science Foundation declared quantum technology a national priority and established five centers of research across the country. One of those centers is at CU.

"It was a very, very competitive search for those places and all of the most prestigious universities applied. CU was one of the first three awarded. That shows that we are truly a national leader in this stuff," Makotyn said.

The Quantum Economic Development Consortium, a group that brings together academics and business leaders to discuss ways in which quantum technology could benefit the American economy, is also run out of NIST.

From the ivory tower to the boardroom

"From the ivory towers, that's where it started. Not in the commercial world," Leibfried said of the quantum revolution.

But it didn't take long for the commercial world to catch on, and the Boulder Valley region has since given rise to some of the biggest players in the industry. Examples include ColdQuanta; Quantinuum, a Broomfield company newly spun out of Honeywell International Inc. (Nasdaq: HON); Atom Computing; and LongPath Technologies Inc.

"The whole genesis of our company grew out of the work that was being done at CU: the first Bose-Einstein condensate, the fifth form of matter, was created in the labs," Lipman said. "That's what led to the creation of ColdQuanta."

While some companies spun out of CU, JILA and NIST, others sought out the region to share proximity with those major quantum players.

"There's talent in this area, and we came to them," Quantinuum president Tony Uttley said. "We said, 'We're going to give you the kind of environment that you want to be working in, we're going to give you access to the equipment that's state of the art. We're going to provide the resources and ambition to do things that are the first of their kind.'"

The interplay between the industry participants and academia has created a positively reinforcing cycle, and venture capital is starting to take notice, Lipman said.

"We are seeing a significant interest in quantum from a wide array of financial partners," he said.

Part of the reason for this interest is the seemingly endless number of applications for quantum technology; many of the applications appear quite lucrative.

"When you think of drug discovery today, it's a multi-year, multi-billion dollar thing where molecules are being tested in labs to figure out what their behavior and effects are," Lipman said. "The promise of quantum computing, and we're still a few years away from this being possible, is the ability to model precisely what the interaction of these molecules will be. That will be absolutely groundbreaking in pharmaceuticals, in life sciences, in material sciences. When you think about impact for humanity, that will probably be the most visceral of the potential applications."

Labor challenges

Like nearly all industries, the Boulder Valley's quantum industry is facing a severe labor shortage.

"When you talk about the biggest impediment to progress, and this is not just for ColdQuanta, it's the pipeline of talent," Lipman said.

It's not just physicists that these companies need.

"We need a diverse group, not just physicists," Makotyn said. "We need engineers, mathematicians, business people, a broad range of skill sets."

Familiarizing younger people with the opportunities presented by quantum technology is key, industry experts say.

"There's a place for us to do more on the state and local level in Colorado," Lipman said. "That encouragement or education, the funding of education, pushing that education down into the high school level, is an important part of creating the necessary pipeline of talent."

Read the original here:

Quest for Quantum: Could the key to the secrets of the universe be discovered in Colorado? BizWest - BizWest

Posted in Quantum Computing | Comments Off on Quest for Quantum: Could the key to the secrets of the universe be discovered in Colorado? BizWest – BizWest

8 Quantum Computing Applications You Should Know | Built In

Posted: February 7, 2022 at 6:29 am

Slowly but surely, quantum computing is getting ready for its closeup.

Google made headlines in October upon proclaiming that it had achieved the long-anticipated breakthrough of quantum supremacy. That's when a quantum computer is able to perform a task a conventional computer can't. Not in a practical amount of time, anyway. For instance, Google claimed the test problem it ran would have taken a classical computer thousands of years to complete, though some critics and competitors called that a gross exaggeration.

IBM, for one, wasn't having it. The other big player in quantum, it promptly posted a response essentially arguing that Google had underestimated the muscle of IBM supercomputers, which, though blazingly fast, aren't of the quantum variety.

Tech giant head-butting aside, Google's achievement was a genuine milestone, one that further established quantum computing in the broader consciousness and prompted more people to wonder, What will these things actually do?

But even once quantum computing reigns supreme, its potential impact remains largely theoretical, hence the hedging throughout this article. That's more a reflection, though, of QC's still-fledgling status than unfulfilled promise.

Before commercial-scale quantum computing is a thing, however, researchers must clear some major hurdles. Chief among them: upping the number of qubits, units of information that quantum computers use to perform tasks. Whereas classical computer bits exist as 1s or 0s, qubits can be either or both simultaneously. That's key to massively greater processing speeds, which are necessary to simulate molecular-level quantum mechanics.

Despite quantum's still-hypothetical nature and the long road ahead, predictions and investment abound. Google CEO Sundar Pichai likened his company's recent proof-of-concept advancement to the Wright brothers' 12-second flight: though very basic and short-lived, it demonstrated what's possible. And what's possible, experts say, is impressive indeed.

From cybersecurity to pharmaceutical research to finance, here are some ways quantum will facilitate major advancements.

Location: Armonk, New York

How it's using quantum computing: Recent research into whether quantum computing might vastly improve weather prediction has determined it's a topic worth researching! And while we still have little understanding of that relationship, many in the QC field view it as a notable use case.

Ray Johnson, the former CTO at Lockheed Martin and now an independent director at quantum startup Rigetti Computing, is among those who've indicated that quantum computing's method of simultaneous (rather than sequential) calculation will likely be successful in analyzing the very, very complex system of variables that is weather. Futurist Bernard Marr has echoed the sentiment.

While we currently use some of the world's most powerful supercomputers to model high-resolution weather forecasts, accurate numerical weather prediction is notoriously difficult. In fact, it probably hasn't been that long since you cursed an off-the-mark meteorologist.

Location: NYC

How it's using quantum computing: The list of partners that comprise Microsoft's so-called Quantum Network includes a slew of research universities and quantum-focused technical outfits, but precious few business affiliates. However, two of the five, NatWest and Willis Towers Watson, are banking interests. Similarly, at IBM's Q Network, JPMorgan Chase stands out amid a sea of tech-focused members as well as government and higher-ed research institutions.

That hugely profitable financial services companies would want to leverage paradigm-shifting technology is hardly a shocker, but quantum and financial modeling are a truly natural match thanks to structural similarities. As a group of European researchers wrote last year, "[T]he entire financial market can be modeled as a quantum process, where quantities that are important to finance, such as the covariance matrix, emerge naturally."

A lot of recent research has focused specifically on quantum's potential to dramatically speed up the so-called Monte Carlo model, which essentially gauges the probability of various outcomes and their corresponding risks. A 2019 paper co-written by IBM researchers and members of JPMorgan's Quantitative Research team included a methodology to price option contracts using a quantum computer.
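For context, here is a minimal classical Monte Carlo pricer for a European call option, the kind of sampling workload the quantum approach aims to accelerate. It is an editorial sketch for illustration, not the methodology from the IBM/JPMorgan paper, and the parameter values are hypothetical.

import math
import random

def monte_carlo_call_price(spot, strike, rate, vol, maturity, n_paths=100_000):
    # Average the discounted payoff over simulated price paths; the error
    # shrinks like 1/sqrt(n_paths), which is the scaling quantum amplitude
    # estimation promises to improve on.
    total_payoff = 0.0
    for _ in range(n_paths):
        z = random.gauss(0.0, 1.0)  # one standard normal draw per path
        terminal = spot * math.exp((rate - 0.5 * vol ** 2) * maturity
                                   + vol * math.sqrt(maturity) * z)
        total_payoff += max(terminal - strike, 0.0)
    return math.exp(-rate * maturity) * total_payoff / n_paths

print(monte_carlo_call_price(spot=100, strike=105, rate=0.01, vol=0.2, maturity=1.0))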

Its seemingly clear risk-assessment application aside, quantum in finance could have a broad future. "If we had [a commercial quantum computer] today, what would we do?" Nikitas Stamatopoulos, a co-author of the price-options paper, wondered. "The answer today is not very clear."

Location: Redmond, Wash.

How it's using quantum computing: The world has a fertilizer problem that extends beyond an overabundance of poop. Much of the planet's fertilizer is made by heating and pressurizing atmospheric nitrogen into ammonia, a process pioneered in the early 1900s by German chemist Fritz Haber.

The so-called Haber process, though revolutionary, proved quite energy-consumptive: some three percent of annual global energy output goes into running Haber, which accounts for more than one percent of greenhouse gas emissions. More maddening, some bacteria perform that process naturally; we simply have no idea how, and therefore can't leverage it.

With an adequate quantum computer, however, we could probably figure out how and, in doing so, significantly conserve energy. In 2017, researchers from Microsoft isolated the cofactor molecule that's necessary to simulate. And they'll do that just as soon as the quantum hardware has a sufficient qubit count and noise stabilization. Google's CEO recently told MIT he thinks the quantum improvement of Haber is roughly a decade away.

Location: London

How it's using quantum computing: To presidential candidate Andrew Yang, Google's quantum milestone meant that no code is uncrackable. He was referring to a much-discussed notion that the unprecedented factorization power of quantum computers would severely undermine common internet encryption systems.

But Google's device (like all current QC devices) is far too error-prone to pose the immediate cybersecurity threat that Yang implied. In fact, according to theoretical computer scientist Scott Aaronson, such a machine won't exist for quite a while. But the looming danger is serious. And the years-long push toward quantum-resistant algorithms, like the National Institute of Standards and Technology's ongoing competition to build such models, illustrates how seriously the security community takes the threat.

One of just 26 so-called post-quantum algorithms to make the NIST's semifinals comes from, appropriately enough, British-based cybersecurity leader Post-Quantum. Experts say the careful and deliberate process exemplified by the NIST's project is precisely what quantum-focused security needs. As Dr. Deborah Franke of the National Security Agency told Nextgov, "There are two ways you could make a mistake with quantum-resistant encryption: One is you could jump to the algorithm too soon, and the other is you jump to the algorithm too late."
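The Python snippet below is a toy, deliberately undersized illustration (an editorial sketch, not a real attack and not related to the NIST candidates) of why fast factoring matters: once an RSA-style modulus is factored, the private key follows immediately. Real keys use 2048-bit or larger moduli that defeat classical factoring; Shor's algorithm on a large, error-corrected quantum computer is what would change that picture. It assumes Python 3.8+ for the modular-inverse form of pow.

def trial_division(n):
    # Brute-force factoring; feasible here only because the toy modulus is tiny.
    p = 2
    while n % p != 0:
        p += 1
    return p, n // p

e = 65537                      # common public exponent
n = 10007 * 10009              # toy modulus built from two small primes
p, q = trial_division(n)       # "breaking" the key by factoring
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)            # recovered private exponent

message = 42
ciphertext = pow(message, e, n)
print(pow(ciphertext, d, n))   # prints 42: the plaintext is recovered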

Location: Toronto

How it's using quantum computing: "The real excitement about quantum is that the universe fundamentally works in a quantum way, so you will be able to understand nature better," Google's Pichai told MIT Technology Review in the wake of his company's recent announcement. "It's early days, but where quantum mechanics shines is the ability to simulate molecules, molecular processes, and I think that is where it will be the strongest. Drug discovery is a great example."

One company focusing computational heft on molecular simulation, specifically protein behavior, is Toronto-based biotech startup ProteinQure. Flush with $4 million in recent seed funding, it partners with quantum-computing leaders (IBM, Microsoft and Rigetti Computing) and pharma research outfits (SRI International, AstraZeneca) to explore QC's potential in modeling protein.

That's the deeply complex but high-yield route of drug development in which proteins are engineered for targeted medical purposes. Although it's vastly more precise than the old-school trial-and-error method of running chemical experiments, it's infinitely more challenging from a computational standpoint. As Boston Consulting Group noted, merely modeling a penicillin molecule would require an impossibly large classical computer with 10-to-the-86th-power bits. For advanced quantum computers, though, that same process could be a snap, and could lead to the discovery of new drugs for serious maladies like cancer, Alzheimer's and heart disease.

Cambridge, Mass.-based Biogen is another notable company exploring quantum computings capacity for drug development. Focused on neurological disease research, the biotech firm announced a 2017 partnership with quantum startup 1QBit and Accenture.

Related: 20 Quantum Computing Companies Making Mind-Blowing Breakthroughs

Location: Stuttgart, Germany

How it's using quantum computing: QC's potential to simulate quantum mechanics could be equally transformative in other chemistry-related realms beyond drug development. The auto industry, for example, wants to harness the technology to build better car batteries.

In 2018, German car manufacturer Daimler AG (the parent company of Mercedes-Benz) announced two distinct partnerships with quantum-computing powerhouses Google and IBM. "Electric vehicles are mainly based on a well-functioning cell chemistry of the batteries," the company wrote in its magazine at the time. Quantum computing, it added, "inspires justified hope" for initial results in areas like cellular simulation and the aging of battery cells. Improved batteries for electric vehicles could help increase adoption of those vehicles.

Daimler is also looking into how QC could potentially supercharge AI, plus manage an autonomous-vehicle-choked traffic future and accelerate its logistics. It follows in the footsteps of another major Teutonic transportation brand: Volkswagen. In 2017, the automaker announced a partnership with Google focused on similar initiatives. It also teamed up with D-Wave Systems, in 2018.

Location: Wolfsburg, Germany

How it's using quantum computing: Volkswagen's exploration of optimization brings up a point worth emphasizing: despite some common framing, the main breakthrough of quantum computing isn't just the speed at which it will solve challenges, but the kinds of challenges it will solve.

The traveling salesman problem, for instance, is one of the most famous in computation. It aims to determine the shortest possible route between multiple cities, hitting each city once and returning to the starting point. Known as an optimization problem, it's incredibly difficult for a classical computer to tackle. For fully realized QCs, though, it could be a cakewalk.
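To see why the problem gets out of hand classically, here is a brute-force solver in Python (an editorial sketch with made-up distances): with n cities there are (n-1)! round trips to compare, so the search space explodes long before the code itself becomes complicated.

from itertools import permutations

def shortest_tour(cities, dist):
    # Check every ordering of the remaining cities after fixing a start point.
    start, rest = cities[0], cities[1:]
    best_tour, best_len = None, float("inf")
    for order in permutations(rest):
        tour = (start,) + order + (start,)
        length = sum(dist[a][b] for a, b in zip(tour, tour[1:]))
        if length < best_len:
            best_tour, best_len = tour, length
    return best_tour, best_len

# Tiny symmetric distance table for four cities (hypothetical numbers).
dist = {
    "A": {"A": 0, "B": 2, "C": 9, "D": 10},
    "B": {"A": 2, "B": 0, "C": 6, "D": 4},
    "C": {"A": 9, "B": 6, "C": 0, "D": 3},
    "D": {"A": 10, "B": 4, "C": 3, "D": 0},
}
print(shortest_tour(["A", "B", "C", "D"], dist))  # 4 cities: 6 tours; 20 cities: ~1.2e17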

D-Wave and VW have already run pilot programs on a number of traffic- and travel-related optimization challenges, including streamlining traffic flows in Beijing, Barcelona and, just this month, Lisbon. For the latter, a fleet of buses traveled along distinct routes that were tailored to real-time traffic conditions through a quantum algorithm, which VW continues to tweak after each trial run. According to D-Wave CEO Vern Brownell, the company's pilot "brings us closer than ever to realizing true, practical quantum computing."

Location: Berkeley, Calif.

How it's using quantum computing: Quantum computing and artificial intelligence may prove to be mutual back-scratchers. As VentureBeat recently explained, advances in deep learning will likely increase our understanding of quantum mechanics, while at the same time fully realized quantum computers could far surpass conventional ones in data pattern recognition. Regarding the latter, IBM's quantum research team recently found that entangling qubits on the quantum computer that ran a data-classification experiment cut the error rate in half compared to unentangled qubits.

What this suggests, an essay in the MIT Technology Review noted, is that as quantum computers get better at harnessing qubits and at entangling them, they'll also get better at tackling machine-learning problems.

IBM's research came in the wake of another promising machine-learning classification algorithm: a quantum-classical hybrid run on a 19-qubit machine built by Rigetti Computing.

"Harnessing [quantum computers' statistical distribution] has the potential to accelerate or otherwise improve machine learning relative to purely classical performance," Rigetti researchers wrote. The hybridization of classical compute and quantum processors overcame a key challenge in realizing that aim, they explained.

Both are important steps toward the ultimate goal of significantly accelerating AI through quantum computing. Which might mean virtual assistants that understand you the first time. Or non-player-controlled video game characters that behave hyper-realistically. The potential advancements are numerous.

"I think AI can accelerate quantum computing," Google's Pichai said, "and quantum computing can accelerate AI."

Related: Quantum Computers Will Transform How We Make & Play Video Games

Go here to read the rest:

8 Quantum Computing Applications You Should Know | Built In

Posted in Quantum Computing | Comments Off on 8 Quantum Computing Applications You Should Know | Built In

Quantum Computing and Cybersecurity: A Fusion that Cannot be Ignored – Analytics Insight

Posted: at 6:21 am

Companies must be aware of the fusion between quantum computing and cybersecurity

New technological innovations are transforming economies and enhancing our living standards through increased productivity and reduced cost of production. In light of this technological evolution, hackers and cyber scammers are devising new methods to break into the systems of individuals and companies and steal the large amounts of data being generated with the help of data analytics and AI tools. Hence, cybersecurity has become an integral part of all business strategies and is a means to protect data from intruders. Cybersecurity enables professionals to protect any information available on their devices and to assess future risks. One of the key players in this cybersecurity landscape is quantum computing. To replace classical computing and deliver long-standing results, researchers and scientists are exploring quantum computing as a robust tool to enhance the effectiveness of cybersecurity platforms.

Even though there are various ways in which quantum computing can harm cybersecurity, the technology also has qualities that can deliver exponential advantages for certain classes of problems, for example factoring very large numbers, with profound implications for cybersecurity.

One of the biggest worries that cybersecurity analysts currently face is the emergence of new devices that are based on quantum physics and are considered superior to standard computers. These devices could enable cyber attackers to break secure cryptography methods. Classical digital ciphers rely on complex mathematical formulas to convert data into encrypted messages for storage and transmission. Consequently, attackers armed with such devices could break these cryptography codes and steal confidential information.

Cybersecurity experts believe that, eventually, quantum computers will pose threats to national security due to their ability to break modern cryptography systems and reveal encrypted messages and stored data. Besides, hackers can adopt advanced technologies, such as machine learning, to develop and distribute deadlier forms of malware.

Fraudsters and cybercriminals can also take advantage of the powers of quantum computing to create novel approaches to breaching cybersecurity firewalls. Even though such activities can be computationally taxing on classical computers, with the integration of quantum computing technology, hackers could exploit its advanced features to mount sophisticated attacks on larger networks of devices.

Quantum computing is not just about the doom of cybersecurity applications. The technology can also help create robust cybersecurity encryption methods. With the help of privacy-enhancing computing (PEC) techniques, professionals can keep data encrypted while in use and can also provide in-transit and at-rest protections. Since data privacy is a hot topic for individuals and business leaders, PEC can be deployed to create stronger encryption models. Homomorphic encryption can also be deployed, enabling third parties to process encrypted data and provide results without ever having knowledge of either the data or the results. This type of encryption can use lattices, multi-dimensional algebraic constructs that would be impossible for intruders to crack. Quantum computing is a good potential solution for different types of cybersecurity and encryption issues. Security-conscious companies must understand the importance of quantum flexibility.
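As a toy illustration of the homomorphic idea (an editorial sketch using the classic Paillier scheme with absurdly small keys, not the lattice-based constructions mentioned above; it assumes Python 3.9+), the snippet below adds two numbers while they stay encrypted: multiplying the ciphertexts yields a ciphertext of the sum, which only the key holder can decrypt.

import math
import random

p, q = 1009, 1013                        # toy primes; real keys are vastly larger
n, n_sq = p * q, (p * q) ** 2
g = n + 1
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, n)                     # valid shortcut because g = n + 1

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n_sq) * pow(r, n, n_sq)) % n_sq

def decrypt(c):
    l = (pow(c, lam, n_sq) - 1) // n     # Paillier's L(x) = (x - 1) / n
    return (l * mu) % n

a, b = encrypt(20), encrypt(22)
print(decrypt((a * b) % n_sq))           # prints 42: the sum, computed on ciphertexts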

Quantum computing is a fast-approaching technology in the cybersecurity domain. Companies need to immediately leap into action and analyze the different ways to deploy quantum to enhance security and block intruders from stealing confidential data. Currently, the industry is witnessing a substantial increase in investment in solving the core problems around scaling error correction and algorithms. Enterprises should start thinking strategically about the long-term risks and the benefits of quantum computing technology and engage in serious ways to deploy the best practices of cybersecurity.


Original post:

Quantum Computing and Cybersecurity: A Fusion that Cannot be Ignored - Analytics Insight

Posted in Quantum Computing | Comments Off on Quantum Computing and Cybersecurity: A Fusion that Cannot be Ignored – Analytics Insight

In review: Top five Danish scientific discoveries of the year – The Post – The Copenhagen Post – Danish news in english

Posted: at 6:21 am

The year-that-went was one of general precarity, as fragile attempts at economic recovery were steamrolled by fresh waves of COVID.

Meanwhile, the arts industries squared up to asphyxiating restrictions, while meeting friends to soothe the nerves was marred by anxiety and awkward distancing.

Science thriving
But necessity is the mother of invention and, in the face of all the challenges, science has thrived.

In Denmark, the world's first wind-energy island was proposed (and South Korea quickly followed suit), a new species of whale was identified, and a dodgy GPS signal led to the accidental discovery of the world's most northerly island.

Biggest impact
The biggest revelations hit the headlines far beyond Denmark's borders.

Selected for their impact, innovation and socio-cultural significance, here are Denmark's top five most exciting scientific discoveries of 2021.

1 Innovative chip resolves quantum computing headache
In October, two young physicists at the University of Copenhagen put us a step closer to building the first large, functional quantum computer. A little background: the brain of a quantum computer is made up of memory devices called qubits. While an ordinary bit can store data in a state of either 1 or 0, a qubit can reside in both states simultaneously, known as quantum superposition. Until now, researchers have managed to build small circuits in which only one qubit can be operated at once. So it was hailed as a global milestone when Professor Federico Fedele and Assistant Professor Anasua Chatterjee simultaneously operated and measured multiple spin qubits on the same quantum chip, the entirety of which was no larger than a single bacterium.

2 Broad horizons for broad beans
Broad beans are rich in protein, easy to grow, and Jamie Oliver, the standard marker of populist cuisine, makes a dip out of them. But they also produce vicine, which is poisonous to the over 400 million people in the world who are predisposed to the hereditary disorder favism. For these people, primarily in Asia, Africa and the Mediterranean countries, vicine can trigger acute anemia and liver disorders. In July, researchers in Copenhagen and Aarhus isolated the gene responsible for forming vicine in broad beans, paving the way for a vicine-free bean that could become a major future protein source or make a globally appealing appetiser.

3 Metal detector rookie finds 1kg trove of ancient gold
In an absurd stroke of beginner's luck, a man called Ole Ginnerup Schytz found fame in September when he picked up a metal detector for the first time and discovered a stunning cache of 6th century gold jewellery near the town of Jelling in Denmark. Experts dubbed it one of the most valuable archaeological finds in Denmark's history, on a par with the Golden Horns of Gallehus. The hoard's enormous wealth points to a prolific European trade network and suggests Jelling was a major seat of power. The jewellery will go on display at Vejlemuseerne in southern Jutland on 3 February 2022, before being rehomed in the National Museum.

4 Hotrocks could spell the end for lithium batteries
In May, construction began on GridScale, a remarkable new energy plant on the island of Lolland that can store renewable energy in stone. The method involves super-heating and super-cooling crushed pea-sized basalt in insulated steel tanks. The stone can store heat for many days and supply energy for up to a week, outperforming lithium batteries on both cost and efficiency. The 35 million kroner plant was quickly dubbed "hotrocks" by the international community and will continue to be a hot topic into 2022, when moves will be made to integrate it into the Danish national grid.

5 Malaria medicine found to combat COVID-19
In December, researchers at Aarhus University discovered that the malaria medicine Atovaquone can prevent COVID-19 infection. The medicine has a protective effect both before and after infection across different viral variants, meaning it could be used for both prevention and treatment of COVID-19. It's a big find: Atovaquone is inexpensive, widely available and has already been approved by the US Food and Drug Administration. It's yet to be tested on Omicron, however, and has only been studied using lab-grown human cells.

Excerpt from:

In review: Top five Danish scientific discoveries of the year - The Post - The Copenhagen Post - Danish news in english

Posted in Quantum Computing | Comments Off on In review: Top five Danish scientific discoveries of the year – The Post – The Copenhagen Post – Danish news in english

Aggelos Kiayias interview: Will blockchain be good for the planet? – VentureBeat

Posted: at 6:21 am


Will blockchain technology be good for the planet? I've been asking people about that, and the first impression is that it will be incredibly wasteful, using networks of computers to redundantly verify transactions for cryptocurrencies, nonfungible tokens (NFTs), and ultimately the metaverse. The second impression is that it could be a lot less wasteful shifting to cryptocurrency than relying on our current financial system. I talked about this with one of the experts in the field, and we delved into these myths on a more factual level.

Kiayias is chief scientist at Input Output, a blockchain engineering company. It's the driving force behind the Cardano blockchain, a third-generation cryptocurrency that tries to improve on second-generation cryptocurrencies such as Ethereum and first-generation cryptocurrencies like Bitcoin.

A decentralized cryptocurrency keeps track of all transactions by all addresses on a peer-to-peer shared record. One of Cardano's innovations is the ability to support high transaction capacities, fast transaction times, and low transaction fees through a system of proof-of-stake. Cardano is not minable. The blockchain is a record of all transactions, but rather than being validated by anyone who performs the proof-of-work, transactions are validated by consensus proof-of-stake. This is also far more environmentally friendly than mining other cryptocurrencies, as it does not use enormous amounts of electricity.
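To make the contrast concrete, here is a toy Python sketch of the core proof-of-stake idea: whoever produces the next block is chosen with probability proportional to their stake, so there is no hashing race to burn energy. This is an editorial simplification with made-up stake values, not the actual Ouroboros protocol.

import random

stake = {"alice": 60, "bob": 30, "carol": 10}    # hypothetical stake amounts

def pick_slot_leader(stake_map, rng):
    # Stake-weighted random choice: more stake means more chances to lead a slot.
    holders = list(stake_map)
    weights = [stake_map[h] for h in holders]
    return rng.choices(holders, weights=weights, k=1)[0]

rng = random.Random(2022)                        # stand-in for shared, verifiable randomness
leaders = [pick_slot_leader(stake, rng) for _ in range(1000)]
for holder in stake:
    print(holder, leaders.count(holder) / len(leaders))
# alice leads roughly 60% of slots, bob about 30%, carol about 10%.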

Input Output builds blockchain-based products for governments, corporations and academic institutions, upskills people across the world, and is focused on making blockchain technology that is environmentally friendly. That's going to be necessary as we move toward the metaverse, the universe of virtual worlds that are all interconnected, like in novels such as Snow Crash and Ready Player One. (Kyle Wiggers wrote a piece on the environmental impact of the metaverse).


Input Output has contributed $500,000 to Stanford University to allow blockchains to process many transactions even with limited connectivity, providing support for the Tse Lab at the Department of Electrical Engineering of Stanford University. The aim is to enable even smartphones to handle blockchain transactions, while offline, to lessen the amount of electricity used in transactions.

Here's our edited transcript with Kiayias, who is also chair in cybersecurity and privacy at the University of Edinburgh.

VentureBeat: I wanted to hear more about the work you've been doing. Could you start there? I'd like to hear about the background, and the interest you've had in the energy usage of blockchain.

Aggelos Kiayias: I've been working in cryptography since about the mid-90s. I was a mathematician originally. Cryptography was my passion when I was an undergraduate. It was a very different time, as you know. But we're in an exciting time now, especially for those of us working in this area. A lot of these ideas we had on a blackboard; now the time has come to see them implemented, deployed, and used by people. It's a great time.

With respect to energy efficiency, that was one of the very early open questions in the space. Bitcoin was doing something amazing. It was capable of providing an IT service without having any centralized entity that provides it. It's a self-registered service. You could be part of the service provision by just registering yourself as a node. That's pretty remarkable. It was unheard-of to deploy an IT service like that before. It's great to focus on Bitcoin from this angle, because it moves away from the cryptocurrency, so to speak. There's a lot to learn from Bitcoin just from studying how it's possible to deploy a global-scale IT service like that, automatically.

At the same time, the observation was, early on (we're talking about very soon after Bitcoin was deployed): you had this huge energy expenditure, and it was only getting worse. Another thing that makes the situation even worse than just the energy expenditure is that the product is agnostic of the source of energy. It motivates you to find the cheapest possible energy, and the cheapest energy that you can find might not be an energy that we want to harness. It might be based on non-renewable carbon-based fuels and so forth.

Early on, then, there was an open question around how it was possible to create the service, create the system like that of Bitcoin, but without having the same energy expenditure. That was something that motivated me. It was more than seven years ago when I started working on this particular question. Soon we were looking at various flavors of proof of stake. The problem is, none of these protocols were designed in a way that you could be convinced that they would actually work. A lot of my effort was to apply this mathematical rigor that we apply in traditional cryptographic protocol design. It's not so traditional, because cryptography is a very young discipline, but let's say traditional with respect to what you would get in a system like Bitcoin, which is very new, the new kid on the block so to speak.

The question is, is it possible to apply that same mathematical rigor and extract a protocol that can give you the benefits of Bitcoin, this decentralized service provision, an automatic service that can appear out of the self-interest of the nodes that join the network? And it turned out that it's possible to do it. We were very good at not only designing the Ouroboros protocol, but getting it deployed, and in general influencing a lot of deployment. The protocol I designed not only found itself as the backbone of Cardano, which I'm sure you're familiar with, but also influenced the design of other systems. Polkadot used elements of Ouroboros to design their system, as well as in other efforts.

This is also characteristic of a lot of the work we do. This is work we make publicly available. We have a tradition of scientific peer review. We try to put it in the context of the scientific development of cybersecurity and computer science at large. We validate it not just with cryptocurrency experts, but we validate it in a scientific way with people who have experience in designing computer science protocols, and specifically cryptographic protocols. That gives you a bit of the story behind some of that work.

Above: Aggelos Kiayias is chief scientist at Input Output, which is behind the Cardano blockchain.

Image Credit: Input Output

VentureBeat: What do you think about the energy usage of the different protocols that are out there? If you look at Bitcoin or Ethereum, or what some of the Layer-2 solutions do, taking the energy consumption down quite dramatically, to the point where it's maybe not as big a concern anymore, are there still some things to be concerned about?

Kiayias: I wouldn't say I'm optimistic that Layer-2 is going to help much. The protocol itself, which is the backbone: as long as Layer-2 fundamentally relies on Layer-1, and if that Layer-1 is a proof of work-based Layer-1, then the only thing that makes it work is energy expenditure. Energy expenditure is going to go up as long as engaging in this mining operation is profitable. If Layer-1 has high utility, then having high utility is also... it could also be that it supports a very successful Layer-2 that's also part of its utility. Even if Layer-1 is just the mediator of all the various things that happen at Layer-2, it's still going to be a critical part of the infrastructure. It's still going to be profitable to mine with it. It's still going to consume vast amounts of energy. The design of the protocol itself is such that it consumes this amount of energy.

Now, this is not to say that the protocol is flawed. It's designed exactly to do that. The question is whether it's worth it. We do a lot of things on planet Earth that consume vast amounts of energy. Bitcoin is one of them. The question we have to constantly ask ourselves is whether this is the best use of the resources we have, the technology we have, at any given time. Technology, after all, is always about finding and optimizing designs, proving them, understanding the problem you want to solve, and finding the most cost-efficient way to do it. In many ways Bitcoin showed the way. However, the way it spends its resources doesn't seem, at the moment, to be the best possible use of what we spend versus what we get out of it. That's not aligning.
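For readers unfamiliar with why proof of work is inherently energy-hungry, the short Python sketch below (an editorial illustration, not anything from Input Output) mimics mining: keep hashing until the digest falls below a target, with each extra bit of difficulty roughly doubling the expected number of hashes, and therefore the energy spent.

import hashlib

def mine(block_header: bytes, difficulty_bits: int):
    # Search for a nonce whose SHA-256 digest is below the difficulty target.
    target = 1 << (256 - difficulty_bits)
    nonce = attempts = 0
    while True:
        digest = hashlib.sha256(block_header + nonce.to_bytes(8, "big")).digest()
        attempts += 1
        if int.from_bytes(digest, "big") < target:
            return nonce, attempts
        nonce += 1

for bits in (8, 12, 16, 20):
    _, attempts = mine(b"example block", bits)
    print(f"{bits} leading zero bits: about {attempts} hashes")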

VentureBeat: As far as something like Ethereum forking over to proof of stake, do you believe that solves a lot of the problem? Or do we still have problems around that?

Kiayias: It certainly removes the energy expenditure fact. Any proof of stake protocol, well, there are differences between protocols. There are plenty of differences between proof of stake protocols. But in terms of energy expenditure, proof of stake protocols are similar in the sense that they use relatively small levels of energy, on par with running traditional servers. The difference with proof of work is tremendous. Any proof of stake protocol certainly makes the energy expenditure problem essentially disappear, at least at the level we're seeing right now with Bitcoin. That's only one dimension, though. Energy expenditure is not the only issue that blockchain systems have.

VentureBeat: If they do that, do they trade off something like security?

Kiayias: At the abstract level, you can think of it as a different security assumption. They're both secure. But if you look at the theory of security, security is always achieved under certain assumptions. Or to put it differently, it's very rare to have a system that has unconditional security, as we say. There do exist some very simple systems in the history of cryptography that are unconditionally secure, but they're extremely limited in applicability. Typically what you have in security is security proofs or security arguments that are conditional upon certain assumptions or certain behaviors or certain things that you'd consider plausible.

Proof of stake or proof of work, they come with different assumptions. They're both conditional, I should emphasize. But the set of conditions are different. This doesn't simply mean that one is less secure than the other. In both cases, the conditions that you can argue... of course, I'm not talking about Ethereum specifically now. I'm speaking very broadly about abstract, ideal proof of stake design. Both of these systems can be argued convincingly to be secure under a plausible set of conditions that are different in both cases.

VentureBeat: I've also heard a very different argument about how the global financial system as it is can be very wasteful. Physical banks use a lot of energy, and paper-based systems use a lot of resources. The comparison between that system and a system based on cryptocurrency seems like there is a big difference there. I don't know what you'd think of that.

Kiayias: I've seen this argument as well, but I don't buy it. Bitcoin, just Bitcoin, cannot substitute for the world financial system. A bank is not just a ledger. They have a whole infrastructure that includes customer service, backup systems around what to do when things go wrong. It's endless. There are a lot of other items there. It's possible to make such a comparison, but the places I've seen this comparison being made, it's too simplistic, comparing a brick and mortar world that includes things like customer service with just a ledger. That's not realistic. Bitcoin, the ledger, is by itself insufficient as a drop-in replacement for the whole banking system.

VentureBeat: I guess it's more of a comparison between centralized financial services and decentralized finance.

Kiayias: Yes, that's true. A comparison like that could be made. But I haven't seen it being made in a convincing way. And I should say, even if someone makes this comparison convincingly, why not use less energy? Even if Bitcoin is more competitive energy-wise, we should still ask the question, what's the absolute minimum of energy we need to get a service like that? Just because Bitcoin is better than System X, that doesn't mean... the question here is not just what system is better, but if you want to do the job that System X does, what's the absolute minimum amount of energy we need?

This is the way we ask questions in computer science. Computer scientists look at questions like this. The problem, let's say, is sorting an array. The question is, what's the minimum number of comparisons you need? It's not about finding one algorithm and proving it. It's finding the best possible algorithm that does it. I'm keen on doing this. This is what motivates my research and the research we do at Input Output. Certainly I think this is the right question to ask.

Above: Meld is an interesting new way to borrow money through the blockchain.

Image Credit: Meld

VentureBeat: I wrote about Ubisoft's effort to bring NFTs into their games recently. They said that they were using Tezos, and that it was a lot more energy efficient. A transaction in Tezos was using a millionth of the energy of a Bitcoin transaction, so you don't have the gas fees. It was more like the equivalent of a couple of Google searches. It sounded good, but it seemed like the problem is that not many people are going to use that protocol.

Kiayias: Tezos is a proof of stake system, so these numbers are not surprising. Bitcoin is immensely more energy-hungry than any proof of stake system out there right now. In terms of the infrastructure, from the point of view of the end user, these things shouldn't matter too much. The end user is trying to play a game or exchange some artifacts they have, some art. Which platform they do that on is something many end users are not too concerned about themselves.

But what we are concerned about here, as far as the basic infrastructure question goes, is this: what is a sound infrastructure to base the security of these transactions upon? This is the question that motivates the work we're doing. This is the right question to ask. There are plenty of systems out there, but the real questions to ask are: are they secure? Have they been analyzed? What are their credentials? We're still at the very beginning of assessing system security in the blockchain space.

VentureBeat: Do you worry that we're already wed to Bitcoin and Ethereum, to the point where we might not be able to change?

Kiayias: No, I don't think so, to be honest. We're still very early. We're exploring so many different ways of doing this. There are a lot of things that everyone in the space is learning. It's way too early to say that first-mover advantage is somehow going to eliminate the possibility of systems beyond Bitcoin and Ethereum being successful. There's certainly a lot of room for that.

You could even imagine a setting where there's a multitude of systems that are all interoperable, with trustless bridges connecting them and the ability for people to seamlessly transfer assets between systems without actually caring about how it works. If you look at the internet itself, it's an interconnection of quite diverse networking backbones, but we don't notice that now. The packets coming from my machine reach yours and vice versa. Everything happens seamlessly in the background because the internet infrastructure has been optimized to work like this.

This is a viable future for the blockchain space. You have a lot of systems that interoperate. Eventually what you expect is that some of the systems will go away and some of the other systems will grow, depending on their ability to scale, have sound economics, and interoperate with existing systems and legacy infrastructure. That's very important. You can't imagine this technology in complete isolation. It must interoperate with existing infrastructure and financial systems and so forth.

VentureBeat: Do you think, for things like NFTs and games, that there's an obvious solution yet, an ideal protocol?

Kiayias: I should say there's a lot of research to do even in the NFT space. The basic backbone of an NFT is well understood, and there are systems that can give you that. I have to point out, however, that I've also seen a lot of subpar implementations of the NFT concept. In other words, even though the expertise exists for how to provide the basic NFT, it's a bit unfortunate that I also see a lot of insecure implementations of that concept on different platforms. However, at this point in time, there are systems, Cardano included, that can give you a sound implementation of the NFT concept. They can be readily used by those who want such systems.

VentureBeat: If blockchain overcomes the energy challenge, do you think that's the main challenge it faces in adoption? Or are there other challenges, other problems that have to be overcome?

Kiayias: No, there are more. There are definitely more, and they require expertise from different areas. Energy efficiency is obviously one of them, but then we also have scalability. The work with Stanford that we hinted at in the beginning of this conversation has to do with scalability. Some of the research by the team at Stanford that we're funding through this gift is working toward the next generation of scalability for Layer-1 operations. Layer-1 is an ever-expanding database. The question is, is it possible for small, finite nodes to support that infrastructure without sacrificing the security of the system?

So far, blockchains like Bitcoin, Ethereum, and others are quite monolithic in what they consider a full node. This is something that knows everything and checks everything. Obviously, by itself, that is not scalable into the future. We're talking about a never-ending, ever-expanding database. We definitely need to solve that problem, and that won't happen without a lot of work, a lot of research, both within Input Output and with partners like Stanford. There's a lot of work we've done with others confronting the scalability problem.

Above: The metaverse could use a lot of energy.

Image Credit: Fold

Meanwhile, in all of these systems, you self-register because you're incentivized to do so. There's a lot of research that still needs to be done to validate the soundness of the mechanisms these systems use to incentivize participants. For example, you can see all the recent debate around pricing transactions. There are big issues in Bitcoin, and in Ethereum as well, but Bitcoin in particular has transactions which are extremely expensive. You can start asking the same kind of principled questions. If you want to auction off this desirable space for transactions on the blockchain, what's the best way to price it so that you don't break the incentives of the system participants?
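One live instance of that pricing question is Ethereum's EIP-1559 base-fee mechanism. The sketch below is a simplified editorial illustration of its update rule, not anything from the interview or from Input Output:

def next_base_fee(base_fee: int, gas_used: int, gas_target: int) -> int:
    # The fee drifts up when blocks are fuller than the target and down when
    # they are emptier, by at most 1/8 (12.5 percent) per block.
    delta = base_fee * (gas_used - gas_target) // (gas_target * 8)
    return max(base_fee + delta, 0)

# A block at twice the target raises the fee by 12.5 percent; an empty block lowers it by 12.5 percent.
print(next_base_fee(100_000_000_000, 30_000_000, 15_000_000))  # 112500000000 wei (112.5 gwei)
print(next_base_fee(100_000_000_000, 0, 15_000_000))           # 87500000000 wei (87.5 gwei)

Whether rules like this preserve participants' incentives under all conditions is exactly the kind of question Kiayias says still needs research.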

These are just two examples where we do active research. There are lots of open questions to solve.

VentureBeat: Every now and then I also hear that quantum computing is going to be a threat to blockchain. It seems like it could also be used to defend blockchains. I don't know how serious a problem that is.

Kiayias: We take it seriously, very seriously. An attack is definitely not feasible at the moment with the quantum computing capabilities that exist right now, and it's still an open question how well these techniques are going to scale. But it's a very serious concern, and we have to understand how to develop protocols that, as we sometimes call it in security, are post-quantum secure. How can you develop a protocol that, after a quantum computer exists, still retains its security?

The good news, and perhaps something that people sometimes misunderstand, is that you do not need a quantum computer to be protected from a quantum adversary. It's possible to develop classical algorithms, classical cryptography techniques, that are secure even against quantum attackers. That's something we've demonstrated in other areas of security, for example secure communications between clients and servers on the web. It's something we understand how to do in a way that's post-quantum secure. Bringing this technology, and finding the right techniques for the blockchain space, is an ongoing research effort. I'm quite active in this myself right now, together with my colleagues, but there's a general effort toward making blockchains post-quantum secure. It's something we take seriously, but it's definitely within the realm of feasibility for the next few years to have blockchains which are completely quantum-safe, so to speak.
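As one concrete illustration of classical cryptography that resists quantum attackers, here is a minimal hash-based one-time signature in the style of Lamport. This is an editorial sketch of a textbook construction, not a scheme used by Input Output or Cardano; its security rests only on the hash function, which is why hash-based signatures are a standard post-quantum example:

import os, hashlib

def keygen():
    # 256 pairs of random secrets; the public key is their hashes.
    sk = [(os.urandom(32), os.urandom(32)) for _ in range(256)]
    pk = [(hashlib.sha256(a).digest(), hashlib.sha256(b).digest()) for a, b in sk]
    return sk, pk

def sign(message: bytes, sk):
    digest = hashlib.sha256(message).digest()
    bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]
    return [sk[i][bit] for i, bit in enumerate(bits)]  # reveal one secret per message bit

def verify(message: bytes, signature, pk) -> bool:
    digest = hashlib.sha256(message).digest()
    bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]
    return all(hashlib.sha256(sig).digest() == pk[i][bit]
               for i, (sig, bit) in enumerate(zip(signature, bits)))

sk, pk = keygen()
sig = sign(b"post-quantum hello", sk)
print(verify(b"post-quantum hello", sig, pk))  # True

Each key pair can sign only one message, which is why practical hash-based schemes organize many such keys into trees, but the point stands: nothing quantum is needed on the defender's side.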

VentureBeat: What's your own feeling about the potential for the decentralized internet, the decentralized web, versus what we have right now that's centralized around big tech companies?

Kiayias: I'm very enthusiastic about this. This is going to be one of the best applications of blockchain technology. Right now, what we observe is all this centralization around big corporations that basically silo a lot of information. In many ways it can be argued that this should be more of a public resource and less of something that one specific company is able to capitalize on. There are a lot of issues of basic human rights. Do I have a right to see what information is collected about me? Do I have the right to transfer my information? Do I have the right to erase some of the information that's collected about me? These are basic questions that concern people in the IT law space, the legal aspects of information technology.

Blockchain systems do provide a lot of tools that can be used to make this situation better from the point of view of regulation. These ideas are sometimes called reg tech, regulatory technology. These are techniques that could be very useful in the future and could upgrade the way we do regulatory compliance. Let's say regulatory compliance could catch up with the times. I'm very optimistic about this. Some of the regulations we use right now are very antiquated. We're in a world where big corporations can just do regulatory arbitrage. They set up a department in whatever jurisdiction is advantageous for them to operate from. It's a perversion of how the system should work on a global scale.

I do think that building these applications on top of a substrate of blockchain technology could elevate the internet into something that has memory, something that gives users the ability to engage in a neutral space without being locked into a particular company. This is a very promising direction for blockchain technology, and I think it's going to have a big impact on what we do in the next few years.

VentureBeat: Do you think there's some limited form of decentralization that's more ideal than complete decentralization? Do we need to form lots of DAOs in order to replace companies, things like that?

Kiayias: It's a great question. What's the right level of decentralization for a particular type of application or system? I'm very confident that there's going to be a wide spectrum. We cannot decentralize everything. Not everything makes sense to decentralize. We still want services that operate as centralized entities in one way or another, just because there are needs for peak performance or agility.

Decentralized systems, no matter how you design them, are highly distributed. Their responsiveness will always be beaten by a super-efficient, optimized centralized system. As the saying goes, the benevolent dictatorship is the best system you could ever have, because there's only one dictator, he acts in everyone's best interests, and he can act immediately and solve every problem without any delay. The only problem is that there's a scarcity of benevolent dictators.

That's something we can solve with decentralization. We've solved it historically with democracy. As we can see in the way human societies have evolved over thousands of years, they've moved to settings where you have some centralization. You have a president or a prime minister. But you also have decentralized components of operation. You have elections. You have hierarchies of management. You have separation of duties, separation of the parts of government. This is not new. If you study political systems you can see a diverse landscape of some decentralized processes, some centralized ones, and a lot of checks and balances that glue everything together. I think exactly the same thing is going to happen in information technology.

Above: Illuvium is creating a DAO.

Image Credit: Illuvium

VentureBeat: What are the areas of research that you're most excited about? What do you think is worth a lot of your time right now?

Kiayias: At this particular point in time, I'll mention scalability, which I'm working on very actively with the team at Input Output. We have a lot of questions about how to optimize these functions. Beyond making the system scalable, it also has to be agile, because different use cases on top of a blockchain require different types of operations. You want to create a system that somehow shapes itself according to the particular use case. Scaling for the occasion is always an important concern. There's a lot of research going on about this right now.

I mentioned economics and game theory. Another topic I'm working on actively right now with the team is governance. I can't emphasize this enough. Sound governance of a decentralized project, on blockchain systems that are live, is extremely important for long-term success. One of the biggest problems we've seen in the space so far is systems getting into trouble because they can't properly manage the way the system evolves. If we look at the history of software systems, it's impossible to have the perfect system that stays there forever. What we know instead is that, first of all, bugs do happen. They have to be patched and corrected expediently. At the same time, circumstances change. You have to be able to adapt. Governance and software updates in the decentralized setting is an extremely important question. It's something we're very actively working on. So those are three examples of the top research streams we have going on right now.

VentureBeat: Are there any subjects where you're worried about the state of things, the direction we're going?

Kiayias: In the long term, nothing specifically. The whole space is maturing rapidly. We will be able to solve the main questions. I'm very optimistic about the future, about the whole direction we're going. I see a lot of bright people in the industry. I see a lot of good expertise participating. I believe all of the major questions will be solved, from a technological point of view.

On the social side, one issue that worries me is that a lot of the research and development in the blockchain space is driven collectively by computer scientists and software engineers. We know from the history of recent information technology services like Facebook that if you don't have an interdisciplinary approach to understanding the questions you're trying to solve, it's possible to create systems that can do a lot of harm. We've seen this, in the case of Facebook, with the spread of misinformation that has hurt public health during the pandemic, as well as in the case of Myanmar. There are a lot of cases where well-intentioned technology development has led to adverse effects.

What I'm trying to do, and something we're striving toward at Input Output, is to promote interdisciplinary work and research. That's something I look forward to being more active with in 2022 and beyond, so that we develop solutions that really solve the important problems we want to solve, without creating new problems the way the hasty development of information technology services has in other cases, like Facebook, as I mentioned.

VentureBeat: Do you have some confidence that, say, within five years we'll be able to solve the energy usage problem? Can we get to an ideal protocol?

Kiayias: First of all, I think energy usage is something we've taken care of already. Cardano and other systems are going in this direction. We have a sound infrastructure that is not energy-hungry. As for all the other questions, and there are many, governance, as I mentioned, is a key one. Scalability across all use cases is a key question. The next five-year window is enough time to develop systems that are resilient and reasonably capable of evolving, so that they are exceptionally long-lived. I'm confident that the research and development we've done so far, and that's going to happen in the next five years, will take us to that point.


See the article here:

Aggelos Kiayias interview: Will blockchain be good for the planet? - VentureBeat

Posted in Quantum Computing | Comments Off on Aggelos Kiayias interview: Will blockchain be good for the planet? – VentureBeat

Study Estimates Global Quantum Computing Market Earned $490M in 2021 – HPCwire

Posted: February 5, 2022 at 5:24 am

ST. PAUL, Minn., Feb. 4, 2022 – A new study conducted by Hyperion Research and sponsored by the Quantum Economic Development Consortium (QED-C) and QC Ware, with assistance from the European Quantum Industry Consortium and Quantum Industry Canada, estimates that the global quantum computing (QC) market earned $490 million in 2021. The market is anticipated to expand at an annual rate of 21.9 percent through 2024.
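For context, compounding those two figures gives the rough market size the forecast implies. This is an editorial back-of-envelope calculation, not a number reported in the study:

base_2021 = 490e6   # estimated 2021 QC market, in USD
cagr = 0.219        # reported annual growth rate through 2024
projection_2024 = base_2021 * (1 + cagr) ** 3
print(f"Implied 2024 market: ${projection_2024 / 1e6:.0f}M")  # roughly $888M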

The estimate is strongly data-driven, drawn from a survey of 112 quantum computing vendors headquartered in North America, Europe, the Asia/Pacific region, and the Middle East. Survey respondents spanned the range of the QC ecosystem, including QC algorithm and application software developers, QC hardware developers and product providers, and QC venture capital organizations.

This newest market study confirms that the global quantum computing sector will exhibit stable and robust growth for at least the next few years. A growing list of QC end users around the world likely will continue to fuel such growth, attracted by the increasing base of new and innovative QC-based applications and use cases critical to their overall advanced computing requirements, according to Bob Sorensen, Chief Analyst for Quantum Computing, Hyperion Research.


This study is the second annual QC market forecast by Hyperion Research and underwritten by QED-C and QC Ware. The findings, which were recently presented at the 2021 Q2B conference, will help inform decisions made by QC developers interested in rapidly changing QC market trends and opportunities; national-level policymakers tasked with QC-related R&D funding support, procurement policies, trade, and QC-specific security considerations; current and future QC end users looking to gauge the pace and progress of the sector; and various corporate and venture capital entities assessing the technology and market potential of the sector. Regular updates to these studies will track the growth of the QC industry and help vendors, investors, and policymakers understand the evolving landscape of the quantum computing ecosystem. Future studies will be presented at the annual Q2B conference, held in the second week of December in California.

Celia Merzbacher, Executive Director of QED-C, said: This data-driven study provides a snapshot of where the industry is now and where it is headed in the next three to five years. With broader coverage compared to the survey one year ago, the consistent results year-over-year provide added confidence in the results.

We believe that building practical quantum computing applications, which is our main mission, is not a zero-sum game, said Yianni Gamvros, Head of Business Development, QC Ware. QC Ware is actively investing and cares deeply about the health of the entire quantum computing ecosystem. Support for this report is one of many community initiatives that we are undertaking to ensure the entire space is healthy, transparent, and growing as quickly as possible.

About Quantum Economic Development Consortium

The Quantum Economic Development Consortium (QED-C) is an industry-driven consortium managed by SRI International with the mission to support a robust U.S. QIST industry and related supply chain. QED-C is supported by the National Institute of Standards and Technology (NIST) in the U.S. Department of Commerce and by more than 160 members, including more than 110 U.S. corporations from across the quantum supply chain: component suppliers and manufacturers, software and hardware system developers, service providers and end users. Visit https://quantumconsortium.org.

About QC Ware

QC Ware is a quantum software and services company focused on ensuring enterprises are prepared for the emerging quantum computing disruption. QC Ware specializes in the development of applications for near-term quantum computing hardware, with a team composed of some of the industry's foremost experts in quantum computing. Its growing network of customers includes AFRL, Aisin Group, Airbus, BMW Group, Equinor, Goldman Sachs, and Total. QC Ware Forge, the company's flagship quantum computing cloud service, is built for data scientists with no quantum computing background. It provides unique, performant, turnkey quantum computing algorithms. QC Ware is headquartered in Palo Alto, California, and supports its European customers through its subsidiary in Paris. QC Ware also organizes Q2B, the largest annual gathering of the international quantum computing community. Visit https://www.qcware.com.

About Hyperion Research

Hyperion Research provides data-driven research, analysis and recommendations for technologies, applications, and markets in high performance computing and emerging technology areas, such as quantum computing, to help organizations worldwide make effective decisions and seize growth opportunities. Research includes market sizing and forecasting, share tracking, segmentation, technology and related trend analysis, and both user and vendor analysis for technical server technology used for traditional HPC, high performance data analysis, and AI workloads. Bob Sorensen (bsorensen@hyperionres.com) is Chief Analyst for Quantum Computing. Visit https://hyperionresearch.com.

Source: Hyperion Research

Read more from the original source:

Study Estimates Global Quantum Computing Market Earned $490M in 2021 - HPCwire

Posted in Quantum Computing | Comments Off on Study Estimates Global Quantum Computing Market Earned $490M in 2021 – HPCwire

In Partnership with IBM, Canada to Get Its First Universal Quantum Computer – HPCwire

Posted: at 5:24 am

IBM today announced it will deploy its first quantum computer in Canada, putting Canada on a short list of countries that will have access to an IBM Quantum System One machine. The Canadian province of Quebec is partnering with IBM to establish the Quebec-IBM Discovery Accelerator to advance R&D within the fields of quantum computing, artificial intelligence, semiconductors and high-performance computing.

The collaboration will lay the foundation for novel energy materials and life science discoveries, according to the partners. The new technology hub is also focused on STEM education and skills development with an emphasis on supporting genomics and drug discovery.

The IBM Quantum System One is expected to be up and running at IBM's facility in Bromont, Quebec, by early next year, said Anthony Annunziata, IBM's director of accelerated discovery, in an interview with Reuters. IBM said the partnership will leverage the company's knowledge of semiconductor design and packaging.

The Quebec-IBM Discovery Accelerator is further proof of our commitment to building open communities of innovation to tackle the big problems of our time through a combination of quantum computing, AI and high-performance computing, all integrated through the hybrid cloud, said Dr. Darío Gil, senior vice president and director of Research, IBM.

The dedicated IBM quantum computer will pave the way for us to make incredible progress in areas such as artificial intelligence and modeling, said François Legault, Premier of Quebec. Quantum science is the future of computing. With our innovation zone, we're positioning ourselves at the forefront of this future.

IBM has in the last twelve months announced similar partnerships with the Cleveland Clinic, the University of Illinois Urbana-Champaign and the UK's Science and Technology Facilities Council Hartree Centre. The Canadian Quantum System One marks the fifth global installation that IBM has announced, following engagements in the U.S., Germany, Japan and South Korea.

Canada has made quantum computing a high-priority research target, seeking to hone its technical and strategic edge in the global marketplace. A year ago, the government of Canada extended a $40-million contribution to quantum computing firm D-Wave Systems Inc. as part of a larger $120 million investment in quantum computing technologies. (Based in British Columbia, D-Wave has long championed quantum annealing-based quantum computing, but recently announced it was expanding into gate-based quantum computing.)

While IBM has primarily provided its quantum computing platform as a service, the company launched the IBM Quantum System One in 2019 as an on-premises offering, billed as the world's first fully integrated universal quantum computing system.

Related:

IBM Quantum Update: Q System One Launch, New Collaborators, and QC Center Plans

IBM Bringing Quantum on-Prem for Cleveland Clinic

IBM Joins Effort to Build $200M AI, Cloud, Quantum Discovery Accelerator at the University of Illinois

Fraunhofer Goes Quantum: IBM's Quantum System One Comes to Europe

IBM and University of Tokyo Roll Out Quantum System One in Japan

IBM and Yonsei University Unveil Collaboration to Bring IBM Quantum System One to Korea

Here is the original post:

In Partnership with IBM, Canada to Get Its First Universal Quantum Computer - HPCwire

Posted in Quantum Computing | Comments Off on In Partnership with IBM, Canada to Get Its First Universal Quantum Computer – HPCwire

Multiverse Computing and Xanadu Partner on Quantum Software for Finance – insideHPC – insideHPC

Posted: at 5:24 am

TORONTO and SAN SEBASTIÁN, SPAIN – Multiverse Computing, a maker of quantum computing software for the financial industry, and Xanadu, a full-stack photonic quantum computing company, announced today a joint partnership to expand Multiverse's use of Xanadu's open-source software, PennyLane.

The partnership will enable Multiverse's financial services clients to develop applications with greater speed and ease. These applications will enhance financial and banking intelligence in areas ranging from risk modeling to market forecasting. Led by Xanadu's world-renowned team of scientists and developers, PennyLane has built a large and passionate following since its initial release three years ago.

PennyLane connects the most popular quantum computing platforms with the best machine learning tools using a device-agnostic and open-source approach, allowing users to train quantum computers the same way as neural networks.
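To give a sense of what that means in practice, the short sketch below uses PennyLane's public API to optimize a tiny variational circuit with gradient descent, the way one would train a neural network. The device name and circuit are illustrative choices, not details taken from the announcement:

import pennylane as qml
from pennylane import numpy as np

dev = qml.device("default.qubit", wires=2)  # built-in simulator; other backends plug in here

@qml.qnode(dev)
def circuit(params):
    # A small variational circuit: single-qubit rotations plus an entangling gate.
    qml.RY(params[0], wires=0)
    qml.RY(params[1], wires=1)
    qml.CNOT(wires=[0, 1])
    return qml.expval(qml.PauliZ(0))

opt = qml.GradientDescentOptimizer(stepsize=0.2)
params = np.array([0.1, 0.2], requires_grad=True)

for step in range(50):
    params = opt.step(circuit, params)  # gradients flow through the quantum circuit

print("optimized expectation value:", circuit(params))

Swapping "default.qubit" for another provider's device is the device-agnostic part: the training loop stays the same.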

With PennyLane at the core of Multiverse's product suite, our financial services clients will gain access to tools and best practices in quantum programming, backed by one of the world's largest open-source quantum communities, said Samuel Mugel, CTO of Multiverse Computing. We see PennyLane as a critical tool for validating our product efforts, enhancing our ability to rapidly test and deploy new quantum capabilities across our financial user community.

We continue to see broader adoption of PennyLane with innovative startups like Multiverse. Xanadu's open-source software is an excellent vehicle for accelerating development and reducing the time to market for new quantum products, said Rafal Janik, Xanadu's Head of Product. The collective knowledge of Multiverse's scientists and their clients provides feedback benefiting the broader open-source community and improving PennyLane.

Originally posted here:

Multiverse Computing and Xanadu Partner on Quantum Software for Finance - insideHPC - insideHPC

Posted in Quantum Computing | Comments Off on Multiverse Computing and Xanadu Partner on Quantum Software for Finance – insideHPC – insideHPC
