

Quantum Computers Finally Beat Supercomputers in 2019 – Discover Magazine

In his 2013 book, Schrödinger's Killer App, Louisiana State University theoretical physicist Jonathan Dowling predicted what he called "super-exponential growth." He was right. Back in May, during Google's Quantum Spring Symposium, computer engineer Hartmut Neven reported that the company's quantum computing chip had been gaining power at breakneck speed.

The subtext: We are venturing into an age of quantum supremacy, the point at which quantum computers outperform the best classical supercomputers in solving a well-defined problem.

Engineers test the accuracy of quantum computing chips by using them to solve a problem, then verifying the work with a classical machine. But in early 2019, that process became problematic, reported Neven, who runs Google's Quantum Artificial Intelligence Lab. Google's quantum chip was improving so quickly that his group had to commandeer increasingly large computers, and then clusters of computers, to check its work. It's become clear that eventually they'll run out of machines.

Case in point: Google announced in October that its 53-qubit quantum processor had needed only 200 seconds to complete a problem that would have required 10,000 years on a supercomputer.

Neven's group observed a double exponential growth rate in the chip's computing power over a few months. Plain old exponential growth is already really fast: It means that from one step to the next, the value of something multiplies. Bacterial growth can be exponential if the number of organisms doubles during an observed time interval. So can the computing power of classical computers under Moore's Law, the idea that it doubles roughly every year or two. But under double exponential growth, the exponents have exponents. That makes a world of difference: Instead of a progression from 2 to 4 to 8 to 16 to 32 bacteria, for example, a double-exponentially growing colony in the same time would grow from 2 to 4 to 16 to 256 to 65,536.
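The gap between the two growth rates can be checked with a few lines of arithmetic (a minimal numerical sketch, not tied to any particular chip's data):

```python
def exponential(steps, base=2):
    """Plain exponential growth: the value multiplies by `base` each step."""
    return [base ** (n + 1) for n in range(steps)]

def double_exponential(steps, base=2):
    """Double exponential growth: the exponent itself doubles each step,
    i.e. value = base ** (2 ** n), so each term is the square of the last."""
    return [base ** (2 ** n) for n in range(steps)]

print(exponential(5))         # [2, 4, 8, 16, 32]
print(double_exponential(5))  # [2, 4, 16, 256, 65536]
```

Five steps in, the double-exponential sequence is already more than two thousand times larger, which is why running out of classical machines to check the quantum chip's work was only a matter of time.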

Neven credits the growth rate to two factors: the predicted way that quantum computers improve on the computational power of classical ones, and the quick improvement of quantum chips themselves. Some began referring to this growth rate as Neven's Law. Some theorists say such growth was unavoidable.

We talked to Dowling (who suggests a more fitting moniker: the Dowling-Neven Law) about double exponential growth, his prediction and his underappreciated Beer Theory of Quantum Mechanics.

Q: You saw double exponential growth on the horizon long before it showed up in a lab. How?

A: Anytime there's a new technology, if it is worthwhile, eventually it kicks into exponential growth in something. We see this with the internet, we saw this with classical computers. You eventually hit a point where all of the engineers figure out how to make this work, miniaturize it, and then you suddenly run into exponential growth in terms of the hardware. If it doesn't happen, that hardware falls off the face of the Earth as a nonviable technology.

Q: So you weren't surprised to see Google's chip improving so quickly?

A: I'm only surprised that it happened earlier than I expected. In my book, I said within the next 50 to 80 years. I guessed a little too conservatively.

Q: You're a theoretical physicist. Are you typically conservative in your predictions?

A: People say I'm fracking nuts when I publish this stuff. I like to think that I'm the crazy guy who always makes the least conservative prediction. I thought this was far-out wacky stuff, and I was making the most outrageous prediction. That's why it's taking everybody by surprise. Nobody expected double exponential growth in processing power to happen this soon.

Q: Given that quantum chips are getting so fast, can I buy my own quantum computer now?

A: Most people think the quantum computer is a solved problem. That we can just wait, and Google will sell you one that can do whatever you want. But no. We're in the [prototype] era. The number of qubits is doubling every six months, but the qubits are not perfect. They fail a lot and have imperfections and so forth. But Intel and Google and IBM aren't going to wait for perfect qubits. The people who made the [first computers] didn't say, "We're going to stop making bigger computers until we figure out how to make perfect vacuum tubes."

Q: What's the big deal about doing problems with quantum mechanics instead of classical physics?

A: If you have 32 qubits, it's like you have 2^32 parallel universes that are working on parts of your computation. Or like you have a parallel processor with 2^32 processors. But you only pay the electric bill in our universe.
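The scaling behind that analogy is easy to check: an n-qubit register spans 2 to the power of n basis states, so a full classical description has to track that many amplitudes (a quick sketch):

```python
def state_space_size(n_qubits):
    """An n-qubit register has 2**n basis states; simulating it classically
    means tracking that many complex amplitudes at once."""
    return 2 ** n_qubits

assert state_space_size(1) == 2
assert state_space_size(32) == 4_294_967_296  # Dowling's 32-qubit example
# Each additional qubit doubles the classical bookkeeping:
assert state_space_size(33) == 2 * state_space_size(32)
```

This doubling per qubit is exactly why checking a quantum chip's work with classical machines becomes untenable so quickly.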

Q: Quantum mechanics gets really difficult, really fast. How do you deal with that?

A: Everybody has their own interpretation of quantum mechanics. Mine is the Many Beers Interpretation of Quantum Mechanics. With no beer, quantum mechanics doesn't make any sense. After one, two or three beers, it makes perfect sense. But once you get to six or 10, it doesn't make any sense again. I'm on my first bottle, so I'm in the zone.

[This story originally appeared in print as "The Rules of the Road to Quantum Supremacy."]


How This Breakthrough Makes Silicon-Based Qubit Chips The Future of Quantum Computing – Analytics India Magazine

Quantum computing has come a long way since its introduction in the 1980s. Researchers have always been on the lookout for better ways to enhance quantum computing systems, whether by making them cheaper or by making present quantum computers last longer. With the latest advancements in quantum computing, which has so far relied largely on superconducting bits, a new way of improving silicon quantum computing has come to light: using silicon spin qubits for better communication.

Until now, communication between different qubits was relatively slow: messages had to be passed from one bit to the next to carry the communication over to a chip a relatively large distance away.

Now, researchers at Princeton University have demonstrated two quantum computing silicon components, known as silicon spin qubits, interacting at a relatively large distance from each other. The study was published in the journal Nature on December 25, 2019.

Silicon spin qubits give quantum hardware the ability to interact and transmit messages across a certain distance, which provides the hardware with new capabilities. With signals transmitted over a distance, multiple quantum bits can be arranged in two-dimensional grids that can perform more complex calculations than existing quantum hardware can. This study will help qubits communicate better, not only on a chip but also from one chip to another, which will have a massive impact on speed.

Computers require as many qubits as possible communicating effectively with each other to take full advantage of quantum computing's capabilities. The quantum computers used by Google and IBM contain around 50 qubits and make use of superconducting circuits. Many researchers believe that silicon-based qubit chips are the future of quantum computing in the long run.

The quantum state of silicon spin qubits lasts longer than that of superconducting qubits (around five years), which is one of their significant advantages. In addition to lasting longer, silicon, which has plenty of application in everyday computers, is cheaper, another advantage over superconducting qubits, which cost a ton of money. A single qubit costs around $10,000, and that's before you consider research and development costs. With these costs in mind, the hardware for a universal quantum computer alone would come to at least $10bn.

But silicon spin qubits have their own challenges, which stem partly from the fact that they are incredibly small; by small, we mean made out of a single electron. This is a huge obstacle when it comes to establishing interconnects between multiple qubits while building a large-scale computer.

To counter the problem of interconnecting these extremely small silicon spin qubits, the Princeton team connected them with a wire that carries light, similar to the fibre-optic wires that deliver internet to homes. The wire carries a photon that picks up a message from one qubit and transmits it to the next. To put this in perspective, if the qubits are placed half a centimetre apart for this kind of communication, in real-world terms it would be like these qubits being around 750 miles away from each other.

The next step of the study was to establish a way of getting qubits and photons to speak the same language by tuning both qubits and the photon to the same frequency. Where previously the device's architecture allowed tuning only one qubit to one photon at a time, the team succeeded in tuning both qubits independently of each other while still coupling them to the photon.

"You have to balance the qubit energies on both sides of the chip with the photon energy to make all three elements talk to each other," said Felix Borjans, a graduate student and first author on the study, describing what he calls the challenging part of the work.

The researchers demonstrated entangling of electron spins in silicon separated by distances larger than the device housing them. This was a significant development when it comes to wiring these qubits and laying them out in silicon-based quantum microchips.

Communication between distant silicon-based qubit devices builds on the work of Petta's research team in 2010, which showed how to trap a single electron in quantum wells, as well as on work reported in the journal Nature in 2012 (transfer of quantum information from electron spins), in Science in 2016 (transmitting information from a silicon-based charge qubit to a photon), in Science in 2017 (nearest-neighbour exchange of information between qubits) and in Nature in 2018 (a silicon spin qubit exchanging information with a photon).

This demonstration of interactions between two silicon spin qubits is essential for the further development of quantum technology. It will help technologies like modular quantum computers and quantum networks. The team employed silicon and germanium, which are widely available in the market.


Sameer is an aspiring Content Writer. Occasionally writes poems, loves food and is head over heels with Basketball.


The Impact of Quantum Computing on Banking will be gigantic says Deltec Bank, Bahamas – Quantaneo, the Quantum Computing Source

Classical computing has progressed enormously, but even with that progression, there are still jobs that classical computers are not powerful enough to do. The answer looks set to come from quantum computing. In this post, we will look at what quantum computing is and how it could revolutionize a long-standing industry such as banking.

What is Quantum Computing?

Quantum computers are expected to be a new kind of technology that can solve complex problems well beyond the capabilities of traditional systems. If you take an everyday problem like climate change, the intricacies of solving it are incredibly complex. A standard computer does not have the power or ability to even get close to genuinely understanding everything that is going on. The main reason is the endless amounts of data that computers need to process to generate an accurate decision.

A quantum computer is often referred to as a supercomputer. It has highly advanced processing power that can take masses of variables into account, helping predict weather patterns and natural disasters in the case of climate change.

Brief Technical Summary

A typical computer stores information in what are known as bits. In quantum computing, these are known as qubits. Qubits have certain properties that mean a connected group of them can provide far more processing power than the same number of binary bits from classical computing. In short, where binary bits store 1s and 0s to handle a task, qubits can represent numerous possible combinations of 1 and 0 simultaneously.

Practical Example

An example of this could be running a travel agency. Let's say three people, Jenny, Anna and Steve, need to move from one place to another. For that purpose, there are two taxis, and the problem you want to solve is who gets into which taxi. However, we know that Jenny and Anna are friends, Jenny and Steve are enemies, and Anna and Steve are enemies.

The aim is to maximize the number of friend pairs and minimize the number of enemy pairs sharing the same taxi. A classical computer would store each possible solution with bits, one at a time, before being able to calculate a potential solution. A quantum computer, however, would use qubits to represent all the solutions at the same time, finding the best solution in a few milliseconds by piling everything into just one operation.

The difference here is that a traditional computer performs more and more calculations as the data scales up, whereas a quantum computer will only ever have to process one operation.
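The classical side of this toy problem is small enough to enumerate directly. The sketch below brute-forces every taxi assignment and scores it by friend pairs minus enemy pairs sharing a taxi (the names and relationships are from the example above; the scoring rule is one reasonable reading of "maximize friends, minimize enemies"):

```python
from itertools import product

people = ["Jenny", "Anna", "Steve"]
friends = {frozenset(["Jenny", "Anna"])}
enemies = {frozenset(["Jenny", "Steve"]), frozenset(["Anna", "Steve"])}

def score(assignment):
    """Score = (friend pairs sharing a taxi) - (enemy pairs sharing a taxi)."""
    s = 0
    for i, a in enumerate(people):
        for b in people[i + 1:]:
            if assignment[a] == assignment[b]:
                pair = frozenset([a, b])
                if pair in friends:
                    s += 1
                elif pair in enemies:
                    s -= 1
    return s

# A classical computer checks all 2**3 = 8 possible assignments one at a time.
best = max(
    (dict(zip(people, taxis)) for taxis in product([0, 1], repeat=3)),
    key=score,
)
assert score(best) == 1  # Jenny and Anna share a taxi, Steve rides alone
```

With three people the search space is only 8 assignments, but it doubles with every person added, which is exactly the scaling the article is gesturing at.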

In the real-world, one industry that could heavily benefit from this technology and processing power is banking.

Quantum Computing in Banking

In an article from Banco Bilbao Vizcaya Argentaria (BBVA) from October 2019, it was suggested that this kind of quantum computing power might fundamentally change the face of banking in time.

Encryption of personal data is critical to banking, with RSA-2048 being used at the highest levels. For a classical computer, finding the key to decrypt the algorithm would take 10^34 steps. To put that into context, a processor capable of a trillion operations per second would still take 317 billion years to resolve the problem. Realistically, that makes decryption impossible.

However, a quantum computer could solve the decryption in just 10^7 steps. If the computer were running at a million operations per second, the calculation would take only 10 seconds to complete. The potential of quantum computing in this context is quite amazing. That said, we are still a long way off having enough processing power to reach those heights, but experts are working on it.
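Reading the step counts as powers of ten (10^7 for the quantum case, 10^34 for the classical one), the arithmetic can be checked back-of-the-envelope style; the quantum figure works out to exactly 10 seconds, and the classical figure to an astronomically long time:

```python
# Quantum case: 10^7 steps at a million operations per second.
quantum_steps = 10 ** 7
quantum_ops_per_sec = 10 ** 6
quantum_seconds = quantum_steps / quantum_ops_per_sec
assert quantum_seconds == 10.0  # the article's 10 seconds

# Classical case: 10^34 steps at a trillion operations per second.
classical_steps = 10 ** 34
classical_ops_per_sec = 10 ** 12
classical_seconds = classical_steps / classical_ops_per_sec
classical_years = classical_seconds / (365.25 * 24 * 3600)
assert classical_years > 1e11  # hundreds of billions of years at the very least
```

Either way the conclusion stands: at classical speeds the key is effectively unrecoverable, while the quantum step count brings it into human timescales.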

Barclays

Researchers at Barclays Bank, in collaboration with IBM, have created a proof-of-concept quantum-optimized application. The solution revolves around the transaction settlement process. Settlement works on a transaction-by-transaction basis: trades are pushed into a queue and settled in batches. During a processing window, as many trades as possible from the queue are settled.

"Trades are complex by nature," according to Lee Braine, director of research and engineering at Barclays. Traders can tap into funds before a transaction has cleared, and trades are settled if funding is available or if there is some sort of credit collateral facility.

For a small number of trades, the settlement problem could, in theory, be done in your head. As you get up to 10 or 20 transactions, you might need pen and paper. Beyond that, we move into classical computing. But as we get to hundreds of trades, even the machines begin to experience limitations.

A bit like the travel agency example given earlier, a quantum computer could handle masses of complex aspects of trading. Using a seven-qubit system, the team could identify problems of sufficient complexity. The same calculations would need about 200 traditional computers.

JP Morgan

Using an IBM machine, researchers at JP Morgan have demonstrated that they can simulate the future value of a financial product. They are testing the use of quantum computers to speed up intensive pricing calculations that would take traditional machines hours to compute. As portfolios become larger, the algorithms grow in complexity and could reach a point where they are impossible to calculate.

The research by the team has shown that a commercial-grade quantum computer can run the same calculations in a matter of seconds.
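The kind of intensive pricing calculation being sped up here is typically Monte Carlo valuation. The sketch below is a minimal classical version (illustrative only, not JP Morgan's actual model): it estimates a product's expected future value by averaging many simulated price paths, the workload that quantum algorithms aim to accelerate.

```python
import math
import random

def expected_future_value(spot, drift, volatility, horizon, n_paths, seed=0):
    """Estimate the expected price at `horizon` under geometric Brownian
    motion by averaging n_paths simulated end prices (classical Monte Carlo)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)  # one random shock per simulated path
        total += spot * math.exp(
            (drift - 0.5 * volatility ** 2) * horizon
            + volatility * math.sqrt(horizon) * z
        )
    return total / n_paths

estimate = expected_future_value(
    spot=100.0, drift=0.05, volatility=0.2, horizon=1.0, n_paths=50_000
)
# Analytically, the expected value is spot * exp(drift * horizon) ≈ 105.13,
# and the Monte Carlo estimate should land close to it.
assert abs(estimate - 100.0 * math.exp(0.05)) < 1.0
```

The cost of this approach grows with the number of paths needed for a given accuracy; quantum amplitude-estimation techniques promise the same accuracy with quadratically fewer samples, which is the speed-up banks are probing.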

Summary

According to Deltec Bank, Bahamas, banks are successfully testing quantum computers to solve problems that were previously very resource-intensive or impossible to complete. Although the technology is still some years away from changing the way banks calculate financial models, due to complex hardware requirements, it is important to start testing now.

IBM itself has stated that it is a while away from a perfect solution, with big breakthroughs still required, but the time will certainly come.


Information teleported between two computer chips for the first time – New Atlas

Scientists at the University of Bristol and the Technical University of Denmark have achieved quantum teleportation between two computer chips for the first time. The team managed to send information from one chip to another instantly without them being physically or electronically connected, in a feat that opens the door for quantum computers and quantum internet.

This kind of teleportation is made possible by a phenomenon called quantum entanglement, where two particles become so entwined with each other that they can communicate over long distances. Changing the properties of one particle will cause the other to instantly change too, no matter how much space separates the two of them. In essence, information is being teleported between them.

Hypothetically, there's no limit to the distance over which quantum teleportation can operate, and that raises some strange implications that puzzled even Einstein himself. Our current understanding of physics says that nothing can travel faster than the speed of light, and yet, with quantum teleportation, information appears to break that speed limit. Einstein dubbed it "spooky action at a distance."

Harnessing this phenomenon could clearly be beneficial, and the new study helps bring that closer to reality. The team generated pairs of entangled photons on the chips, and then made a quantum measurement of one. This observation changes the state of the photon, and those changes are then instantly applied to the partner photon in the other chip.
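The correlation at the heart of this can be illustrated with a toy simulation (a minimal, purely classical sketch of the measurement statistics of an entangled Bell pair; it is not the study's actual photonic setup):

```python
import random

def sample_bell_pair(n_samples, seed=0):
    """Sample measurement outcomes of the Bell state (|00> + |11>)/sqrt(2).

    The two basis states 00 and 11 each carry probability 1/2, and 01 and 10
    carry none, so the two qubits always agree -- the perfect correlation
    that entanglement-based protocols like teleportation exploit."""
    rng = random.Random(seed)
    outcomes = []
    for _ in range(n_samples):
        bit = rng.randint(0, 1)  # measurement collapses to 00 or 11, never 01 or 10
        outcomes.append((bit, bit))
    return outcomes

pairs = sample_bell_pair(1000)
assert all(a == b for a, b in pairs)  # measurements are perfectly correlated
```

Note that the correlation alone carries no usable message; teleportation still requires a classical channel to complete the transfer, which is why no information actually travels faster than light.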

"We were able to demonstrate a high-quality entanglement link across two chips in the lab, where photons on either chip share a single quantum state," says Dan Llewellyn, co-author of the study. "Each chip was then fully programmed to perform a range of demonstrations which utilize the entanglement. The flagship demonstration was a two-chip teleportation experiment, whereby the individual quantum state of a particle is transmitted across the two chips after a quantum measurement is performed. This measurement utilizes the strange behavior of quantum physics, which simultaneously collapses the entanglement link and transfers the particle state to another particle already on the receiver chip."

The team reported a teleportation success rate of 91 percent, and managed to perform some other functions that will be important for quantum computing. That includes entanglement swapping (where states can be passed between particles that have never directly interacted via a mediator), and entangling as many as four photons together.

Information has been teleported over much longer distances before: first across a room, then 25 km (15.5 mi), then 100 km (62 mi), and eventually over 1,200 km (746 mi) via satellite. It's also been done between different parts of a single computer chip before, but teleporting between two different chips is a major breakthrough for quantum computing.

The research was published in the journal Nature Physics.

Source: University of Bristol


20 technologies that could change your life in the next decade – Economic Times

The decade that's knocking on our doors now, the 2020s, is likely to be a time when science fiction manifests itself in our homes and roads and skies as viable, everyday technologies. Cars that can drive themselves. Meat that is derived from plants. Robots that can be fantastic companions both in bed and outside.

Implanting kidneys that can be 3-D printed using your own biomaterial. Using gene editing to eradicate diseases, increase crop yields or fix genetic disorders in human beings. Inserting a swarm of nanobots that can cruise through your bloodstream to monitor parameters or unblock arteries. Zipping between Delhi and New York on a hypersonic jet. All of this is likely to become possible, or substantially closer to becoming a reality, in the next 10 years.

Ideas that have been the staple of science fiction for decades (artificial intelligence, universal translators, sex robots, autonomous cars, gene editing and quantum computing) are at the cusp of maturity now. Many are ready to move out of labs and enter the mainstream. Expect the next decade to witness breakout years for the world of technology.

Read on:

The 2020s: A new decade promising miraculous tech innovations

Universal translators: End of language barrier

Climate interventions: Clearing the air from carbon

Personalised learning: Pedagogy gets a reboot with AI

Made in a Printer: 3-D printing going to be a new reality

Digital money: End of cash is near, cashless currencies are in vogue

Singularity: An era where machines will out-think humans

Mach militaries: Redefining warfare in the 2020s

5G & Beyond: Ushering a truly connected world

Technology: Solving the problem of clean water

Quantum computing: Beyond the power of classical computing

Nanotechnology: From science fiction to reality

Power Saver: Energy-storage may be the key to maximise power generation

Secret code: Gene editing could prove to be a game-changer

Love in the time of Robots: The rise of sexbots and artificial human beings

Wheels of the future: Flying cars, hyperloops and e-highways will transform how people travel

New skies, old fears: The good, bad & ugly of drones

Artificial creativity: Computer programs could soon churn out books, movies and music

Meat alternatives: Alternative meat market is expected to grow 10 times by 2029

Intelligent robots & cyborg warriors will lead the charge in battle

Why we first need to focus on the ethical challenges of artificial intelligence

It's time to reflect honestly on our motivations for innovation

India's vital role in new space age

Plastic waste: Environment-friendly packaging technologies will gain traction


Top 5: Scientific Breakthroughs That Made 2019 an Unforgettable Year of Human Progress – The Weather Channel

Facial reconstruction of A. anamensis by John Gurche, based on a 38-lakh-year-old (3.8-million-year-old) hominin cranium.

From discovering cures for life-threatening diseases to exploring outer space, from unearthing new facts about human history to making incredible strides in artificial intelligence, humanity achieved exceptional breakthroughs in the field of science and technology in 2019.

As the year comes to an end, it is time to look back at some of those glorious scientific revolutions that will shape our future. Here are our picks for the most significant scientific advancements of 2019:

5. Hello Sun? Earthlings are going beyond your influence!

A simulated landing process of Chang'e-4 lunar probe at the Beijing Aerospace Control Center on Jan. 3, 2019.

Launched in January 2006, the interplanetary space probe New Horizons from the US space agency NASA steered past the Kuiper Belt object 486958 Arrokoth (then nicknamed Ultima Thule) on January 1, 2019. The Kuiper Belt is the region beyond the known planetary system of the solar system, and this was the farthest flyby ever conducted by any human-made spacecraft.

Also this year, on November 4, NASA's Voyager 2 reached the interstellar medium, the space between star systems well beyond the influence of our solar system. Voyager 1 had achieved this feat earlier, in 2012; Voyager 2 itself was launched in 1977.

Also, China's moon mission, Chang'e 4, successfully made a soft landing on the far side of the Moon, becoming the first mission ever to do so. Named after the Chinese moon goddess, the mission is attempting to determine the age and composition of the Moon's unexplored region.

4. Quantum leap in computing

Representational image

Of all the progress made in computing research in 2019, the biggest breakthrough was perhaps the realisation of quantum computing.

Right in the first month of 2019, technology giant IBM unveiled Q System One, the first quantum computer outside a research lab, bringing a rather abstract concept into the public imagination. Unlike the bits of information in the computers we use, a quantum computer uses quantum bits, or qubits, enabling an exponential rise in the amount of data it can process and store.


Further, a team of researchers from Australia and Singapore developed a quantum-powered machine that can accurately simulate future outcomes arising from different sets of alternatives. Meanwhile, another study, at Yale University, showed that we can catch a qubit mid-quantum-jump and alter its outcome. This was a significant leap in fine-tuning quantum systems, as outcomes need not be completely random and abrupt.

While other research also helped in conceptualising quantum drives with immense storage capacity, the biggest news was from Google. The search giant confirmed in October that it had achieved quantum supremacy. To put things in perspective, researchers at Google claim that their quantum computer solved in three minutes a problem that would have taken 10,000 years even for a supercomputer.

3. Revolutionary research in medical science

Representational image

Medical researchers are always striving to push the envelope of human resilience and efficiency. The year 2019 saw progress on both these fronts, with the development of potential cures for multiple life-threatening diseases and gene-editing promising to be more effective than ever.

This year, twin drugs were developed for Ebola and were found to be effective in nearly 90% of cases, making the seemingly incurable condition treatable. Researchers also discovered potential cures for "bubble boy" disease, a condition where babies are born without disease-fighting immune cells; for cystic fibrosis, a painful, debilitating lung disease; and for pancreatic cancer.

Moreover, after decades, HIV research finally yielded some fruitful results this year, with patients responding positively to treatments. Twelve years after the first patient was cured of the HIV infection that causes AIDS, another patient was cured in March 2019. Researchers had been relentlessly trying to replicate the treatment that first cured the infection in 2007.

Furthermore, using CRISPR gene-editing technology, scientists have found potential treatments for cancer patients, even those for whom the standard procedure was not successful. In October, researchers produced scientific evidence that new gene-editing technology has the potential to correct up to 89% of genetic defects such as sickle cell anaemia.

2. Imaging the faraway invisible wonder

Image of the black hole at the center of galaxy M87

Named the top scientific breakthrough of 2019 by the journal Science, this incredible photograph of a black hole was taken using eight radio telescopes around the world, combined to form a virtual instrument effectively the size of the Earth itself.

The first-ever image of a black hole, released on April 10 this year, was taken by the Event Horizon Telescope (EHT) collaboration. The gravity of a black hole is so strong that even light cannot escape its pull, and capturing an image of something that does not emit light is no easy task.

EHT imaged the silhouette (or shadow) of a massive black hole at the centre of the galaxy M87, located 55 million light-years from Earth. The black hole has enormous mass, a whopping 6,500 million times the mass of the Sun. The image shows a ring of light coming from the gas falling into the event horizon (the boundary beyond which nothing can escape) of the black hole.

1. Retracing the origins of humans

Craniofacial reconstruction process of Rakhigarhi cemetery individuals (BR02 and BR36).

Humankind's fascination with the question 'Where did we come from?' has persisted over centuries. Yet some of the biggest breakthroughs in answering this question were made this year, starting with the discovery of a previously unknown species of ancient humans. Named Homo luzonensis, this small-bodied bipedal species was discovered in the Philippines and is said to have lived on the island of Luzon 50,000 to 67,000 years ago.

In May, researchers deciphered a four-decade-old mystery by identifying a 160,000-year-old human jawbone found on the Tibetan Plateau nearly 40 years ago. The fossil was of a Denisovan, an enigmatic relative of modern humans that ranged across Asia until some 50,000 years ago. The discovery, made despite the absence of DNA in the jaw, helped scientists understand this species better. In September, another group of researchers further refined the picture of the Denisovans, whose traces still linger in the DNA of a few modern humans.

In August, descriptions of the nearly 38-lakh-year-old remains of a skull belonging to a bipedal ancestor of humans baffled the world. This skull proved that two of our ancestor species, A. anamensis and A. afarensis, may have overlapped for at least 100,000 years. This evidence of the two species existing on a similar timescale busts the long-held belief that human evolution follows a single lineage, i.e. one species coming after the other.

In October, in a first-of-its-kind attempt, scientists generated an accurate facial representation of people from the Indus Valley Civilisation. Another important study showed that the ancestral homeland of every human alive today traces back to a region south of the Zambezi River in northern Botswana. Building on previous genetic evolution studies, the researchers used ethnolinguistic and geographic frequency distribution data from the genomes of over 1,000 southern Africans to trace the origin of modern humans.

Exponential growth continues

India, too, has contributed immensely across scientific domains over the past few years and is now behind only China and the US in the number of published research studies. Building exponentially on the success of previous decades, scientists around the world have made immense contributions, from improving our daily lives to unravelling the mysteries of the universe.

With so much exciting research pouring in from all corners of the world, it isn't easy to even keep track of the incredible pace at which science is progressing. While we have tried to cover a few iconic annual scientific highlights in this article, there are thousands of other important discoveries, studies and achievements that shaped science in 2019.

And as yet another potential-filled year dawns on our planet, The Weather Channel India will keep you tuned in about all the exciting news, updates and breakthroughs from the world of science.

So for your daily dose of weather, environment, space and science stories, stay tuned to weather.com and stay curious!

Excerpt from:

Top 5: Scientific Breakthroughs That Made 2019 an Unforgettable Year of Human Progress - The Weather Channel

2020 will be the beginning of the tech industry’s radical revisioning of the physical world – TechCrunch

These days it's easy to bemoan the state of innovation and the dynamism coming from America's cradle of technological development in Silicon Valley.

The same companies that were praised for reimagining how people organized and accessed knowledge, interacted publicly, shopped for goods and services, conducted business, and even the devices on which all of these things are done, now find themselves criticized for the ways in which they've abused the tools they've created to become some of the most profitable and wealthiest ventures in human history.

Before the decade was even half over, the concern over the poverty of purpose inherent in Silicon Valley's inventions was given voice by Peter Thiel, a man who has made billions financing the creation of the very technologies whose paucity he then bemoaned.

"We are no longer living in a technologically accelerating world," Thiel told an audience at Yale University in 2013. "There is an incredible sense of deceleration."

In the six years since Thiel spoke to that audience, the only acceleration has been in the pace of technology's contribution to the world's decline.

However, there are some investors who think that the next wave of big technological breakthroughs are just around the corner and that 2020 will be the year that they enter the public consciousness in a real way.

These are the venture capitalists who invest in companies developing so-called frontier technologies (or "deep tech"): things like computational biology, artificial intelligence or machine learning, robotics, the space industry, advanced manufacturing using 3D printing, and quantum computing.

Continued advancements in computational power, data management, imaging and sensing technologies, and materials science are bridging researchers' ability to observe and understand phenomena with the potential to manipulate them in commercially viable ways.

As a result, increasing numbers of technology investors are seeing less risk and more reward in these formerly arcane areas of innovation investing.

"Established funds will spin up deep tech teams and more funds will be founded to address this market, especially where deep tech meets sustainability," according to Fifty Years investor Seth Bannon. "This shift will be driven from the bottom up (it's where the best founder talent is heading) and also from the top down (as more and more institutional LPs want to allocate capital to this space)."

In some ways, these investments are going to be driven by political necessity as much as technological advancement, according to Matt Ocko, a managing partner at the venture firm DCVC.

Earlier this year, DCVC closed on $725 million for two investment funds focused on deep technology investing. For Ocko, the geopolitical reality of continuing tensions with China will drive adoption of new technologies that will remake the American industrial economy.

"Whether we like it or not, US-government-driven scrutiny of China-based technology will continue in 2020. Less of it will be allowed to be deployed in the US, especially in areas of security, networking, autonomous transportation and space intelligence," writes Ocko in an email. "At the same time, US DoD efforts to streamline procurement processes will result in increasingly tighter partnerships between the DoD and the tech sector. The need to bring complex manufacturing, comms, and semiconductor technology home to the US will support a renaissance in distributed manufacturing/advanced manufacturing tech and a strong wave of semiconductor and robotic innovation."

Original post:

2020 will be the beginning of the tech industry's radical revisioning of the physical world - TechCrunch

How quantum computing could beat climate change – World Economic Forum

Imagine being able to cheaply and easily suck carbon directly out of our atmosphere. Such a capability would be hugely powerful in the fight against climate change and would advance us towards the ambitious global climate goals that have been set.

Surely that's science fiction? Well, maybe not. Quantum computing may be just the tool we need to design such a clean, safe and easy-to-deploy innovation.

In 1995 I first learned that quantum computing might bring about a revolution akin to the agricultural, industrial and digital ones we've already had. Back then it seemed far-fetched that quantum mechanics could be harnessed to such momentous effect; given recent events, it seems much, much more likely.

Much excitement followed Google's recent announcement of quantum supremacy: "[T]he point where quantum computers can do things that classical computers can't, regardless of whether those tasks are useful."

The question now is whether we can develop the large-scale, error-corrected quantum computers that are required to realize profoundly useful applications.

The good news is we already know concretely how to use such fully fledged quantum computers for many important tasks across science and technology. One such task is the simulation of molecules to determine their properties, interactions, and reactions with other molecules, a.k.a. chemistry, the very essence of the material world we live in.

While simulating molecules may seem like an esoteric pastime for scientists, it does, in fact, underpin almost every aspect of the world and our activity in it. Understanding their properties unlocks powerful new pharmaceuticals, batteries, clean-energy devices and even innovations for carbon capture.

To date, we haven't found a way to simulate large complex molecules with conventional computers, and we never will, because the problem grows exponentially with the size or complexity of the molecules being simulated. Crudely speaking, if simulating a molecule with 10 atoms takes a minute, a molecule with 11 takes two minutes, one with 12 atoms takes four minutes, and so on. This exponential scaling quickly renders a traditional computer useless: simulating a molecule with just 70 atoms would take longer than the lifetime of the universe (13 billion years).
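The arithmetic above is easy to check with a few lines of Python. This is a back-of-envelope illustration of the article's "one extra atom doubles the runtime" rule, not a real chemistry simulation:

```python
def classical_sim_minutes(atoms, base_atoms=10, base_minutes=1.0):
    """Estimated runtime in minutes under simple double-per-atom scaling."""
    return base_minutes * 2 ** (atoms - base_atoms)

MINUTES_PER_YEAR = 60 * 24 * 365
AGE_OF_UNIVERSE_YEARS = 13e9  # ~13 billion years, as cited above

# 70 atoms means 60 doublings beyond the 10-atom baseline.
years_for_70_atoms = classical_sim_minutes(70) / MINUTES_PER_YEAR
print(f"70 atoms: ~{years_for_70_atoms:.1e} years")
print("Exceeds age of universe:", years_for_70_atoms > AGE_OF_UNIVERSE_YEARS)
```

Running this gives a figure in the trillions of years, comfortably beyond the 13-billion-year lifetime of the universe, consistent with the claim above.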

This is infuriating, not just because we can't simulate existing important molecules that we find (and use) in nature, including within our own bodies, and thereby understand their behaviour; but also because there is an infinite number of new molecules that we could design for new applications.

That's where quantum computers could come to our rescue, thanks to the late, great physicist Richard Feynman. Back in 1981, he recognized that quantum computers could do what would be impossible for classical computers when it comes to simulating molecules. Thanks to recent work by Microsoft and others, we now have concrete recipes for performing these simulations.

One area of urgent practical importance where quantum simulation could be hugely valuable is in meeting the UN Sustainable Development Goals (SDGs): not only in health, energy, industry, innovation and infrastructure, but also in climate action. Examples include room-temperature superconductors (which could reduce the roughly 10% of energy production lost in transmission), more efficient processes for producing the nitrogen-based fertilizers that feed the world's population, and new, far more efficient batteries.

One very powerful application of molecular simulation is the design of new catalysts that speed up chemical reactions. It is estimated that 90% of all commercially produced chemical products involve catalysts (in living systems, they're called enzymes).

Annual CO2 emissions globally in 2017

A catalyst for scrubbing carbon dioxide directly from the atmosphere could be a powerful tool in tackling climate change. Although CO2 is captured naturally, by oceans and trees, CO2 production has exceeded these natural capture rates for many decades.

The best way to tackle CO2 is to not release more of it; the next best thing is to capture it. "While we can't literally turn back time, [it] is a bit like rewinding the emissions clock," according to Torben Daeneke at RMIT University.

There are known catalysts for carbon capture, but most contain expensive precious metals or are difficult or expensive to produce and/or deploy. "We currently don't know many cheap and readily available catalysts for CO2 reduction," says Ulf-Peter Apfel of Ruhr-University Bochum.

Given the infinite number of candidate molecules that are available, we are right to be optimistic that there is a catalyst (or indeed many) to be found that will do the job cheaply and easily. Finding such a catalyst, however, is a daunting task without the ability to simulate the properties of candidate molecules.

And thats where quantum computing could help.

We might even find a cheap catalyst that enables efficient carbon dioxide recycling and produces useful by-products like hydrogen (a fuel) or carbon monoxide (a common source material in the chemical industry).

We can currently simulate small molecules on prototype quantum computers with up to a few dozen qubits (the quantum equivalent of classical computer bits). But scaling this to useful tasks, like discovering new CO2 catalysts, will require error correction and simulations on the order of one million qubits.

It's a challenge I have long believed will only be met on any human timescale, certainly by the 2030 target for the SDGs, if we use the existing manufacturing capability of the silicon chip industry.

At a meeting of the World Economic Forum's Global Future Councils last month, a team of experts from across industry, academia and beyond assembled to discuss how quantum computing can help address global challenges, as highlighted by the SDGs, and the climate in particular.

As co-chair of the Global Future Council on Quantum Computing, I was excited that we were unanimous in agreeing that the world should devote more resources, including in education, to developing the powerful quantum computing capability that could help tackle climate change, meet the SDGs more widely, and much more. We enthusiastically called for more international cooperation to develop this important technology on the 2030 timescale so that it can have an impact on delivering the SDGs, in particular on climate.

So the real question for me is: can we do it in time? Will we make sufficiently powerful quantum computers on that timeframe? I believe so. There are, of course, many other things we can and should do to tackle climate change, but developing large-scale, error-corrected quantum computers is a hedge we cannot afford to go without.

License and Republishing

World Economic Forum articles may be republished in accordance with our Terms of Use.

The views expressed in this article are those of the author alone and not the World Economic Forum.

Here is the original post:

How quantum computing could beat climate change - World Economic Forum

Quantum computing leaps ahead in 2019 with new power and speed – CNET

A close-up view of the IBM Q quantum computer. The processor is in the silver-colored cylinder.

Quantum computers are getting a lot more real. No, you won't be playing Call of Duty on one anytime soon. But Google, Amazon, Microsoft, Rigetti Computing and IBM all made important advances in 2019 that could help bring computers governed by the weird laws of atomic-scale physics into your life in other ways.

Google's declaration of quantum supremacy was the most headline-grabbing moment in the field. The achievement -- more limited than the grand term might suggest -- demonstrated that quantum computers could someday tackle computing problems beyond the reach of conventional "classical" computers.

Proving quantum computing progress is crucial. We're still several breakthroughs away from realizing the full vision of quantum computing. Qubits, the tiny stores of data that quantum computers use, need to be improved. So do the finicky control systems used to program and read quantum computer results. Still, today's results help justify tomorrow's research funding to sustain the technology when the flashes of hype inevitably fizzle.


Quantum computers will live in data centers, not on your desk, when they're commercialized. They'll still be able to improve many aspects of your life, though. Money in your retirement account might grow a little faster and your packages might be delivered a little sooner as quantum computers find new ways to optimize businesses. Your electric-car battery might be a little lighter and new drugs might help you live a little longer after quantum computers unlock new molecular-level designs. Traffic may be a little lighter from better simulations.

But Google's quantum supremacy step was just one of many needed to fulfill quantum computing's promise.

"We're going to get there in cycles. We're going to have a lot of dark ages in which nothing happens for a long time," said Forrester analyst Brian Hopkins. "One day that new thing will really change the world."

Among the developments in 2019:

Classical computers, which include everything from today's smartwatches to supercomputers that occupy entire buildings, store data as bits that represent either a 1 or a 0. Quantum computers use a different basic unit, the qubit, which can represent a combination of 1 and 0 through an idea called superposition.

Ford and Microsoft adapted a quantum computing traffic simulation to run on a classical computer. The result: a traffic routing algorithm that could cut Seattle traffic congestion by 73%.

The states of multiple qubits can be linked, letting quantum computers explore lots of possible solutions to a problem at once. With each new qubit added, a quantum computer can explore double the number of possible solutions, an exponential increase not possible with classical machines.
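The doubling claim can be made concrete with a minimal sketch (my own illustration, not tied to any particular quantum hardware): an n-qubit register is described by 2**n complex amplitudes, so each added qubit doubles the amount of state a classical machine would have to track.

```python
def register_amplitudes(n_qubits):
    """Equal superposition over all 2**n basis states of an n-qubit register."""
    dim = 2 ** n_qubits
    amp = (1 / dim) ** 0.5  # each amplitude is 1/sqrt(2**n)
    return [amp] * dim

for n in range(1, 5):
    state = register_amplitudes(n)
    print(n, "qubit(s) ->", len(state), "amplitudes")

# However many qubits there are, the measurement probabilities sum to 1.
total_prob = sum(a * a for a in register_amplitudes(10))
print("total probability:", round(total_prob, 9))
```

The list length (and hence the classical memory needed to represent the state) doubles with every qubit, which is the exponential growth the paragraph above describes.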

Quantum computers, however, are finicky. It's hard to get qubits to remain stable long enough to return useful results. The act of communicating with qubits can perturb them. Engineers hope to add error correction techniques so quantum computers can tackle a much broader range of problems.

Plenty of people are quantum computing skeptics. Even some fans of the technology acknowledge we're years away from high-powered quantum computers. But already, quantum computing is a real business. Samsung, Daimler, Honda, JP Morgan Chase and Barclays are all quantum computing customers. Spending on quantum computers should reach hundreds of millions of dollars in the 2020s, and tens of billions in the 2030s, according to forecasts by Deloitte, a consultancy. China, Europe, the United States and Japan have sunk billions of dollars into investment plans. Ford and Microsoft say traffic simulation technology for quantum computers, adapted to run on classical machines, already is showing utility.

Right now quantum computers are used mostly in research. But applications with mainstream results are likely coming. The power of quantum computers is expected to allow for the creation of new materials, chemical processes and medicines by giving insight into the physics of molecules. Quantum computers will also help for greater optimization of financial investments, delivery routes and flights by crunching the numbers in situations with a large number of possible courses of action.

They'll also be used for cracking today's encryption, an idea spy agencies love, even if you might be concerned about losing your privacy or some snoop getting your password. Work on new cryptography adapted for a quantum computing future is already underway.

Another promising application is artificial intelligence, though that may be years in the future.

"Eventually we'll be able to reinvent machine learning," Forrester's Hopkins said. But it'll take years of steady work in quantum computing beyond the progress of 2019. "The transformative benefits are real and big, but they are still more sci-fi and theory than they are reality."

Read the rest here:

Quantum computing leaps ahead in 2019 with new power and speed - CNET

Quantum computing will be the smartphone of the 2020s, says Bank of America strategist – MarketWatch

When asked what invention will be as revolutionary in the 2020s as smartphones were in the 2010s, Bank of America strategist Haim Israel said, without hesitation: quantum computing.

At the bank's annual year-ahead event last week in New York, Israel qualified his prediction, arguing in an interview with MarketWatch that the timing of the smartphone's arrival on the scene in the mid-2000s, and its massive impact on the American business landscape in the 2010s, doesn't line up neatly with quantum-computing breakthroughs, which are only now being seen, just a few weeks before the start of the 2020s.

The iPhone debuted back in 2007, enabling its real impact to be felt in the 2010s, he said, while the first business applications for quantum computing won't be seen until toward the end of the coming decade.

But, Israel argued, when all is said and done, quantum computing could be an even more radical technology in terms of its impact on businesses than the smartphone has been. "This is going to be a revolution," he said.

Quantum computing is a nascent technology based on quantum theory in physics, which explains the behavior of particles at the subatomic level and holds that, until observed, these particles can exist in different places at the same time. While normal computers store information as ones and zeros, quantum computers are not limited by the binary nature of current data processing and so can provide exponentially more computing power.

"Quantum things can be in multiple places at the same time," Chris Monroe, a University of Maryland physicist and founder of IonQ, told the Associated Press. "The rules are very simple, they're just confounding."

In October, Alphabet Inc. (GOOG) subsidiary Google claimed to have achieved a breakthrough by using a quantum computer to complete in 200 seconds, on a 53-qubit quantum computing chip, a calculation it estimated would take the fastest current supercomputer 10,000 years. Earlier this month, Amazon.com Inc. (AMZN) announced its intention to collaborate with experts to develop quantum computing technologies that can be used in conjunction with its cloud computing services. International Business Machines Corp. (IBM) and Microsoft Corp. (MSFT) are also developing quantum computing technology.

Israel argued these tools will revolutionize several industries, including health care, the internet of things and cyber security. He said that pharmaceutical companies are most likely to be the first commercial users of these devices, given the explosion of data created by health care research.

Pharma companies are right now subject to Moore's Law in reverse, he said. They are seeing the cost of drug development double every nine years as the amount of data on the human body becomes ever more onerous to process. Data on genomics doubles every 50 days, he added, arguing that only quantum computers will be able to solve the pharmaceutical industry's big-data problem.
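The mismatch Israel describes is easy to see with a couple of lines of arithmetic (my own back-of-envelope calculation using the doubling periods quoted above, with Moore's Law taken as a doubling roughly every two years):

```python
def annual_growth_factor(doubling_days):
    """How many times a quantity multiplies in one year, given its doubling period."""
    return 2 ** (365 / doubling_days)

genomics = annual_growth_factor(50)     # genomic data: doubles every 50 days
moores_law = annual_growth_factor(730)  # compute power: doubles ~every 2 years

print(f"Genomics data grows ~{genomics:.0f}x per year")
print(f"Moore's-law compute grows ~{moores_law:.2f}x per year")
```

Data growing well over a hundredfold per year against compute growing less than 1.5x per year is the gap that, on this argument, only quantum computers can close.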

Quantum computing will also have a major impact on cybersecurity, an issue that affects nearly every major corporation today. Current cybersecurity relies on cryptographic algorithms, but quantum computing's ability to solve these equations in a fraction of the time a normal computer takes will render current cybersecurity methods obsolete.

"In the future, even robust cryptographic algorithms will be substantially weakened by quantum computing, while others will no longer be secure at all," according to Swaroop Sham, senior product marketing manager at Okta.

For investors, Israel said, it is key to realize that the first one or two companies to develop commercially applicable quantum computing will be richly rewarded with access to untold amounts of data, which will only make their software services more valuable to potential customers in a virtuous circle.

"What we've learned this decade is that whoever controls the data will win big time," he said.

Read the original post:

Quantum computing will be the smartphone of the 2020s, says Bank of America strategist - MarketWatch

ProBeat: AWS and Azure are generating uneasy excitement in quantum computing – VentureBeat

Quantum is having a moment. In October, Google claimed to have achieved a quantum supremacy milestone. In November, Microsoft announced Azure Quantum, a cloud service that lets you tap into quantum hardware providers Honeywell, IonQ, or QCI. Last week, AWS announced Amazon Braket, a cloud service that lets you tap into quantum hardware providers D-Wave, IonQ, and Rigetti. At the Q2B 2019 quantum computing conference this week, I took the pulse of how the nascent industry is feeling.

Binary digits (bits) are the basic units of information in classical computing, while quantum bits (qubits) are the basic units of quantum computing. Bits are always in a state of 0 or 1, while qubits can be in a state of 0, 1, or a superposition of the two. Quantum computing leverages qubits to perform computations that would be much more difficult for a classical computer. Potential applications are so vast and wide (from basic optimization problems to machine learning to all sorts of modeling) that interested industries span finance, chemistry, aerospace, cryptography, and more. But it's still so early that the industry is nowhere close to reaching consensus on what the transistor for qubits should look like.

Currently, your cloud quantum computing options are limited to single hardware providers, such as those from D-Wave and IBM. Amazon and Microsoft want to change that.

Enterprises and researchers interested in testing and experimenting with quantum are excited because they will be able to use different quantum processors via the same service, at least in theory. They're uneasy, however, because the quantum processors are so fundamentally different that it's not clear how easy it will be to switch between them. D-Wave uses quantum annealing, Honeywell and IonQ use ion trap devices, and Rigetti and QCI use superconducting chips. Even the technologies that are the same have completely different architectures.

Entrepreneurs and enthusiasts are hopeful that Amazon and Microsoft will make it easier to interface with the various quantum hardware technologies. They're uneasy, however, because Amazon and Microsoft have not shared pricing and technical details. Plus, some of the quantum providers offer their own cloud services, so it will be difficult to suss out when it makes more sense to work with them directly.

The hardware providers themselves are excited because they get exposure to massive customer bases. Amazon and Microsoft are the world's biggest and second-biggest cloud providers, respectively. They're uneasy, however, because the tech giants are really just middlemen, which of course poses its own problems of costs and reliance.

At least right now, it looks like this will be the new normal. Even hardware providers that haven't announced they are partnering with Amazon and/or Microsoft, like Xanadu, are in talks to do just that.

Overall at the event, excitement trumped uneasiness. If you're participating in a domain as nascent as quantum, you must be optimistic. The news this quarter all happened very quickly, but there is still a long road ahead. After all, these cloud services have only been announced. They still have to become available, gain exposure, pick up traction, become practical, prove useful, and so on.

The devil is in the details. How much are these cloud services for quantum going to cost? Amazon and Microsoft haven't said. When exactly will they be available in preview or in beta? Amazon and Microsoft haven't said. How will switching between different quantum processors work in practice? Amazon and Microsoft haven't said.

One thing is clear. Everyone at the event was talking about the impact of the two biggest cloud providers offering quantum hardware from different companies. The clear winners? Amazon and Microsoft.

ProBeat is a column in which Emil rants about whatever crosses him that week.

Read the rest here:

ProBeat: AWS and Azure are generating uneasy excitement in quantum computing - VentureBeat

Could quantum computing be the key to cracking congestion? – SmartCitiesWorld

The technology has helped to improve congestion by 73 per cent in scenario-testing

Ford and Microsoft are using quantum-inspired computing technology to reduce traffic congestion. Through a joint research pilot, scientists have used the technology to simulate thousands of vehicles and their impact on congestion in the US city of Seattle.

Ford said it is still early in the project but encouraging progress has been made and it is further expanding its partnership with the tech giant.

The companies teamed up in 2018 to develop new quantum approaches, running on classical computers already available, to help reduce Seattle's traffic congestion.

Writing in a blog post on Medium.com, Dr Ken Washington, chief technology officer of Ford Motor Company, explained that during rush hour, numerous drivers request the shortest possible routes at the same time, but current navigation services handle these requests "in a vacuum": they do not take into consideration the number of similar incoming requests, including areas where other drivers are all planning to share the same route segments, when delivering results.

What is required is a more balanced routing system that could manage all the various route requests from drivers and provide optimised route suggestions, reducing the number of vehicles on a particular road.

Traditional computers don't have the computational power to do this but, as Washington explained, in a quantum computer information is processed by a quantum bit (or qubit), which can simultaneously exist "in two different states" before it is measured.

"This ultimately enables a quantum computer to process information with a faster speed," he wrote. "Attempts to simulate some specific features of a quantum computer on non-quantum hardware have led to quantum-inspired technology: powerful algorithms that mimic certain quantum behaviours and run on specialised conventional hardware. That enables organisations to start realising some benefits before fully scaled quantum hardware becomes available."

Working with Microsoft, Ford tested several different scenarios, including one involving as many as 5,000 vehicles, each with 10 different route choices available to them, simultaneously requesting routes across Metro Seattle. It reports that, in 20 seconds, balanced routing suggestions were delivered to the vehicles, resulting in a 73 per cent improvement in total congestion when compared to selfish routing.
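The gap between selfish and balanced routing can be illustrated with a toy model (my own sketch; Ford and Microsoft's actual quantum-inspired algorithm is far more sophisticated): two parallel roads whose travel time rises with the number of cars on them. Selfish drivers all pick the road that is fastest when empty; a balanced router spreads cars so no single road is overloaded.

```python
def travel_time(base, cars):
    """Minutes to traverse a road: free-flow time plus a simple linear congestion penalty."""
    return base * (1 + 0.1 * cars)

def selfish(n_cars, bases):
    # Every driver independently picks the road with the lowest free-flow time.
    load = [0] * len(bases)
    load[bases.index(min(bases))] = n_cars
    return load

def balanced(n_cars, bases):
    # A central router assigns each car to whichever road is currently fastest.
    load = [0] * len(bases)
    for _ in range(n_cars):
        i = min(range(len(bases)), key=lambda r: travel_time(bases[r], load[r]))
        load[i] += 1
    return load

def total_time(load, bases):
    """Sum of travel time over all cars."""
    return sum(travel_time(bases[r], load[r]) * load[r] for r in range(len(bases)))

bases = [10, 12]  # free-flow minutes for two parallel roads (assumed numbers)
n = 100
print("selfish total minutes :", total_time(selfish(n, bases), bases))
print("balanced total minutes:", total_time(balanced(n, bases), bases))
```

Even this tiny greedy version shows balanced routing cutting total travel time substantially; the real problem, with thousands of vehicles and ten routes each, is the kind of huge combinatorial optimization that motivates quantum-inspired solvers.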

The average commute time, meanwhile, was also cut by eight per cent, representing an annual reduction of more than 55,000 hours across this simulated fleet.

Based on these results, Ford is expanding its partnership with Microsoft to further improve the algorithm and understand its effectiveness in more real-world scenarios.

"For example, will this method still deliver similar results when some streets are known to be closed, if route options aren't equal for all drivers, or if some drivers decide not to follow suggested routes?" wrote Washington. "These and more are all variables we'll need to test for to ensure balanced routing can truly deliver tangible improvements for cities."

Read more:

Could quantum computing be the key to cracking congestion? - SmartCitiesWorld

What WON’T Happen in 2020: 5G Wearables, Quantum Computing, and Self-Driving Trucks to Name a Few – Business Wire

OYSTER BAY, N.Y.--(BUSINESS WIRE)--As 2019 winds down, predictions abound on the technology advancements and innovations expected in the year ahead. However, there are several anticipated advancements, including 5G wearables, quantum computing, and self-driving trucks, that will NOT happen in the first year of the new decade, states global tech market advisory firm, ABI Research.

In its new whitepaper, 54 Technology Trends to Watch in 2020, ABI Research's analysts have identified 35 trends that will shape the technology market and 19 others that, although attracting huge amounts of speculation and commentary, look less likely to move the needle over the next twelve months. "After a tumultuous 2019 that was beset by many challenges, both integral to technology markets and derived from global market dynamics, 2020 looks set to be equally challenging," says Stuart Carlaw, Chief Research Officer at ABI Research. "Knowing what won't happen in technology in the next year is important for end users, implementors, and vendors to properly place their investments or focus their strategies."

What won't happen in 2020?

5G Wearables: "While smartphones will dominate the 5G market in 2020, 5G wearables won't arrive in 2020, or anytime soon," says Stephanie Tomsett, 5G Devices, Smartphones & Wearables analyst at ABI Research. "To bring 5G to wearables, specific 5G chipsets will need to be designed and components will need to be reconfigured to fit in the small form factor. That won't begin to happen until 2024, at the earliest."

Quantum Computing: "Despite claims from Google of achieving quantum supremacy, the tech industry is still far away from the democratization of quantum computing technology," says Lian Jye Su, AI & Machine Learning Principal Analyst at ABI Research. "Quantum computing is definitely not even remotely close to the large-scale commercial deployment stage."

Self-Driving Trucks: "Despite numerous headlines declaring the arrival of driverless, self-driving, or robot vehicles, very little, if any, driver-free commercial usage is underway beyond closed-course operations in the United States," says Susan Beardslee, Freight Transportation & Logistics Principal Analyst at ABI Research.

A Consolidated IoT Platform Market: "For many years, there have been predictions that the IoT platform supplier market will begin to consolidate, and it just won't happen," says Dan Shey, Vice President of Enabling Platforms at ABI Research. "The simple reason is that there are more than 100 companies that offer device-to-cloud IoT platform services, and for every one that is acquired, there are always new ones that come to market."

Edge Will Not Overtake Cloud: "The accelerated growth of the edge technology and intelligent device paradigm created one of the largest industry misconceptions: that edge technology will cannibalize cloud technology," says Kateryna Dubrova, M2M, IoT & IoE Analyst at ABI Research. "In fact, in the future we will see a rapid development of the edge-cloud-fog continuum, where technologies will complement each other rather than cross-cannibalize."

8K TVs: "Announcements of 8K television (TV) sets by major vendors earlier in 2019 attracted much attention and raised many questions within the industry," says Khin Sandi Lynn, Video & Cloud Services Analyst at ABI Research. "The fact is, 8K content is not available and the price of 8K TV sets is exorbitant. The transition from high definition (HD) to 4K will continue in 2020 with very limited 8K shipments: less than 1 million worldwide."

For more trends that won't happen in 2020, and the 35 trends that will, download the 54 Technology Trends to Watch in 2020 whitepaper.

About ABI Research

ABI Research provides strategic guidance to visionaries, delivering actionable intelligence on the transformative technologies that are dramatically reshaping industries, economies, and workforces across the world. ABI Research's global team of analysts publishes groundbreaking studies, often years ahead of other technology advisory firms, empowering our clients to stay ahead of their markets and their competitors.

For more information about ABI Research's services, contact us at +1.516.624.2500 in the Americas, +44.203.326.0140 in Europe, +65.6592.0290 in Asia-Pacific or visit http://www.abiresearch.com.

Excerpt from:

What WON'T Happen in 2020: 5G Wearables, Quantum Computing, and Self-Driving Trucks to Name a Few - Business Wire

Will quantum computing overwhelm existing security tech in the near future? – Help Net Security

More than half (54%) of cybersecurity professionals have expressed concerns that quantum computing will outpace the development of other security tech, according to research from Neustar.

Keeping a watchful eye on developments, 74% of organizations admitted to paying close attention to the technology's evolution, with 21% already experimenting with their own quantum computing strategies.

A further 35% of experts claimed to be in the process of developing a quantum strategy, while just 16% said they were not yet thinking about it. This shift in focus comes as the vast majority (73%) of cybersecurity professionals expect advances in quantum computing to overcome legacy technologies, such as encryption, within the next five years.

Almost all respondents (93%) believe the next-generation computers will overwhelm existing security technology, with just 7% under the impression that true quantum supremacy will never happen.

Despite expressing concerns that other technologies will be overshadowed, 87% of CISOs, CSOs, CTOs and security directors are excited about the potential positive impact of quantum computing. The remaining 13% were more cautious and under the impression that the technology would create more harm than good.

"At the moment, we rely on encryption, which is possible to crack in theory, but impossible to crack in practice, precisely because it would take so long to do so, over timescales of trillions or even quadrillions of years," said Rodney Joffe, Chairman of NISC and Security CTO at Neustar.

"Without the protective shield of encryption, a quantum computer in the hands of a malicious actor could launch a cyberattack unlike anything we've ever seen."

"For both today's major attacks and the small-scale, targeted threats that we are seeing more frequently, it is vital that IT professionals begin responding to quantum immediately."

"The security community has already launched a research effort into quantum-proof cryptography, but information professionals at every organization holding sensitive data should have quantum on their radar."

"Quantum computing's ability to solve our great scientific and technological challenges will also be its ability to disrupt everything we know about computer security. Ultimately, IT experts of every stripe will need to work to rebuild the algorithms, strategies, and systems that form our approach to cybersecurity," added Joffe.

The report also highlighted a steep two-year increase on the International Cyber Benchmarks Index, which is calculated based on changes in the cybersecurity landscape, including the impact of cyberattacks and the changing level of threat. November 2019 saw the highest score yet, at 28.2; in November 2017, the benchmark sat at just 10.1, an increase of roughly 18 points over the last couple of years.

During September and October 2019, security professionals ranked system compromise as the greatest threat to their organizations (22%), with DDoS attacks and ransomware following very closely behind (21%).

Quantum expert Robert Sutor explains the basics of Quantum Computing – Packt Hub

What if we could do chemistry inside a computer instead of in a test tube or beaker in the laboratory? What if running a new experiment was as simple as running an app and having it completed in a few seconds?

For this to really work, we would want it to happen with complete fidelity. The atoms and molecules as modeled in the computer should behave exactly like they do in the test tube. The chemical reactions that happen in the physical world would have precise computational analogs. We would need a completely accurate simulation.

If we could do this at scale, we might be able to compute the molecules we want and need.

These might be for new materials for shampoos or even alloys for cars and airplanes. Perhaps we could more efficiently discover medicines that are customized to your exact physiology. Maybe we could get a better insight into how proteins fold, thereby understanding their function, and possibly creating custom enzymes to positively change our body chemistry.

Is this plausible? We have massive supercomputers that can run all kinds of simulations. Can we model molecules in the above ways today?

This article is an excerpt from the book Dancing with Qubits written by Robert Sutor. Robert helps you understand how quantum computing works and delves into the math behind it with this quantum computing textbook.

Let's start with C8H10N4O2, also known as 1,3,7-trimethylxanthine.

This is a very fancy name for a molecule that millions of people around the world enjoy every day: caffeine. An 8-ounce cup of coffee contains approximately 95 mg of caffeine, and this translates to roughly 2.95 × 10^20 molecules. Written out, this is

295,000,000,000,000,000,000 molecules.
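As a sanity check, the quoted figure follows from caffeine's molar mass (roughly 194.19 g/mol, a constant not stated in the excerpt) and Avogadro's number; a short sketch of the arithmetic:

```python
# Back-of-the-envelope check of the molecule count quoted above.
AVOGADRO = 6.022e23           # molecules per mole
MOLAR_MASS_CAFFEINE = 194.19  # grams per mole for C8H10N4O2

def molecules_in(mass_mg: float) -> float:
    """Number of caffeine molecules in a dose given in milligrams."""
    moles = (mass_mg / 1000.0) / MOLAR_MASS_CAFFEINE
    return moles * AVOGADRO

print(f"{molecules_in(95):.2e}")  # 95 mg cup of coffee -> 2.95e+20
```

The same helper gives about 9.9 × 10^19 molecules for the 32 mg cola mentioned next.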

A 12-ounce can of a popular cola drink has 32 mg of caffeine, the diet version has 42 mg, and energy drinks often have about 77 mg.

These numbers are large because we are counting physical objects in our universe, which we know is very big. Scientists estimate, for example, that there are between 10^49 and 10^50 atoms in our planet alone.

To put these values in context, one thousand = 10^3, one million = 10^6, one billion = 10^9, and so on. A gigabyte of storage is one billion bytes, and a terabyte is 10^12 bytes.

Getting back to the question I posed at the beginning of this section, can we model caffeine exactly on a computer? We don't have to model the huge number of caffeine molecules in a cup of coffee, but can we fully represent a single molecule at a single instant?

Caffeine is a small molecule and contains protons, neutrons, and electrons. If we just look at the energy configuration that determines the structure of the molecule and the bonds that hold it all together, the amount of information to describe this is staggering. Specifically, the number of bits, the 0s and 1s, needed is approximately 10^48:

1,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000.

And this is just one molecule! Yet somehow nature manages to deal quite effectively with all this information. It handles everything from the single caffeine molecule, to all those in your coffee, tea, or soft drink, to every other molecule that makes up you and the world around you.

How does it do this? We don't know! Of course, there are theories, and these live at the intersection of physics and philosophy. However, we do not need to understand it fully to try to harness its capabilities.

We have no hope of providing enough traditional storage to hold this much information. Our dream of exact representation appears to be dashed. This is what Richard Feynman meant in his quote: "Nature isn't classical."

However, 160 qubits (quantum bits) could hold 2^160 ≈ 1.46 × 10^48 bits while the qubits were involved in a computation. To be clear, I'm not saying how we would get all the data into those qubits, and I'm also not saying how many more we would need to do something interesting with the information. It does give us hope, however.
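A quick sketch confirming the arithmetic: 2^160 is just over 10^48, so 160 qubits suffice to index that many bit patterns.

```python
import math

# 160 qubits span 2**160 basis states, which matches the ~10**48 bits
# quoted for the caffeine molecule's energy configuration.
print(f"{2 ** 160:.3e}")  # -> 1.462e+48

# Going the other way: how many qubits would 10**48 bit patterns need?
print(math.ceil(math.log2(10 ** 48)))  # -> 160
```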

In the classical case, we will never fully represent the caffeine molecule. In the future, with enough very high-quality qubits in a powerful quantum computing system, we may be able to perform chemistry on a computer.

I can write a little app on a classical computer that can simulate a coin flip. This might be for my phone or laptop.

Instead of heads or tails, let's use 1 and 0. The routine, which I call R, starts with one of those values and randomly returns one or the other. That is, 50% of the time it returns 1 and 50% of the time it returns 0. We have no knowledge whatsoever of how R does what it does.

When you see R, think "random." This is called a fair flip; it is not weighted to slightly prefer one result over the other. Whether we can produce a truly random result on a classical computer is another question. Let's assume our app is fair.

If I apply R to 1, half the time I expect 1 and the other half 0. The same is true if I apply R to 0. I'll call these applications R(1) and R(0), respectively.

If I look at the result of R(1) or R(0), there is no way to tell if I started with 1 or 0. This is just like a secret coin flip, where I can't tell whether I began with heads or tails just by looking at how the coin has landed. By secret coin flip, I mean that someone else has flipped it and I can see the result, but I have no knowledge of the mechanics of the flip itself or the starting state of the coin.

If R(1) and R(0) are randomly 1 and 0, what happens when I apply R twice?

I write this as R(R(1)) and R(R(0)). It's the same answer: a random result with an equal split. The same thing happens no matter how many times we apply R. The result is random, and we can't reverse things to learn the initial value.
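A minimal sketch of R in Python (the function name follows the text, but the implementation is illustrative; the book does not give one):

```python
import random

def R(bit: int) -> int:
    """A fair flip: ignore the input and return 0 or 1 with equal odds."""
    return random.randint(0, 1)

# Applying R once or many times gives the same thing: an even random
# split, with no way to recover the starting value from the output.
samples = [R(R(1)) for _ in range(100_000)]
print(sum(samples) / len(samples))  # close to 0.5
```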

Now for the quantum version. Instead of R, I use H. It too returns 0 or 1 with equal chance, but it has two interesting properties, chief among them that it is reversible: applying H twice, without looking in between, returns the value you started with.

There is a catch, though. You are not allowed to look at the result of what H does if you want to reverse its effect. If you apply H to 0 or 1, peek at the result, and apply H again to that, it is the same as if you had used R. If you observe what is going on in the quantum case at the wrong time, you are right back at strictly classical behavior.

To summarize using the coin language: if you flip a quantum coin and then don't look at it, flipping it again will yield the heads or tails with which you started. If you do look, you get classical randomness.
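A small numerical sketch of this reversibility, using the standard Hadamard matrix for H (an assumption; the excerpt never names the gate):

```python
import numpy as np

# The Hadamard gate sends |0> and |1> to equal superpositions, yet it is
# its own inverse: applied twice (with no measurement in between), it
# restores the starting state exactly.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

ket0 = np.array([1.0, 0.0])  # |0>
ket1 = np.array([0.0, 1.0])  # |1>

print(np.round(H @ ket0, 3))        # [0.707 0.707] -- 50/50 measurement odds
print(np.round(H @ (H @ ket0), 3))  # back to |0>
print(np.round(H @ (H @ ket1), 3))  # back to |1>
```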

A second area where quantum is different is in how we can work with simultaneous values. Your phone or laptop uses bytes as individual units of memory or storage. That's where we get phrases like megabyte, which means one million bytes of information.

A byte is further broken down into eight bits, which we've seen before. Each bit can be a 0 or 1. Doing the math, each byte can represent 2^8 = 256 different numbers composed of eight 0s or 1s, but it can only hold one value at a time. Eight qubits can represent all 256 values at the same time.

This is through superposition, but also through entanglement, the way we can tightly tie together the behavior of two or more qubits. This is what gives us the (literally) exponential growth in the amount of working memory.
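The byte-versus-qubits comparison can be sketched with a statevector: eight qubits each placed in an equal superposition yield 256 equal amplitudes at once. (This is a classical simulation, so it is only illustrative of the bookkeeping, not a quantum speedup.)

```python
import numpy as np

# A classical byte holds one of 256 values; an 8-qubit register is
# described by 256 complex amplitudes simultaneously. Putting each qubit
# into the H|0> superposition weights all 256 bit patterns equally.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
ket0 = np.array([1.0, 0.0])

state = H @ ket0
for _ in range(7):                  # tensor up to 8 qubits
    state = np.kron(state, H @ ket0)

print(state.shape)                  # (256,)
print(np.allclose(state, 1 / 16))  # True: each amplitude is 1/sqrt(256)
```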

Artificial intelligence and one of its subsets, machine learning, are extremely broad collections of data-driven techniques and models. They are used to help find patterns in information, learn from the information, and automatically perform more intelligently. They also give humans help and insight that might have been difficult to get otherwise.

Here is a way to start thinking about how quantum computing might be applicable to large, complicated, computation-intensive systems of processes such as those found in AI and elsewhere. These three cases are in some sense the small, medium, and large ways quantum computing might complement classical techniques:

As I write this, quantum computers are not big data machines. This means you cannot take millions of records of information and provide them as input to a quantum calculation. Instead, quantum may be able to help where the number of inputs is modest but the computations blow up as you start examining relationships or dependencies in the data.

In the future, however, quantum computers may be able to input, output, and process much more data. Even if it is just theoretical now, it makes sense to ask if there are quantum algorithms that can be useful in AI someday.

To summarize, we explored how quantum computing works and how it might be applied alongside artificial intelligence.

Get the quantum computing book Dancing with Qubits by Robert Sutor, in which he explores the inner workings of quantum computing. The book entails some sophisticated mathematical exposition and is therefore best suited for those with a healthy interest in mathematics, physics, engineering, and computer science.

What Was The Most Important Physics Of 2019? – Forbes

So, I've been doing a bunch of talking in terms of decades in the last couple of posts, about the physics defining eras in the 20th century and the physics defining the last couple of decades. I'll most likely do another decadal post in the near future, this one looking ahead to the 2020s, but the end of a decade by definition falls at the end of a year, so it's worth taking a look at physics stories on a shorter time scale, as well.

You can, as always, find a good list of important physics stories in Physics World's Breakthrough of the Year shortlist, and there are plenty of other "top science stories of 2019" lists out there. Speaking for myself, this is kind of an unusual year, and it's tough to make a call as to the top story. Most of the time, these end-of-year things are either stupidly obvious because one story towers above all the others, or totally subjective because there are a whole bunch of stories of roughly equal importance, and the choice of a single one comes down to personal taste.

In 2019, though, I think there were two stories that are head-and-shoulders above everything else, but roughly equal to each other. Both are the culmination of many years of work, and both can also claim to be kicking off a new era for their respective subfields. And I'm really not sure how to choose between them.

US computer scientist Katherine Bouman speaks during a House Committee on Science, Space and Technology hearing on "The Event Horizon Telescope: The Black Hole Seen Round the World" in the Rayburn House office building in Washington, DC, on May 16, 2019.

The first of these is the more photogenic of the two, namely the release of the first image of a black hole by the Event Horizon Telescope collaboration back in April. This one made major news all over, and was one of the experiments that led me to call the 2010s the decade of black holes.

As I wrote around the time of the release, this was very much of a piece with the preceding hundred years of tests of general relativity: while many stories referred to the image as a "shadow" of the black hole, really it's a ring produced by light bending around the event horizon. This is the same basic phenomenon that Eddington measured in 1919, looking at the shift in the apparent position of stars near the Sun, providing confirmation of Einstein's prediction that gravity bends light. It's just that scaling up the mass a few million times produces a far more dramatic bending of spacetime (and thus light) than the gentle curve produced by our Sun.

This Feb. 27, 2018, photo shows electronics for use in a quantum computer in the quantum computing lab at the IBM Thomas J. Watson Research Center in Yorktown Heights, N.Y. Describing the inner workings of a quantum computer isn't easy, even for top scholars, because the machines process information at the scale of elementary particles such as electrons and photons, where different laws of physics apply.

The other story, in very 2019 fashion, first emerged via a leak: someone at NASA accidentally posted a draft of the paper in which Google's team claimed to have achieved quantum supremacy. They demonstrated reasonably convincingly that their machine took about three and a half minutes to generate a solution to a particular problem that would take vastly longer to solve with a classical computer.

The problem they were working with was very much in the quantum simulation mode that I talked about a year earlier, when I did a high-level overview of quantum computing in general, though a singularly useless version of that. Basically, they took a set of 50-odd qubits and performed a random series of operations on them to put them in a complicated state in which each qubit was in a superposition of multiple states and also entangled with other qubits in the system. Then they measured the probability of finding specific output states.

Qubit, or quantum bit, illustration. The qubit is a unit of quantum information. As a two-state system with superposition of both states at the same time, it is fundamental to quantum computing. The illustration shows the Bloch sphere: the north pole is equivalent to one, the south pole to zero, and the other locations, anywhere on the surface of the sphere, are quantum superpositions of 0 and 1. When the qubit is measured, the quantum wave function collapses, resulting in an ordinary bit, a one or a zero, which effectively depends on the qubit's "latitude." The illustration shows the qubit "emitting" a stream of wave functions (the Greek letter psi), representing the collapse of the wave function when measured.

Finding the exact distribution of possible outcomes for such a large and entangled system is extremely computationally intensive if you're using a classical computer to do the job, but it happens very naturally in the quantum computer. So they could get a good approximation of the distribution within minutes, while the classical version would take a lot more time, where "a lot more time" ranges from thousands of years (Google's claim) down to a few days (the claim by a rival group at IBM using a different supercomputer algorithm to run the computation). If you'd like a lot more technical detail about what this did and didn't do, see Scott Aaronson.
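A toy statevector version of random circuit sampling, vastly smaller than Google's 53-qubit experiment, illustrates why the classical cost explodes: the full distribution over 2^n bitstrings must be tracked. The specific gates and circuit layout below are illustrative assumptions, not the actual experiment.

```python
import numpy as np

rng = np.random.default_rng(42)

def random_unitary_2x2() -> np.ndarray:
    """A random single-qubit gate, via QR of a random complex matrix."""
    z = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))  # fix column phases

def apply_1q(state: np.ndarray, gate: np.ndarray, qubit: int, n: int) -> np.ndarray:
    """Apply a single-qubit gate to one qubit of an n-qubit statevector."""
    psi = state.reshape([2] * n)
    psi = np.moveaxis(np.tensordot(gate, psi, axes=([1], [qubit])), 0, qubit)
    return psi.reshape(-1)

I2 = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

n = 3
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0  # start in |000>

# Alternate layers of random single-qubit gates and entangling CNOTs.
for layer in range(4):
    for q_idx in range(n):
        state = apply_1q(state, random_unitary_2x2(), q_idx, n)
    state = (np.kron(CNOT, I2) if layer % 2 == 0 else np.kron(I2, CNOT)) @ state

# The experiment samples bitstrings from this output distribution;
# computing it classically needs memory and time growing as 2**n.
probs = np.abs(state) ** 2
print(np.round(probs.sum(), 6))  # 1.0
```

At n = 53 the statevector has 2^53 ≈ 9 × 10^15 amplitudes, which is exactly why checking Google's chip strained supercomputers.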

As with the EHT paper, this is the culmination of years of work by a large team of people. It's also very much of a piece with past work: quantum computing as a distinct field is a recent development, but really, the fundamental equations used to do the calculations were pretty well set by 1935.

Both of these projects also have a solid claim to be at the forefront of something new. The EHT image is the first to be produced, but won't be the last: they're crunching numbers on the Sgr A* black hole at the center of the Milky Way, and there's room to improve their imaging in the future. Along with the LIGO discovery from a few years ago, this is the start of a new era of looking directly at black holes, rather than just using them as a playground for theory.

Google's demonstration of quantum supremacy, meanwhile, is the first such result in a highly competitive field: IBM and Microsoft are also invested in similar machines, and there are smaller companies and academic labs exploring other technologies. The random-sampling problem they used is convenient for this sort of demonstration but not really useful for anything else; still, lots of people are hard at work on techniques to make a next generation of machines that will be able to do calculations where people care about the answer. There's a good long way to go yet, but a lot of activity in the field is driving things forward.

So, in the head-to-head matchup for Top Physics Story of 2019, these two are remarkably evenly matched, and it could really go either way. The EHT result has a slightly deeper history; the Google quantum computer arguably has a brighter future. My inclination would be to split the award between them; if you put a gun to my head and made me pick one, I'd go with quantum supremacy, but I'd seriously question the life choices that led you to this place, because they're both awesome accomplishments that deserve to be celebrated.

Quantum Technology Expert to Discuss Quantum Sensors for Defense Applications at Office of Naval Research (ONR) – Business Wire

ARLINGTON, Va.--(BUSINESS WIRE)--Michael J. Biercuk, founder and CEO of Q-CTRL, will describe how quantum sensors may provide exceptional new capabilities to the warfighter at the Office of Naval Research (ONR) on Jan. 13, 2020, as part of the ONR's 2020 Distinguished Lecture Series.

Quantum sensing is considered one of the most promising areas in the global research effort to leverage the exotic properties of quantum physics for real-world benefit. In his lecture, titled "Quantum Control as a Means to Improve Quantum Sensing in Realistic Environments," Biercuk will describe how new concepts in quantum control engineering applied to these sensors could dramatically enhance standoff detection and precision navigation and timing in military settings.

Biercuk is one of the world's leading experts in the field of quantum technology. In 2017, he founded Q-CTRL based on research he led at the Quantum Control Lab at the University of Sydney, where he is a professor of Quantum Physics and Quantum Technology.

Funded by some of the world's leading investors, including Silicon Valley-based Sierra Ventures and Sequoia Capital, Q-CTRL is dedicated to helping teams realize the true potential of quantum hardware, from sensing to quantum computing. In quantum computing, the team is known for its efforts in reducing hardware errors caused by environmental noise. Computational errors are considered a major obstacle in the development of useful quantum computers and sought-after breakthroughs in science and industry.

Now in its 11th year, the ONR Distinguished Lecture Series features groundbreaking innovators who have made a major impact on past research or are working on discoveries for the future. It is designed to stimulate discussion and collaboration among scientists and engineers representing Navy research, the Department of Defense, industry and academia.

Past speakers include Michael Posner, recipient of the National Medal of Science; Mark Hersam, MacArthur Genius Award recipient and leading experimentalist in the field of nanotechnology; and Dr. Robert Ballard, the deep-sea explorer best known for discovering the wreck of the RMS Titanic.

"I am honored to be taking part in this renowned lecture series," Biercuk said. "Quantum technology, which harnesses quantum physics as a resource, is likely to be as transformational in the 21st century as harnessing electricity was in the 19th. I look forward to sharing insights into how Q-CTRL's efforts can accelerate the development of this new field of technology for defense applications."

About the Office of Naval Research

The Department of the Navy's Office of Naval Research provides the science and technology necessary to maintain the Navy's and Marine Corps' technological advantage. Through its affiliates, ONR is a leader in science and technology with engagement in 50 states, 55 countries, 634 institutions of higher learning and nonprofit institutions, and more than 960 industry partners.

About Q-CTRL

Q-CTRL was founded in November 2017 and is a venture-capital-backed company that provides control-engineering software solutions to help customers harness the power of quantum physics in next-generation technologies.

Q-CTRL is built on Professor Michael J. Biercuk's research leading the Quantum Control Lab at the University of Sydney, where he is a Professor of Quantum Physics and Quantum Technology.

The team's expertise led Q-CTRL to be selected as an inaugural member of the IBM Q startup network in 2018. Q-CTRL is funded by SquarePeg Capital, Sierra Ventures, Sequoia Capital China, Data Collective, Horizons Ventures and Main Sequence Ventures.

Shaping the technology transforming our society – Fermi National Accelerator Laboratory

Technology and society are intertwined. Self-driving cars and facial recognition technologies are no longer science fiction, and data and efficiency are harbingers of this new world.

But these new technologies are only the beginning. In the coming decades, further advances in artificial intelligence and the dawn of quantum computing are poised to change lives in both discernible and inconspicuous ways.

"Even everyday technology, like a smartphone app, affects people in significant ways that they might not realize," said Fermilab scientist Daniel Bowring. "If there are concerns about something as familiar as an app, then we need to take more opaque and complicated technology, like AI, very seriously."

A two-day workshop took place from Oct. 31 to Nov. 1 at the University of Chicago to raise awareness and generate strategies for the ethical development and implementation of AI and quantum computing. The workshop was organized by the Chicago Quantum Exchange, a Chicago-based intellectual hub and community of researchers whose aim is to promote the exploration of quantum information technologies, and funded by the Kavli Foundation and the Center for Data and Computing, a University of Chicago center for research driven by data science and AI approaches.

Members of the Chicago Quantum Exchange engage in conversation at a workshop at the University of Chicago. Photo: Anne Ryan, University of Chicago

At the workshop, industry experts, physicists, sociologists, journalists and more gathered to learn, share insights and identify next steps as AI and quantum computing advance.

"AI and quantum computing are developing tools that will affect everyone," said Bowring, a member of the workshop organizing team. "It was important to us to get as many stakeholders in the room as possible."

Workshop participants listened to presentations that framed concerns such as power asymmetries, algorithmic bias and privacy before breaking out into small groups to deliberate these topics and develop actionable strategies. Groups reported to all attendees after each breakout session. On the last day of the workshop, participants considered how they would nurture the dialogue.

At one of the breakout sessions, participants discussed the balance between collaborative quantum computing research and national security. Today, the results of quantum computing research are dispersed in a wide variety of academic journals, and a lot of code is accessible and open source. However, because of its potential implications for cybersecurity and encryption, quantum computing is also of interest to national security, so it may be subject to intelligence and export controls. What endeavors, if any, should be open source or private? Are these outcomes realizable? What level of control should be maintained? How should these technologies be regulated?

"We're already behind on setting ground rules for these technologies, which, if left to progress on their own, could increase power asymmetries in society," said Brian Nord, Fermilab and University of Chicago scientist and member of the workshop organizing team. "Our research programs, for example, need to be crafted in a way that does not reinforce or exacerbate these asymmetries."

Workshop participants will continue the dialogue through online and in-person meetings to address key ethical and societal issues in the quantum and AI space. Potential future activities include writing proposals for joint research projects that consider ethical and societal implications, drafting white papers addressed to academic audiences and media editorials, and developing community action plans.

Organizers are planning to hold a panel next spring to engage the public, as well.

"The spring event will help us continue to spread awareness and engage a variety of groups on issues of ethics in AI and quantum computing," Nord said.

The workshop was sponsored by the Kavli Foundation in partnership with the Center for Data and Computing at the University of Chicago. Artificial intelligence and quantum information science are two of six initiatives identified as special priority by the Department of Energy Office of Science.

The Kavli Foundation is dedicated to advancing science for the benefit of humanity, promoting public understanding of scientific research, and supporting scientists and their work. The foundation's mission is implemented through an international program of research institutes, initiatives and symposia in the fields of astrophysics, nanoscience, neuroscience, and theoretical physics, as well as the Kavli Prize and a program in public engagement with science. Visit kavlifoundation.org.

The Chicago Quantum Exchange catalyzes research activity across disciplines and member institutions. It is anchored by the University of Chicago, Argonne National Laboratory, Fermi National Accelerator Laboratory, and the University of Illinois at Urbana-Champaign and includes the University of Wisconsin-Madison, Northwestern University and industry partners. Visit chicagoquantum.org.

Where will the first big gains in quantum computing be? – Quantaneo, the Quantum Computing Source

Current quantum computers are far from where we need them to be for practical applications due to their high level of noise (errors). If we cannot find a way to use these current and near-term quantum computers, we will need to wait for fully-error-corrected universal machines to be developed to see real significant benefit (15-20 years by many estimates). This is where the software becomes much more than a necessary complement to the hardware. Quantum software has the potential to significantly accelerate our pathway to practically useful quantum computers. Quantum algorithms Most quantum algorithms developed to date cannot be run on near-term quantum computers, however there are some that can. One particular class of algorithm, variational quantum algorithms, is a lead contender for being able to demonstrate near-term quantum advantage. Variational quantum algorithms These algorithms allow users to change control parameters of the quantum computer until results match a target property, such as the energy of a molecule highly relevant to battery manufacturing, room temperature superconductivity, drug discovery and fertilizer manufacturing. Variational quantum algorithms have already been used to successfully simulate small chemical systems on quantum computers over the last two years, by our team at Rahko and a small handful of teams across the globe. Chemical Simulation Broadly speaking, in chemical simulation we look at two types of calculations: 1. Fast, low-cost, low-precision calculations that neglect exact quantum properties 2. High-precision, high-cost calculations Typically, the first type of calculation is used to filter large pools of candidates, such as candidate drugs. Once a pool has been filtered to a much smaller pool, the second type of calculation is performed to verify exact candidate properties. This mix allows an optimal use of computational resources. 
Quantum computing will likely not directly help with the first type of calculation (low-cost, low-precision), as quantum computing is inherently more expensive and slower. Machine learning (ML)-based approaches, however, do offer a speedup here. At Rahko, part of our work is in developing classical ML approaches to deliver faster classical solutions for this type of calculation. We can then use quantum computers to generate training data to improve classical ML algorithms. For the second type of calculation (high-cost, high-precision), quantum computers will bring far greater accuracy at reduced cost. Most importantly, quantum computers will be able to produce accurate simulations where classical methods fail. This will be a game-changing improvement when working with strongly correlated materials, which play a huge role in batteries and room temperature superconductivity. However, we still face the problem of noise in near-term machines. There is a solution: quantum machine learning. Quantum machine learning (QML) Over the past two years, ML based approaches to running quantum algorithms have borne out powerful results. Several algorithms have been proposed that, when combined, allow QML approaches to be 10,000,000 times faster than traditional variational quantum algorithms. This means that QML approaches will enable practical gains months, even years, before other variational quantum methods succeed. Our team at Rahko is working hard to deliver these gains for the past two years we have developed Hyrax, a QML platform that allows us to rapidly build, test and deploy QML algorithms. Hyrax relies heavily on variational quantum algorithms and powers all of our state-of-the-art research, helping us to push forward on a QML-enabled pathway to the first commercially valuable practical applications of quantum computing. 
With Hyrax, we aim to follow in the footsteps of world-leading, UK-born quantum chemistry software, in the tradition of packages such as ONETEP and CASTEP.

The UK's quantum future

I strongly believe that QML will play a key role in the UK's quantum future. Investment in QML talent and ventures will give the UK an opportunity to uphold its leading role in quantum chemistry, and to take a lead role in global quantum computing at large.

This piece was first published as a guest blog post on techUK as part of the techUK Quantum Future Campaign week.


Where will the first big gains in quantum computing be? - Quantaneo, the Quantum Computing Source

Woodside joins IBM's quantum computing network and flags further AI advances – Which-50

Oil and gas giant Woodside Energy announced a new collaboration with IBM to continue to advance its AI efforts and explore use cases for quantum computing.

As part of the collaboration Woodside will become a member of the MIT-IBM Watson AI Lab, which is a collaborative industrial-academic laboratory focused on advancing fundamental AI research.

Woodside is also the first commercial Australian organisation to join the IBM Q Network, a community of Fortune 500 companies, academic institutions, start-ups and national research labs working with IBM to advance quantum computing.

Woodside and IBM will use quantum computing to conduct deep computational simulations across the value chain of Woodside's business, the companies said.

Speaking at IBM's Cloud Innovation Exchange in Sydney yesterday, Woodside CEO Peter Coleman explained that quantum computing could help with cybersecurity efforts to protect critical infrastructure, as well as with "the basic physics of what happens in our plant, particularly around flow assurance", leading to more accurate predictions for the business's operations.

"We can see those things coming, and they're coming very, very rapidly. And I think those who are not already dealing with it are going to get left behind, very quickly," Coleman said.

The announcement builds on Woodside's five-year relationship with IBM, centred largely around cognitive projects.

Looking back to 2013, Coleman said the company saw promising results from its data and analytics practice and wanted to make a big bet on AI.

"Rather than do the easy stuff, which is generally put AI in a call centre, I said we've got to go holistically at this and we will go straight into it as a company," he said.

The first use case the company selected was an AI system that responds to staff queries to surface the most relevant information from the company's corpus. There are now 25 million documents loaded in Watson, and 80 per cent of employees use Watson on a daily basis, Coleman said.

Coleman flagged further AI use cases as the company embarks on its next wave of mega projects. Woodside is planning to spend US$30 billion on projects over the next six years and will use AI to identify materials and check whether they match what has been ordered.

The CEO also expects AI to cut Woodside's US$1 billion maintenance bill by as much as 30 per cent, for example by identifying insulated cladding that has corroded.

Woodside is also working to build a cognitive plant that is able to operate itself, with assistance from NASA.

Commenting on the partnership, IBM CEO Ginni Rometty said: "IBM is excited to join with Woodside, one of our first Watson clients globally, to help enable their pioneering vision of developing an intelligent plant."

"Together, Woodside and IBM will push the frontiers of innovation, working with the world's most advanced researchers in quantum computing and next-generation AI."


