Top 5: Scientific Breakthroughs That Made 2019 an Unforgettable Year of Human Progress – The Weather Channel

Facial reconstruction of A. anamensis by John Gurche, based on a 3.8-million-year-old (38-lakh-year-old) hominin cranium.

From discovering cures for life-threatening diseases to exploring outer space, from unearthing new facts about human history to making incredible strides in artificial intelligence, humanity achieved exceptional breakthroughs in the field of science and technology in 2019.

As the year comes to an end, it is time to look back at some of those glorious scientific revolutions that will shape our future. Here are our picks for the most significant scientific advancements of 2019:

5. Hello Sun? Earthlings are going beyond your influence!

A simulated landing process of Chang'e-4 lunar probe at the Beijing Aerospace Control Center on Jan. 3, 2019.

Launched in January 2006, NASA's interplanetary space probe New Horizons flew past the Kuiper Belt object 486958 Arrokoth (then nicknamed Ultima Thule) on January 1, 2019. The Kuiper Belt is the region of the solar system beyond the orbits of the known planets, and this was the farthest flyby ever conducted by any human-made spacecraft.

Also this year, on November 4, NASA confirmed that Voyager 2 had reached the interstellar medium, the space between star systems, well beyond the influence of our solar system. Voyager 1 had achieved this feat earlier, in 2012; both probes were launched in 1977.

Also, China's moon mission, Chang'e 4, successfully made a soft landing on the far side of the Moon, becoming the first mission ever to do so. Named after the Chinese moon goddess, the mission is attempting to determine the age and composition of this unexplored region of the Moon.

4. Quantum leap in computing

Representational image

Of all the progress made in computing research in 2019, the biggest breakthrough was perhaps the leap towards practical quantum computing.

Right in the first month of 2019, technology giant IBM unveiled Q System One, the first quantum computer designed for use outside a research lab, bringing a rather abstract concept into the public imagination. Unlike the bits of information in the computers we use, a quantum computer uses quantum bits, or qubits, enabling an exponential rise in the amount of data it can process and store.
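The bit-versus-qubit difference can be sketched in a few lines of code. Below is a minimal, illustrative toy model (a classical NumPy simulation, not real quantum hardware) of a single qubit as a pair of amplitudes, placed into an equal superposition by a Hadamard gate:

```python
import numpy as np

# A classical bit is either 0 or 1. A qubit is modelled as a vector of
# two amplitudes; squaring their magnitudes gives the probability of
# measuring 0 or 1.
zero = np.array([1.0, 0.0])                 # the state |0>

# The Hadamard gate rotates |0> into an equal superposition of 0 and 1.
hadamard = np.array([[1.0, 1.0],
                     [1.0, -1.0]]) / np.sqrt(2)

qubit = hadamard @ zero
probabilities = np.abs(qubit) ** 2
print(probabilities)                        # [0.5 0.5]
```

Measuring such a qubit returns 0 or 1 with equal probability; it is this superposition, spread across many entangled qubits, that underlies the exponential rise in processing capacity described above.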


Further, a team of researchers from Australia and Singapore developed a quantum-powered machine that can accurately simulate future outcomes arising from different sets of alternatives. Meanwhile, another study, at Yale University, showed that a qubit can be caught mid-way through a quantum jump and its outcome reversed. This was a major step towards fine-tuning quantum systems, as outcomes need no longer be completely random and abrupt.

While other research also helped in conceptualising quantum drives with immense storage capacity, the biggest news came from Google. The search giant confirmed in October that it had achieved quantum supremacy. To put things in perspective, researchers at Google claim that their quantum computer solved in about three minutes a problem that would have taken even a supercomputer 10,000 years.

3. Revolutionary research in medical science

Representational image

Medical researchers are always striving to push the envelope of human resilience and efficiency. The year 2019 saw progress on both these fronts, with the development of potential cures for multiple life-threatening diseases and gene-editing promising to be more effective than ever.

This year, two drugs were developed for Ebola and found to be effective in nearly 90% of cases, making the seemingly incurable disease treatable. Researchers also discovered potential cures for 'bubble boy' disease, a condition in which babies are born without disease-fighting immune cells; for cystic fibrosis, a painful, debilitating lung disease; and for pancreatic cancer.

Moreover, after decades of effort, HIV research finally yielded fruitful results this year, with patients responding positively to treatment. Twelve years after the first patient was cured of HIV, the virus that causes AIDS, a second patient was reported cured in March 2019. Researchers had been relentlessly trying to replicate the treatment that first cured the infection in 2007.

Furthermore, using CRISPR gene-editing technology, scientists have found potential treatments for cancer patients, even those for whom standard procedures were not successful. In October, researchers presented evidence that a new gene-editing technique has the potential to correct up to 89% of known genetic defects, including those behind diseases such as sickle cell anaemia.

2. Imaging the faraway invisible wonder

Image of the black hole at the center of galaxy M87

Named the top scientific breakthrough of 2019 by the journal Science, this incredible photograph of a black hole was taken using eight radio telescopes around the world, linked to form a virtual instrument effectively the size of the Earth itself.

The first-ever image of a black hole, released on April 10 this year, was taken by the Event Horizon Telescope (EHT) collaboration team. The gravity of a black hole is so strong that even light cannot escape its pull, and to capture an image of something that does not emit light is no easy task.

The EHT imaged the silhouette (or shadow) of a supermassive black hole at the centre of the galaxy M87, located 55 million light-years from Earth. The black hole has an enormous mass, a whopping 6,500 million times that of the Sun. The image shows a ring of light coming from the gas falling into the event horizon (the boundary beyond which nothing can escape) of the black hole.

1. Retracing the origins of humans

Craniofacial reconstruction process of Rakhigarhi cemetery individuals (BR02 and BR36).

Humankind's fascination with the question 'Where did we come from?' has persisted over centuries. Yet some of the biggest breakthroughs in answering it were made this year, starting with the discovery of a previously unknown species of ancient humans. Named Homo luzonensis, this small-bodied bipedal species was discovered in the Philippines and is thought to have lived on the island of Luzon 50,000 to 67,000 years ago.

In May, researchers solved a four-decade-old mystery by identifying a 160,000-year-old human jawbone found on the Tibetan Plateau nearly 40 years ago. The fossil belonged to a Denisovan, an enigmatic relative of modern humans whose kind ranged across Asia until some 50,000 years ago. The discovery, made despite the absence of DNA in the jaw, helped scientists understand this species better. In September, another group of researchers further refined the picture of the Denisovans, whose traces still linger in the DNA of some modern humans.

In August, descriptions of the nearly 3.8-million-year-old (38-lakh-year-old) remains of a skull belonging to a bipedal ancestor of humans stunned the world. The skull showed that two of our ancestor species, A. anamensis and A. afarensis, may have overlapped for at least 100,000 years. This evidence that the two species existed on a similar timescale challenges the long-held belief that human evolution follows a single lineage, with one species succeeding another.

In October, in a first-of-its-kind attempt, scientists generated accurate facial representations of people from the Indus Valley Civilisation. Another important study showed that the ancestral homeland of every human alive today traces back to a region south of the Zambezi River in northern Botswana. Building on previous genetic-evolution studies, the researchers used ethnolinguistic and geographic frequency-distribution data from the genomes of over 1,000 southern Africans to trace the origin of modern humans.

Exponential growth continues

India, too, has contributed immensely across scientific domains over the past few years and now trails only China and the US in the number of published research studies. Building on the successes of previous decades, scientists around the world have made immense contributions, from improving our daily lives to unravelling the mysteries of the universe.

With so much exciting research pouring in from all corners of the world, it isn't easy to even keep track of the incredible pace at which science is progressing. While we have tried to cover a few iconic annual scientific highlights in this article, there are thousands of other important discoveries, studies and achievements that shaped science in 2019.

And as yet another potential-filled year dawns on our planet, The Weather Channel India will keep you tuned in about all the exciting news, updates and breakthroughs from the world of science.

So for your daily dose of weather, environment, space and science stories, stay tuned to weather.com and stay curious!


20 technologies that could change your life in the next decade – Economic Times

The decade that's knocking on our doors now, the 2020s, is likely to be a time when science fiction manifests itself in our homes and roads and skies as viable, everyday technologies. Cars that can drive themselves. Meat that is derived from plants. Robots that can be fantastic companions both in bed and outside.

Implanting kidneys that can be 3-D printed using your own biomaterial. Using gene editing to eradicate diseases, increase crop yield or fix genetic disorders in human beings. Inserting a swarm of nanobots that can cruise through your blood stream and monitor parameters or unblock arteries. Zipping between Delhi and New York on a hypersonic jet. All of this is likely to become possible or substantially closer to becoming a reality in the next 10 years.

Ideas that have been the staple of science fiction for decades (artificial intelligence, universal translators, sex robots, autonomous cars, gene editing and quantum computing) are on the cusp of maturity now. Many are ready to move out of labs and enter the mainstream. Expect the next decade to witness breakout years for the world of technology.

Read on:

The 2020s: A new decade promising miraculous tech innovations

Universal translators: End of language barrier

Climate interventions: Clearing the air from carbon

Personalised learning: Pedagogy gets a reboot with AI

Made in a Printer: 3-D printing is going to be a new reality

Digital money: End of cash is near, cashless currencies are in vogue

Singularity: An era where machines will out-think humans

Mach militaries: Redefining warfare in the 2020s

5G & Beyond: Ushering in a truly connected world

Technology: Solving the problem of clean water

Quantum computing : Beyond the power of classical computing

Nanotechnology: From science fiction to reality

Power Saver: Energy-storage may be the key to maximise power generation

Secret code: Gene editing could prove to be a game-changer

Love in the time of Robots: The rise of sexbots and artificial human beings

Wheels of the future: Flying cars, hyperloops and e-highways will transform how people travel

New skies, old fears: The good, bad & ugly of drones

Artificial creativity: Computer programs could soon churn out books, movies and music

Meat alternatives: Alternative meat market is expected to grow 10 times by 2029

Intelligent robots & cyborg warriors will lead the charge in battle

Why we first need to focus on the ethical challenges of artificial intelligence

It's time to reflect honestly on our motivations for innovation

India's vital role in new space age

Plastic waste: Environment-friendly packaging technologies will gain traction


2020 will be the beginning of the tech industry’s radical revisioning of the physical world – TechCrunch

These days it's easy to bemoan the state of innovation and the dynamism coming from America's cradle of technological development, Silicon Valley.

The same companies that were praised for reimagining how people organized and accessed knowledge, interacted publicly, shopped for goods and services, conducted business, and even the devices on which all of these things are done, now find themselves criticized for the ways in which they've abused the tools they've created to become some of the most profitable and wealthiest ventures in human history.

Before the decade was even half over, the concern over the poverty of purpose inherent in Silicon Valley's inventions was given voice by Peter Thiel, a man who has made billions financing the creation of the very technologies whose paucity he then bemoaned.

"We are no longer living in a technologically accelerating world," Thiel told an audience at Yale University in 2013. "There is an incredible sense of deceleration."

In the six years since Thiel spoke to that audience, the only acceleration has been in the pace of technology's contribution to the world's decline.

However, there are some investors who think that the next wave of big technological breakthroughs are just around the corner and that 2020 will be the year that they enter the public consciousness in a real way.

These are the venture capitalists who invest in companies developing so-called frontier technologies (or "deep tech"): things like computational biology, artificial intelligence or machine learning, robotics, the space industry, advanced manufacturing using 3D printing, and quantum computing.

Continued advancements in computational power, data management, imaging and sensing technologies, and materials science are bridging researchers' ability to observe and understand phenomena with the potential to manipulate them in commercially viable ways.

As a result, increasing numbers of technology investors are seeing less risk and more reward in these formerly arcane areas of innovation investing.

"Established funds will spin up deep tech teams and more funds will be founded to address this market, especially where deep tech meets sustainability," according to Fifty Years investor Seth Bannon. "This shift will be driven from the bottom up (it's where the best founder talent is heading) and also from the top down (as more and more institutional LPs want to allocate capital to this space)."

In some ways, these investments are going to be driven by political necessity as much as technological advancement, according to Matt Ocko, a managing partner at the venture firm DCVC.

Earlier this year, DCVC closed on $725 million for two investment funds focused on deep technology investing. For Ocko, the geopolitical reality of continuing tensions with China will drive adoption of new technologies that will remake the American industrial economy.

"Whether we like it or not, US-government-driven scrutiny of China-based technology will continue in 2020. Less of it will be allowed to be deployed in the US, especially in areas of security, networking, autonomous transportation and space intelligence," writes Ocko in an email. "At the same time, US DoD efforts to streamline procurement processes will result in increasingly tighter partnerships between the DoD and the tech sector. The need to bring complex manufacturing, comms, and semiconductor technology home to the US will support a renaissance in distributed manufacturing/advanced manufacturing tech and a strong wave of semiconductor and robotic innovation."


How quantum computing could beat climate change – World Economic Forum

Imagine being able to cheaply and easily suck carbon directly out of our atmosphere. Such a capability would be hugely powerful in the fight against climate change and would advance us towards the ambitious global climate goals that have been set.

Surely that's science fiction? Well, maybe not. Quantum computing may be just the tool we need to design such a clean, safe and easy-to-deploy innovation.

In 1995, I first learned that quantum computing might bring about a revolution akin to the agricultural, industrial and digital ones we've already had. Back then it seemed far-fetched that quantum mechanics could be harnessed to such momentous effect; given recent events, it seems much, much more likely.

Much excitement followed Google's recent announcement of quantum supremacy: "[T]he point where quantum computers can do things that classical computers can't, regardless of whether those tasks are useful."

The question now is whether we can develop the large-scale, error-corrected quantum computers that are required to realize profoundly useful applications.

The good news is we already know concretely how to use such fully fledged quantum computers for many important tasks across science and technology. One such task is the simulation of molecules to determine their properties, interactions, and reactions with other molecules, a.k.a. chemistry, the very essence of the material world we live in.

While simulating molecules may seem like an esoteric pastime for scientists, it does, in fact, underpin almost every aspect of the world and our activity in it. Understanding their properties unlocks powerful new pharmaceuticals, batteries, clean-energy devices and even innovations for carbon capture.

To date, we haven't found a way to simulate large, complex molecules with conventional computers, and we never will, because the problem grows exponentially with the size or complexity of the molecules being simulated. Crudely speaking, if simulating a molecule with 10 atoms takes a minute, a molecule with 11 takes two minutes, one with 12 atoms takes four minutes, and so on. This exponential scaling quickly renders a traditional computer useless: simulating a molecule with just 70 atoms would take longer than the lifetime of the universe (13 billion years).
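That arithmetic can be checked directly. A short sketch (the one-minute baseline for 10 atoms is the article's illustrative assumption, not a measured figure):

```python
# If simulation time doubles with each added atom, and 10 atoms take
# one minute, then n atoms take 2**(n - 10) minutes.
def minutes_to_simulate(n_atoms: int) -> float:
    return 2.0 ** (n_atoms - 10)

MINUTES_PER_YEAR = 60 * 24 * 365
years_for_70_atoms = minutes_to_simulate(70) / MINUTES_PER_YEAR
print(f"{years_for_70_atoms:.2e} years")   # ~2.19e+12 years
```

Roughly two trillion years, comfortably beyond the 13-billion-year lifetime of the universe cited above.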

This is infuriating, not just because we can't simulate the important molecules that we already find (and use) in nature, including within our own bodies, and thereby understand their behaviour; but also because there is an infinite number of new molecules that we could design for new applications.

That's where quantum computers could come to our rescue, thanks to the late, great physicist Richard Feynman. Back in 1981, he recognized that quantum computers could do what would be impossible for classical computers when it comes to simulating molecules. Thanks to recent work by Microsoft and others, we now have concrete recipes for performing these simulations.

One area of urgent practical importance where quantum simulation could be hugely valuable is in meeting the SDGs, not only in health, energy, industry, innovation and infrastructure but also in climate action. Examples include room-temperature superconductors (which could recover the roughly 10% of energy production lost in transmission), more efficient processes for producing the nitrogen-based fertilizers that feed the world's population, and new, far more efficient batteries.

One very powerful application of molecular simulation is in the design of new catalysts that speed up chemical reactions. It is estimated that 90% of all commercially produced chemical products involve catalysts (in living systems, they're called enzymes).

Annual CO2 emissions globally in 2017

A catalyst for scrubbing carbon dioxide directly from the atmosphere could be a powerful tool in tackling climate change. Although CO2 is captured naturally, by oceans and trees, CO2 production has exceeded these natural capture rates for many decades.

The best way to tackle CO2 is not to release more of it; the next best thing is to capture it. "While we can't literally turn back time, [it] is a bit like rewinding the emissions clock," according to Torben Daeneke at RMIT University.

There are known catalysts for carbon capture, but most contain expensive precious metals or are difficult or expensive to produce and/or deploy. "We currently don't know many cheap and readily available catalysts for CO2 reduction," says Ulf-Peter Apfel of Ruhr-University Bochum.

Given the infinite number of candidate molecules that are available, we are right to be optimistic that there is a catalyst (or indeed many) to be found that will do the job cheaply and easily. Finding such a catalyst, however, is a daunting task without the ability to simulate the properties of candidate molecules.

And thats where quantum computing could help.

We might even find a cheap catalyst that enables efficient carbon dioxide recycling and produces useful by-products like hydrogen (a fuel) or carbon monoxide (a common source material in the chemical industry).

We can currently simulate small molecules on prototype quantum computers with up to a few dozen qubits (the quantum equivalent of classical computer bits). But scaling this up to useful tasks, like discovering new CO2 catalysts, will require error correction and simulation on the order of one million qubits.

It's a challenge I have long believed will only be met on any human timescale, certainly by the 2030 target for the SDGs, if we use the existing manufacturing capability of the silicon chip industry.

At a meeting of the World Economic Forum's Global Future Councils last month, a team of experts from across industry, academia and beyond assembled to discuss how quantum computing can help address global challenges, as highlighted by the SDGs, and the climate in particular.

As co-chair of the Global Future Council on Quantum Computing, I was excited that we were unanimous in agreeing that the world should devote more resources, including in education, to developing the powerful quantum computing capability that could help tackle climate change, meet the SDGs more widely and much more. We enthusiastically called for more international cooperation to develop this important technology on the 2030 timescale to have an impact on delivering the SDGs, in particular climate.

So the real question for me is: can we do it in time? Will we make sufficiently powerful quantum computers on that timeframe? I believe so. There are, of course, many other things we can and should do to tackle climate change, but developing large-scale, error-corrected quantum computers is a hedge we cannot afford to go without.



Quantum computing leaps ahead in 2019 with new power and speed – CNET

A close-up view of the IBM Q quantum computer. The processor is in the silver-colored cylinder.

Quantum computers are getting a lot more real. No, you won't be playing Call of Duty on one anytime soon. But Google, Amazon, Microsoft, Rigetti Computing and IBM all made important advances in 2019 that could help bring computers governed by the weird laws of atomic-scale physics into your life in other ways.

Google's declaration of quantum supremacy was the most headline-grabbing moment in the field. The achievement -- more limited than the grand term might suggest -- demonstrated that quantum computers could someday tackle computing problems beyond the reach of conventional "classical" computers.

Proving quantum computing progress is crucial. We're still several breakthroughs away from realizing the full vision of quantum computing. Qubits, the tiny stores of data that quantum computers use, need to be improved. So do the finicky control systems used to program and read quantum computer results. Still, today's results help justify tomorrow's research funding to sustain the technology when the flashes of hype inevitably fizzle.


Quantum computers will live in data centers, not on your desk, when they're commercialized. They'll still be able to improve many aspects of your life, though. Money in your retirement account might grow a little faster and your packages might be delivered a little sooner as quantum computers find new ways to optimize businesses. Your electric-car battery might be a little lighter and new drugs might help you live a little longer after quantum computers unlock new molecular-level designs. Traffic may be a little lighter from better simulations.

But Google's quantum supremacy step was just one of many needed to fulfill quantum computing's promise.

"We're going to get there in cycles. We're going to have a lot of dark ages in which nothing happens for a long time," said Forrester analyst Brian Hopkins. "One day that new thing will really change the world."

Among the developments in 2019:

Classical computers, which include everything from today's smartwatches to supercomputers that occupy entire buildings, store data as bits that represent either a 1 or a 0. Quantum computers use a different approach called qubits that can represent a combination of 1 and 0 through an idea called superposition.

Ford and Microsoft adapted a quantum computing traffic simulation to run on a classical computer. The result: a traffic routing algorithm that could cut Seattle traffic congestion by 73%.

The states of multiple qubits can be linked, letting quantum computers explore lots of possible solutions to a problem at once. With each new qubit added, a quantum computer can explore double the number of possible solutions, an exponential increase not possible with classical machines.
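This doubling is easy to see in simulation: tracking n qubits classically requires a state vector with 2**n entries. A minimal NumPy sketch (again a classical toy model, not actual quantum hardware):

```python
import numpy as np

# A single qubit in equal superposition: two amplitudes.
plus = np.array([1.0, 1.0]) / np.sqrt(2)

state = plus
for n in range(1, 6):
    # n qubits require 2**n amplitudes: each added qubit doubles the
    # size of the state vector a classical simulator must store.
    assert len(state) == 2 ** n
    state = np.kron(state, plus)        # append one more qubit

print(len(state))                       # 64 amplitudes for 6 qubits
```

At a few dozen qubits, the vector already has trillions of entries, which is why classical machines cannot keep up and why real quantum hardware is needed.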

Quantum computers, however, are finicky. It's hard to get qubits to remain stable long enough to return useful results. The act of communicating with qubits can perturb them. Engineers hope to add error correction techniques so quantum computers can tackle a much broader range of problems.

Plenty of people are quantum computing skeptics. Even some fans of the technology acknowledge we're years away from high-powered quantum computers. But already, quantum computing is a real business. Samsung, Daimler, Honda, JP Morgan Chase and Barclays are all quantum computing customers. Spending on quantum computers should reach hundreds of millions of dollars in the 2020s, and tens of billions in the 2030s, according to forecasts by Deloitte, a consultancy. China, Europe, the United States and Japan have sunk billions of dollars into investment plans. Ford and Microsoft say traffic simulation technology for quantum computers, adapted to run on classical machines, already is showing utility.

Right now quantum computers are used mostly in research. But applications with mainstream results are likely coming. The power of quantum computers is expected to allow for the creation of new materials, chemical processes and medicines by giving insight into the physics of molecules. Quantum computers will also enable greater optimization of financial investments, delivery routes and flights by crunching the numbers in situations with a large number of possible courses of action.

They'll also be used for cracking today's encryption, an idea spy agencies love, even if you might be concerned about losing your privacy or some snoop getting your password. Work on new cryptography adapted for a quantum computing future is already underway.

Another promising application is artificial intelligence, though that may be years in the future.

"Eventually we'll be able to reinvent machine learning," Forrester's Hopkins said. But it'll take years of steady work in quantum computing beyond the progress of 2019. "The transformative benefits are real and big, but they are still more sci-fi and theory than they are reality."


Quantum computing will be the smartphone of the 2020s, says Bank of America strategist – MarketWatch

When asked what invention will be as revolutionary in the 2020s as smartphones were in the 2010s, Bank of America strategist Haim Israel said, without hesitation: quantum computing.

At the bank's annual year-ahead event last week in New York, Israel qualified his prediction, arguing in an interview with MarketWatch that the timing of the smartphone's arrival on the scene in the mid-2000s, and its massive impact on the American business landscape in the 2010s, doesn't line up neatly with quantum-computing breakthroughs, which are only now being seen, just a few weeks before the start of the 2020s.

The iPhone had already debuted in 2007, enabling its real impact to be felt in the 2010s, he said, while the first business applications of quantum computing won't be seen until toward the end of the coming decade.

But, Israel argued, when all is said and done, quantum computing could be an even more radical technology in terms of its impact on businesses than the smartphone has been. "This is going to be a revolution," he said.

Quantum computing is a nascent technology based on quantum theory, the branch of physics that explains the behavior of particles at the subatomic level and holds that, until observed, these particles can exist in different places at the same time. While normal computers store information as ones and zeros, quantum computers are not limited by the binary nature of current data processing and so can provide exponentially more computing power.

"Quantum things can be in multiple places at the same time," Chris Monroe, a University of Maryland physicist and founder of IonQ, told the Associated Press. "The rules are very simple, they're just confounding."

In October, Alphabet Inc. subsidiary Google claimed to have achieved a breakthrough by using a quantum computer to complete in 200 seconds, on a 53-qubit quantum computing chip, a calculation it estimated would take the fastest current supercomputer 10,000 years. Earlier this month, Amazon.com Inc. announced its intention to collaborate with experts to develop quantum computing technologies that can be used in conjunction with its cloud computing services. International Business Machines Corp. and Microsoft Corp. are also developing quantum computing technology.

Israel argued these tools will revolutionize several industries, including health care, the internet of things and cybersecurity. He said that pharmaceutical companies are likely to be the first commercial users of these devices, given the explosion of data created by health-care research.

"Pharma companies are right now subject to Moore's law in reverse," he said. They are seeing the cost of drug development double every nine years, as the amount of data on the human body becomes ever more onerous to process. Data on genomics doubles every 50 days, he added, arguing that only quantum computers will be able to solve the pharmaceutical industry's big-data problem.

Quantum computing will also have a major impact on cybersecurity, an issue that affects nearly every major corporation today. Current cybersecurity relies on cryptographic algorithms, but quantum computing's ability to solve these equations in a fraction of the time a normal computer takes will render current cybersecurity methods obsolete.

"In the future, even robust cryptographic algorithms will be substantially weakened by quantum computing, while others will no longer be secure at all," according to Swaroop Sham, senior product marketing manager at Okta.

For investors, Israel said, it is key to realize that the first one or two companies to develop commercially applicable quantum computing will be richly rewarded with access to untold amounts of data, which will only make their software services more valuable to potential customers in a virtuous circle.

"What we've learned this decade is that whoever controls the data will win big time," he said.


ProBeat: AWS and Azure are generating uneasy excitement in quantum computing – VentureBeat

Quantum is having a moment. In October, Google claimed to have achieved a quantum supremacy milestone. In November, Microsoft announced Azure Quantum, a cloud service that lets you tap into quantum hardware providers Honeywell, IonQ, or QCI. Last week, AWS announced Amazon Braket, a cloud service that lets you tap into quantum hardware providers D-Wave, IonQ, and Rigetti. At the Q2B 2019 quantum computing conference this week, I got a feel for the pulse of the nascent industry.

Binary digits (bits) are the basic units of information in classical computing; quantum bits (qubits) are their counterparts in quantum computing. Bits are always in a state of 0 or 1, while qubits can be in a state of 0, 1, or a superposition of the two. Quantum computing leverages qubits to perform computations that would be much more difficult for a classical computer. Potential applications are so vast and wide (from basic optimization problems to machine learning to all sorts of modeling) that interested industries span finance, chemistry, aerospace, cryptography, and more. But it's still so early that the industry is nowhere close to reaching consensus on what the transistor for qubits should look like.
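The bit-versus-qubit distinction can be made concrete with a few lines of NumPy. This is a minimal sketch of the standard state-vector picture, not any vendor's API: a qubit is a pair of complex amplitudes, and measurement probabilities are their squared magnitudes.

```python
import numpy as np

# A classical bit is 0 or 1; a qubit is a unit vector of two complex
# amplitudes (alpha, beta) with |alpha|^2 + |beta|^2 = 1.
ket0 = np.array([1, 0], dtype=complex)  # |0>
ket1 = np.array([0, 1], dtype=complex)  # |1>

# An equal superposition of 0 and 1:
plus = (ket0 + ket1) / np.sqrt(2)

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(plus) ** 2
print(probs)  # [0.5 0.5] -- a 50/50 chance of reading 0 or 1
```

A classical bit can only ever be one of the two basis vectors; the superposition state is what has no classical counterpart.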

Currently, your cloud quantum computing options are limited to single hardware providers, such as those from D-Wave and IBM. Amazon and Microsoft want to change that.

Enterprises and researchers interested in testing and experimenting with quantum are excited because they will be able to use different quantum processors via the same service, at least in theory. They're uneasy, however, because the quantum processors are so fundamentally different that it's not clear how easy it will be to switch between them. D-Wave uses quantum annealing, Honeywell and IonQ use ion trap devices, and Rigetti and QCI use superconducting chips. Even the technologies that are the same have completely different architectures.

Entrepreneurs and enthusiasts are hopeful that Amazon and Microsoft will make it easier to interface with the various quantum hardware technologies. They're uneasy, however, because Amazon and Microsoft have not shared pricing and technical details. Plus, some of the quantum providers offer their own cloud services, so it will be difficult to suss out when it makes more sense to work with them directly.

The hardware providers themselves are excited because they get exposure to massive customer bases. Amazon and Microsoft are the world's biggest and second-biggest cloud providers, respectively. They're uneasy, however, because the tech giants are really just middlemen, which of course poses its own problems of costs and reliance.

At least right now, it looks like this will be the new normal. Even hardware providers that haven't announced they are partnering with Amazon and/or Microsoft, like Xanadu, are in talks to do just that.

Overall at the event, excitement trumped uneasiness. If you're participating in a domain as nascent as quantum, you must be optimistic. The news this quarter all happened very quickly, but there is still a long road ahead. After all, these cloud services have only been announced. They still have to become available, gain exposure, pick up traction, become practical, prove useful, and so on.

The devil is in the details. How much are these cloud services for quantum going to cost? Amazon and Microsoft haven't said. When exactly will they be available in preview or in beta? Amazon and Microsoft haven't said. How will switching between different quantum processors work in practice? Amazon and Microsoft haven't said.

One thing is clear. Everyone at the event was talking about the impact of the two biggest cloud providers offering quantum hardware from different companies. The clear winners? Amazon and Microsoft.

ProBeat is a column in which Emil rants about whatever crosses him that week.


Could quantum computing be the key to cracking congestion? – SmartCitiesWorld

The technology has helped to improve congestion by 73 per cent in scenario-testing

Ford and Microsoft are using quantum-inspired computing technology to reduce traffic congestion. Through a joint research pilot, scientists have used the technology to simulate thousands of vehicles and their impact on congestion in the US city of Seattle.

Ford said it is still early in the project but encouraging progress has been made and it is further expanding its partnership with the tech giant.

The companies teamed up in 2018 to develop new quantum approaches, running on classical computers already available, to help reduce Seattle's traffic congestion.

Writing in a blog post on Medium.com, Dr Ken Washington, chief technology officer of Ford Motor Company, explained that during rush hour, numerous drivers request the shortest possible routes at the same time, but current navigation services handle these requests "in a vacuum": they do not take into consideration the number of similar incoming requests, including areas where other drivers are all planning to share the same route segments, when delivering results.

What is required is a more balanced routing system that could manage all the various route requests from drivers and provide optimised route suggestions, reducing the number of vehicles on a particular road.

Traditional computers don't have the computational power to do this but, as Washington explained, in a quantum computer information is processed by a quantum bit (or qubit), which can simultaneously exist "in two different states" before it gets measured.

"This ultimately enables a quantum computer to process information at a faster speed," he wrote. "Attempts to simulate some specific features of a quantum computer on non-quantum hardware have led to quantum-inspired technology: powerful algorithms that mimic certain quantum behaviours and run on specialised conventional hardware. That enables organisations to start realising some benefits before fully scaled quantum hardware becomes available."

Working with Microsoft, Ford tested several different possibilities, including a scenario involving as many as 5,000 vehicles, each with 10 different route choices available to them, simultaneously requesting routes across Metro Seattle. It reports that in 20 seconds, balanced routing suggestions were delivered to the vehicles, resulting in a 73 per cent improvement in total congestion compared to selfish routing.

The average commute time, meanwhile, was also cut by eight per cent, representing an annual reduction of more than 55,000 hours across this simulated fleet.
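The selfish-versus-balanced distinction is easy to illustrate on a toy network. The sketch below is a deliberately tiny stand-in for the Ford/Microsoft experiment, not their actual algorithm: two hypothetical routes whose travel time grows with load, with a brute-force search standing in for the quantum-inspired optimizer.

```python
import numpy as np

# Toy congestion model: travel time on a route grows with the number of
# vehicles using it (all figures below are hypothetical).
def travel_time(load, free_flow, capacity):
    return free_flow * (1 + (load / capacity) ** 2)

vehicles = 100
free_flow = np.array([10.0, 12.0])  # minutes on an empty road
capacity = np.array([40.0, 60.0])   # vehicles before congestion bites

# Selfish routing: everyone picks the route that is fastest when empty.
selfish_loads = np.array([vehicles, 0])
selfish_total = np.sum(selfish_loads * travel_time(selfish_loads, free_flow, capacity))

# Balanced routing: search every split for the lowest total travel time
# (the optimizer's job; here small enough to brute-force).
balanced_total = min(
    k * travel_time(k, free_flow[0], capacity[0])
    + (vehicles - k) * travel_time(vehicles - k, free_flow[1], capacity[1])
    for k in range(vehicles + 1)
)
print(selfish_total > balanced_total)  # True: balancing cuts total congestion
```

With 5,000 vehicles and 10 route choices each, the search space explodes combinatorially, which is why Ford turned to quantum-inspired hardware rather than enumeration.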

Based on these results, Ford is expanding its partnership with Microsoft to further improve the algorithm and understand its effectiveness in more real-world scenarios.

"For example, will this method still deliver similar results when some streets are known to be closed, if route options aren't equal for all drivers, or if some drivers decide not to follow suggested routes?" wrote Washington. "These and more are all variables we'll need to test for to ensure balanced routing can truly deliver tangible improvements for cities."


What WON’T Happen in 2020: 5G Wearables, Quantum Computing, and Self-Driving Trucks to Name a Few – Business Wire

OYSTER BAY, N.Y.--(BUSINESS WIRE)--As 2019 winds down, predictions abound on the technology advancements and innovations expected in the year ahead. However, there are several anticipated advancements, including 5G wearables, quantum computing, and self-driving trucks, that will NOT happen in the first year of the new decade, states global tech market advisory firm, ABI Research.

In its new whitepaper, 54 Technology Trends to Watch in 2020, ABI Research's analysts have identified 35 trends that will shape the technology market and 19 others that, although attracting huge amounts of speculation and commentary, look less likely to move the needle over the next twelve months. "After a tumultuous 2019 that was beset by many challenges, both integral to technology markets and derived from global market dynamics, 2020 looks set to be equally challenging," says Stuart Carlaw, Chief Research Officer at ABI Research. "Knowing what won't happen in technology in the next year is important for end users, implementors, and vendors to properly place their investments or focus their strategies."

What won't happen in 2020?

5G Wearables: "While smartphones will dominate the 5G market in 2020, 5G wearables won't arrive in 2020, or anytime soon," says Stephanie Tomsett, 5G Devices, Smartphones & Wearables analyst at ABI Research. "To bring 5G to wearables, specific 5G chipsets will need to be designed and components will need to be reconfigured to fit in the small form factor. That won't begin to happen until 2024, at the earliest."

Quantum Computing: "Despite claims from Google in achieving quantum supremacy, the tech industry is still far away from the democratization of quantum computing technology," says Lian Jye Su, AI & Machine Learning Principal Analyst at ABI Research. "Quantum computing is definitely not even remotely close to the large-scale commercial deployment stage."

Self-Driving Trucks: "Despite numerous headlines declaring the arrival of driverless, self-driving, or robot vehicles, very little, if any, driver-free commercial usage is underway beyond closed-course operations in the United States," says Susan Beardslee, Freight Transportation & Logistics Principal Analyst at ABI Research.

A Consolidated IoT Platform Market: "For many years, there have been predictions that the IoT platform supplier market will begin to consolidate, and it just won't happen," says Dan Shey, Vice President of Enabling Platforms at ABI Research. "The simple reason is that there are more than 100 companies that offer device-to-cloud IoT platform services, and for every one that is acquired, there are always new ones that come to market."

Edge Will Not Overtake Cloud: "The accelerated growth of edge technology and the intelligent device paradigm created one of the largest industry misconceptions: edge technology will cannibalize cloud technology," says Kateryna Dubrova, M2M, IoT & IoE Analyst at ABI Research. "In fact, in the future we will see a rapid development of an edge-cloud-fog continuum, where the technologies will complement each other, rather than cross-cannibalize."

8K TVs: "Announcements of 8K television (TV) sets by major vendors earlier in 2019 attracted much attention and raised many questions within the industry," says Khin Sandi Lynn, Video & Cloud Services Analyst at ABI Research. "The fact is, 8K content is not available and the prices of 8K TV sets are exorbitant. The transition from high definition (HD) to 4K will continue in 2020 with very limited 8K shipments, less than 1 million worldwide."

For more trends that won't happen in 2020, and the 35 trends that will, download the 54 Technology Trends to Watch in 2020 whitepaper.

About ABI Research

ABI Research provides strategic guidance to visionaries, delivering actionable intelligence on the transformative technologies that are dramatically reshaping industries, economies, and workforces across the world. ABI Research's global team of analysts publishes groundbreaking studies, often years ahead of other technology advisory firms, empowering our clients to stay ahead of their markets and their competitors.

For more information about ABI Research's services, contact us at +1.516.624.2500 in the Americas, +44.203.326.0140 in Europe, +65.6592.0290 in Asia-Pacific or visit http://www.abiresearch.com.


Quantum expert Robert Sutor explains the basics of Quantum Computing – Packt Hub

What if we could do chemistry inside a computer instead of in a test tube or beaker in the laboratory? What if running a new experiment was as simple as running an app and having it completed in a few seconds?

For this to really work, we would want it to happen with complete fidelity. The atoms and molecules as modeled in the computer should behave exactly like they do in the test tube. The chemical reactions that happen in the physical world would have precise computational analogs. We would need a completely accurate simulation.

If we could do this at scale, we might be able to compute the molecules we want and need.

These might be for new materials for shampoos or even alloys for cars and airplanes. Perhaps we could more efficiently discover medicines that are customized to your exact physiology. Maybe we could get a better insight into how proteins fold, thereby understanding their function, and possibly creating custom enzymes to positively change our body chemistry.

Is this plausible? We have massive supercomputers that can run all kinds of simulations. Can we model molecules in the above ways today?

This article is an excerpt from the book Dancing with Qubits written by Robert Sutor. Robert helps you understand how quantum computing works and delves into the math behind it with this quantum computing textbook.

Let's start with C8H10N4O2, better known by its chemical name, 1,3,7-trimethylxanthine.

This is a very fancy name for a molecule that millions of people around the world enjoy every day: caffeine. An 8-ounce cup of coffee contains approximately 95 mg of caffeine, which translates to roughly 2.95 × 10^20 molecules. Written out, this is

295,000,000,000,000,000,000 molecules.
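That figure is a standard chemistry calculation, and it is easy to check: divide the mass by caffeine's molar mass (about 194.19 g/mol) to get moles, then multiply by Avogadro's number. A quick sketch:

```python
# Rough check of the molecule count: moles = mass / molar mass,
# molecules = moles * Avogadro's number.
caffeine_molar_mass = 194.19  # g/mol for C8H10N4O2
avogadro = 6.022e23           # molecules per mole

mass_g = 0.095                # 95 mg of caffeine in an 8-ounce cup
molecules = mass_g / caffeine_molar_mass * avogadro
print(f"{molecules:.2e}")     # ~2.95e+20 molecules
```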

A 12 ounce can of a popular cola drink has 32 mg of caffeine, the diet version has 42 mg, and energy drinks often have about 77 mg.

These numbers are large because we are counting physical objects in our universe, which we know is very big. Scientists estimate, for example, that there are between 10^49 and 10^50 atoms in our planet alone.

To put these values in context, one thousand = 10^3, one million = 10^6, one billion = 10^9, and so on. A gigabyte of storage is one billion bytes, and a terabyte is 10^12 bytes.

Getting back to the question I posed at the beginning of this section, can we model caffeine exactly on a computer? We don't have to model the huge number of caffeine molecules in a cup of coffee, but can we fully represent a single molecule at a single instant?

Caffeine is a small molecule, containing protons, neutrons, and electrons. Yet if we just look at the energy configuration that determines the structure of the molecule and the bonds that hold it all together, the amount of information needed to describe this is staggering. The number of bits, the 0s and 1s, required is approximately 10^48:

1,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000.

And this is just one molecule! Yet somehow nature manages to deal quite effectively with all this information, from the single caffeine molecule, to all those in your coffee, tea, or soft drink, to every other molecule that makes up you and the world around you.

How does it do this? We don't know! Of course, there are theories, and these live at the intersection of physics and philosophy. However, we do not need to understand it fully to try to harness its capabilities.

We have no hope of providing enough traditional storage to hold this much information. Our dream of exact representation appears to be dashed. This is what Richard Feynman meant in his quote: "Nature isn't classical."

However, 160 qubits (quantum bits) could hold 2^160 ≈ 1.46 × 10^48 bits while the qubits were involved in a computation. To be clear, I'm not saying how we would get all the data into those qubits, and I'm also not saying how many more we would need to do something interesting with the information. It does give us hope, however.
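Because Python integers have arbitrary precision, the 2^160 claim can be checked directly:

```python
# Python integers are arbitrary precision, so we can evaluate 2^160 exactly
# and compare it to the ~10^48 bits needed to describe the molecule.
capacity_bits = 2 ** 160
print(f"{capacity_bits:.3e}")  # ~1.462e+48, on the order of the 10^48 figure above
```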

In the classical case, we will never fully represent the caffeine molecule. In the future, with enough very high-quality qubits in a powerful quantum computing system, we may be able to perform chemistry on a computer.

I can write a little app on a classical computer that can simulate a coin flip. This might be for my phone or laptop.

Instead of heads or tails, let's use 1 and 0. The routine, which I call R, starts with one of those values and randomly returns one or the other. That is, 50% of the time it returns 1 and 50% of the time it returns 0. We have no knowledge whatsoever of how R does what it does.

When you see R, think "random." This is called a fair flip. It is not weighted to slightly prefer one result over the other. Whether we can produce a truly random result on a classical computer is another question. Let's assume our app is fair.

If I apply R to 1, half the time I expect 1 and the other half 0. The same is true if I apply R to 0. I'll call these applications R(1) and R(0), respectively.

If I look at the result of R(1) or R(0), there is no way to tell if I started with 1 or 0. This is just like a secret coin flip, where I can't tell whether I began with heads or tails just by looking at how the coin has landed. By "secret coin flip," I mean that someone else has flipped it and I can see the result, but I have no knowledge of the mechanics of the flip itself or the starting state of the coin.

If R(1) and R(0) are randomly 1 and 0, what happens when I apply R twice?

I write this as R(R(1)) and R(R(0)). It's the same answer: a random result with an equal split. The same thing happens no matter how many times we apply R. The result is random, and we can't reverse things to learn the initial value.

Now for the quantum version. Instead of R, I use H. It too returns 0 or 1 with equal chance, but it has two interesting properties.

There is a catch, though. You are not allowed to look at the result of what H does if you want to reverse its effect. If you apply H to 0 or 1, peek at the result, and apply H again to that, it is the same as if you had used R. If you observe what is going on in the quantum case at the wrong time, you are right back at strictly classical behavior.

To summarize using the coin language: if you flip a quantum coin and then don't look at it, flipping it again will yield the heads or tails with which you started. If you do look, you get classical randomness.
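The H in Sutor's analogy is the Hadamard gate, and its reversibility is easy to demonstrate numerically. A minimal NumPy sketch (state vectors only, no measurement):

```python
import numpy as np

# H is the Hadamard gate: applied once to |0> it gives an equal
# superposition; applied twice (without measuring in between) it
# returns the original state, because H is its own inverse.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

ket0 = np.array([1.0, 0.0])
once = H @ ket0          # amplitudes [0.707, 0.707]: a "fair quantum flip"
twice = H @ (H @ ket0)   # back to |0> exactly

print(np.allclose(twice, ket0))  # True: no randomness if you don't peek
```

Peeking in between corresponds to collapsing `once` to a definite 0 or 1 before the second application, which is exactly what destroys the reversibility.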

A second area where quantum is different is in how we can work with simultaneous values. Your phone or laptop uses bytes as individual units of memory or storage. That's where we get phrases like "megabyte," which means one million bytes of information.

A byte is further broken down into eight bits, which we've seen before. Each bit can be a 0 or 1. Doing the math, each byte can represent 2^8 = 256 different numbers composed of eight 0s or 1s, but it can only hold one value at a time. Eight qubits can represent all 256 values at the same time.

This is through superposition, but also through entanglement, the way we can tightly tie together the behavior of two or more qubits. This is what gives us the (literally) exponential growth in the amount of working memory.
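The 256-values-at-once claim can also be made concrete. In this sketch (again state vectors, with `np.kron` as the tensor product), applying H to each of eight qubits in |0> produces a register whose 256 amplitudes are all equal:

```python
import numpy as np

# Eight qubits live in a 2^8 = 256-dimensional state space. Applying H to
# each qubit of |00000000> puts the register into an equal superposition
# of all 256 basis states at once.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
ket0 = np.array([1.0, 0.0])

register = np.array([1.0])
for _ in range(8):  # build H|0> tensor H|0> tensor ... (8 factors)
    register = np.kron(register, H @ ket0)

print(len(register))                                # 256 amplitudes
print(np.allclose(np.abs(register) ** 2, 1 / 256))  # True: all equally likely
```

Note the storage cost on the classical side: simulating n qubits takes 2^n amplitudes, which is precisely the exponential growth the passage describes.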

Artificial intelligence and one of its subsets, machine learning, are extremely broad collections of data-driven techniques and models. They are used to help find patterns in information, learn from the information, and automatically perform more intelligently. They also give humans help and insight that might have been difficult to get otherwise.

Here is a way to start thinking about how quantum computing might be applicable to large, complicated, computation-intensive systems of processes such as those found in AI and elsewhere. These three cases are in some sense the small, medium, and large ways quantum computing might complement classical techniques:

As I write this, quantum computers are not big data machines. This means you cannot take millions of records of information and provide them as input to a quantum calculation. Instead, quantum may be able to help where the number of inputs is modest but the computations blow up as you start examining relationships or dependencies in the data.

In the future, however, quantum computers may be able to input, output, and process much more data. Even if it is just theoretical now, it makes sense to ask if there are quantum algorithms that can be useful in AI someday.

To summarize, we explored how quantum computing works and different applications of artificial intelligence in quantum computing.

Get the quantum computing book Dancing with Qubits by Robert Sutor today, in which he explores the inner workings of quantum computing. The book entails some sophisticated mathematical exposition and is therefore best suited for those with a healthy interest in mathematics, physics, engineering, and computer science.



Will quantum computing overwhelm existing security tech in the near future? – Help Net Security

More than half (54%) of cybersecurity professionals have expressed concerns that quantum computing will outpace the development of other security tech, according to research from Neustar.

Keeping a watchful eye on developments, 74% of organizations admitted to paying close attention to the technology's evolution, with 21% already experimenting with their own quantum computing strategies.

A further 35% of experts claimed to be in the process of developing a quantum strategy, while just 16% said they were not yet thinking about it. This shift in focus comes as the vast majority (73%) of cyber security professionals expect advances in quantum computing to overcome legacy technologies, such as encryption, within the next five years.

Almost all respondents (93%) believe the next-generation computers will overwhelm existing security technology, with just 7% under the impression that true quantum supremacy will never happen.

Despite expressing concerns that other technologies will be overshadowed, 87% of CISOs, CSOs, CTOs and security directors are excited about the potential positive impact of quantum computing. The remaining 13% were more cautious and under the impression that the technology would create more harm than good.

"At the moment, we rely on encryption, which is possible to crack in theory, but impossible to crack in practice, precisely because it would take so long to do so, over timescales of trillions or even quadrillions of years," said Rodney Joffe, Chairman of NISC and Security CTO at Neustar.
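Those timescales follow from simple arithmetic. A back-of-the-envelope sketch with hypothetical hardware (the trillion-guesses-per-second rate is an illustrative assumption, not a figure from the report):

```python
# Brute-forcing a 128-bit symmetric key at a trillion guesses per second.
guesses = 2 ** 128          # possible keys
rate = 1e12                 # guesses per second (hypothetical hardware)
seconds_per_year = 3.156e7

years = guesses / rate / seconds_per_year
print(f"{years:.1e}")       # ~1.1e+19 years, far beyond even quadrillions
```

The quantum threat is that algorithms like Shor's attack the underlying math of public-key schemes directly, rather than grinding through the keyspace, so this arithmetic stops being the relevant bound.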

"Without the protective shield of encryption, a quantum computer in the hands of a malicious actor could launch a cyberattack unlike anything we've ever seen."

"For both today's major attacks, and also the small-scale, targeted threats that we are seeing more frequently, it is vital that IT professionals begin responding to quantum immediately."

"The security community has already launched a research effort into quantum-proof cryptography, but information professionals at every organization holding sensitive data should have quantum on their radar."

"Quantum computing's ability to solve our great scientific and technological challenges will also be its ability to disrupt everything we know about computer security. Ultimately, IT experts of every stripe will need to work to rebuild the algorithms, strategies, and systems that form our approach to cybersecurity," added Joffe.

The report also highlighted a steep two-year increase in the International Cyber Benchmarks Index. Calculated based on changes in the cybersecurity landscape, including the impact of cyberattacks and the changing level of threat, the index reached its highest score yet in November 2019, at 28.2. In November 2017, the benchmark sat at just 10.1, an 18-point rise over the last couple of years.

During September and October 2019, security professionals ranked system compromise as the greatest threat to their organizations (22%), with DDoS attacks and ransomware following very closely behind (21%).


What Was The Most Important Physics Of 2019? – Forbes

So, I've been doing a bunch of talking in terms of decades in the last couple of posts, about the physics defining eras in the 20th century and the physics defining the last couple of decades. I'll most likely do another decadal post in the near future, this one looking ahead to the 2020s, but the end of a decade by definition falls at the end of a year, so it's worth taking a look at physics stories on a shorter time scale, as well.


You can, as always, find a good list of important physics stories in Physics World's Breakthrough of the Year shortlist, and there are plenty of other "top science stories of 2019" lists out there. Speaking for myself, this is kind of an unusual year, and it's tough to make a call as to the top story. Most of the time, these end-of-year things are either stupidly obvious, because one story towers above all the others, or totally subjective, because there are a whole bunch of stories of roughly equal importance and the choice of a single one comes down to personal taste.

In 2019, though, I think there were two stories that are head-and-shoulders above everything else, but roughly equal to each other. Both are the culmination of many years of work, and both can also claim to be kicking off a new era for their respective subfields. And I'm really not sure how to choose between them.

US computer scientist Katherine Bouman speaks during a House Committee on Science, Space and Technology hearing on the "Event Horizon Telescope: The Black Hole Seen Round the World" in the Rayburn House office building in Washington, DC on May 16, 2019.

The first of these is the more photogenic of the two, namely the release of the first image of a black hole by the Event Horizon Telescope collaboration back in April. This one made major news all over, and was one of the experiments that led me to call the 2010s the decade of black holes.

As I wrote around the time of the release, this was very much of a piece with the preceding hundred years of tests of general relativity: while many stories referred to the image as a "shadow" of the black hole, really it's a ring produced by light bending around the event horizon. This is the same basic phenomenon that Eddington measured in 1919, looking at the shift in the apparent position of stars near the Sun, providing confirmation of Einstein's prediction that gravity bends light. It's just that scaling up the mass a few million times produces a far more dramatic bending of spacetime (and thus light) than the gentle curve produced by our Sun.

This Feb. 27, 2018, photo shows electronics for use in a quantum computer in the quantum computing lab at the IBM Thomas J. Watson Research Center in Yorktown Heights, N.Y. Describing the inner workings of a quantum computer isn't easy, even for top scholars, because the machines process information at the scale of elementary particles such as electrons and photons, where different laws of physics apply.

The other story, in very 2019 fashion, first emerged via a leak: someone at NASA accidentally posted a draft of the paper in which Google's team claimed to have achieved quantum supremacy. They demonstrated reasonably convincingly that their machine took about three and a half minutes to generate a solution to a particular problem that would take vastly longer to solve with a classical computer.

The problem they were working with was very much in the quantum simulation mode that I talked about a year earlier, when I did a high-level overview of quantum computing in general, though a singularly useless version of that. Basically, they took a set of 50-odd qubits and performed a random series of operations on them to put them in a complicated state in which each qubit was in a superposition of multiple states and also entangled with other qubits in the system. Then they measured the probability of finding specific output states.

Qubit, or quantum bit, illustration. The qubit is a unit of quantum information. As a two-state system with superposition of both states at the same time, it is fundamental to quantum computing. The illustration shows the Bloch sphere: the north pole is equivalent to one, the south pole to zero, and the other locations, anywhere on the surface of the sphere, are quantum superpositions of 0 and 1. When the qubit is measured, the quantum wave function collapses, resulting in an ordinary bit, a one or a zero, which effectively depends on the qubit's 'latitude'.

Finding the exact distribution of possible outcomes for such a large and entangled system is extremely computationally intensive if you're using a classical computer to do the job, but it happens very naturally in the quantum computer. So they could get a good approximation of the distribution within minutes, while the classical version would take a lot more time, where "a lot more time" ranges from thousands of years (Google's claim) down to a few days (the claim by a rival group at IBM using a different supercomputer algorithm to run the computation). If you'd like a lot more technical detail about what this did and didn't do, see Scott Aaronson.
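The flavor of the experiment can be caught in a toy simulation. This sketch is a deliberately tiny stand-in for Google's 53-qubit circuit: a random unitary on 3 qubits (built via QR decomposition) plays the role of the random gate sequence, and sampling from the resulting output distribution is what the hardware does naturally.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3          # tiny stand-in for Google's 53 qubits
dim = 2 ** n   # classical cost scales as 2^n: the heart of the supremacy claim

# A Haar-ish random unitary via QR decomposition stands in for the random
# gate sequence (the real experiment composes many two-qubit gates).
m = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
q, r = np.linalg.qr(m)
q = q * (np.diag(r) / np.abs(np.diag(r)))  # fix column phases for uniformity

state = np.zeros(dim, dtype=complex)
state[0] = 1.0                 # start in |000>
state = q @ state              # "run" the random circuit

probs = np.abs(state) ** 2     # the output distribution
probs = probs / probs.sum()    # normalize away floating-point drift
samples = rng.choice(dim, size=5, p=probs)  # what the hardware would emit
print(np.isclose(probs.sum(), 1.0))         # True: a valid distribution
```

For n = 3 the classical bookkeeping is trivial; at n = 53 the state vector has 2^53 amplitudes, which is why simulating the sampled distribution classically becomes the contested thousands-of-years-versus-days question.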

As with the EHT paper, this is the culmination of years of work by a large team of people. It's also very much of a piece with past work: quantum computing as a distinct field is a recent development, but really, the fundamental equations used to do the calculations were pretty well set by 1935.


Both of these projects also have a solid claim to be at the forefront of something new. The EHT image is the first to be produced, but won't be the last: they're crunching numbers on the Sgr A* black hole at the center of the Milky Way, and there's room to improve their imaging in the future. Along with the LIGO discovery from a few years ago, this is the start of a new era of looking directly at black holes, rather than just using them as a playground for theory.

Google's demonstration of quantum supremacy, meanwhile, is the first such result in a highly competitive field: IBM and Microsoft are also invested in similar machines, and there are smaller companies and academic labs exploring other technologies. The random-sampling problem they used is convenient for this sort of demonstration but not really useful for anything else; lots of people, though, are hard at work on techniques to make a next generation of machines that will be able to do calculations where people care about the answer. There's a good long way to go yet, but a lot of activity in the field is driving things forward.

So, in the head-to-head matchup for Top Physics Story of 2019, these two are remarkably evenly matched, and it could really go either way. The EHT result has a slightly deeper history; the Google quantum computer arguably has a brighter future. My inclination would be to split the award between them; if you put a gun to my head and made me pick one, I'd go with quantum supremacy, but I'd seriously question the life choices that led you to this place, because they're both awesome accomplishments that deserve to be celebrated.

Read the original here:

What Was The Most Important Physics Of 2019? - Forbes

Quantum Technology Expert to Discuss Quantum Sensors for Defense Applications at Office of Naval Research (ONR) – Business Wire

ARLINGTON, Va.--(BUSINESS WIRE)--Michael J. Biercuk, founder and CEO of Q-CTRL, will describe how quantum sensors may provide exceptional new capabilities to the warfighter at the Office of Naval Research (ONR) on Jan. 13, 2020, as part of the ONR's 2020 Distinguished Lecture Series.

Quantum sensing is considered one of the most promising areas in the global research effort to leverage the exotic properties of quantum physics for real-world benefit. In his lecture, titled "Quantum Control as a Means to Improve Quantum Sensing in Realistic Environments," Biercuk will describe how new concepts in quantum control engineering applied to these sensors could dramatically enhance standoff detection and precision navigation and timing in military settings.

Biercuk is one of the world's leading experts in the field of quantum technology. In 2017, he founded Q-CTRL based on research he led at the Quantum Control Lab at the University of Sydney, where he is a professor of Quantum Physics and Quantum Technology.

Funded by some of the world's leading investors, including Silicon Valley-based Sierra Ventures and Sequoia Capital, Q-CTRL is dedicated to helping teams realize the true potential of quantum hardware, from sensing to quantum computing. In quantum computing, the team is known for its efforts in reducing hardware errors caused by environmental noise. Computational errors are considered a major obstacle in the development of useful quantum computers and sought-after breakthroughs in science and industry.

Now in its 11th year, the ONR Distinguished Lecture Series features groundbreaking innovators who have made a major impact on past research or are working on discoveries for the future. It is designed to stimulate discussion and collaboration among scientists and engineers representing Navy research, the Department of Defense, industry and academia.

Past speakers include Michael Posner, recipient of the National Medal of Science; Mark Hersam, MacArthur Genius Award recipient and leading experimentalist in the field of nanotechnology; and Dr. Robert Ballard, the deep-sea explorer best-known for recovering the wreck of the RMS Titanic.

"I am honored to be taking part in this renowned lecture series," Biercuk said. "Quantum technology, which harnesses quantum physics as a resource, is likely to be as transformational in the 21st century as harnessing electricity was in the 19th. I look forward to sharing insights into how Q-CTRL's efforts can accelerate the development of this new field of technology for defense applications."

About the Office of Naval Research

The Department of the Navy's Office of Naval Research provides the science and technology necessary to maintain the Navy and Marine Corps technological advantage. Through its affiliates, ONR is a leader in science and technology with engagement in 50 states, 55 countries, 634 institutions of higher learning and nonprofit institutions, and more than 960 industry partners.

ABOUT Q-CTRL

Q-CTRL was founded in November 2017 and is a venture-capital-backed company that provides control-engineering software solutions to help customers harness the power of quantum physics in next-generation technologies.

Q-CTRL is built on Professor Michael J. Biercuk's research leading the Quantum Control Lab at the University of Sydney, where he is a Professor of Quantum Physics and Quantum Technology.

The teams expertise led Q-CTRL to be selected as an inaugural member of the IBM Q startup network in 2018. Q-CTRL is funded by SquarePeg Capital, Sierra Ventures, Sequoia Capital China, Data Collective, Horizons Ventures and Main Sequence Ventures.

See more here:

Quantum Technology Expert to Discuss Quantum Sensors for Defense Applications at Office of Naval Research (ONR) - Business Wire

Shaping the technology transforming our society – Fermi National Accelerator Laboratory

Technology and society are intertwined. Self-driving cars and facial recognition technologies are no longer science fiction, and data and efficiency are harbingers of this new world.

But these new technologies are only the beginning. In the coming decades, further advances in artificial intelligence and the dawn of quantum computing are poised to change lives in both discernible and inconspicuous ways.

"Even everyday technology, like a smartphone app, affects people in significant ways that they might not realize," said Fermilab scientist Daniel Bowring. "If there are concerns about something as familiar as an app, then we need to take more opaque and complicated technology, like AI, very seriously."

A two-day workshop took place from Oct. 31 to Nov. 1 at the University of Chicago to raise awareness and generate strategies for the ethical development and implementation of AI and quantum computing. The workshop was organized by the Chicago Quantum Exchange, a Chicago-based intellectual hub and community of researchers whose aim is to promote the exploration of quantum information technologies, and funded by the Kavli Foundation and the Center for Data and Computing, a University of Chicago center for research driven by data science and AI approaches.

Members of the Chicago Quantum Exchange engage in conversation at a workshop at the University of Chicago. Photo: Anne Ryan, University of Chicago

At the workshop, industry experts, physicists, sociologists, journalists and more gathered to learn, share insights and identify next steps as AI and quantum computing advance.

"AI and quantum computing are developing tools that will affect everyone," said Bowring, a member of the workshop organizing team. "It was important to us to get as many stakeholders in the room as possible."

Workshop participants listened to presentations that framed concerns such as power asymmetries, algorithmic bias and privacy before breaking out into small groups to deliberate these topics and develop actionable strategies. Groups reported to all attendees after each breakout session. On the last day of the workshop, participants considered how they would nurture the dialogue.

At one of the breakout sessions, participants discussed the balance between collaborative quantum computing research and national security. Today, the results of quantum computing research are dispersed in a wide variety of academic journals, and a lot of code is accessible and open source. However, because of its potential implications for cybersecurity and encryption, quantum computing is also of interest to national security, so it may be subject to intelligence and export controls. What endeavors, if any, should be open source or private? Are these outcomes realizable? What level of control should be maintained? How should these technologies be regulated?

"We're already behind on setting ground rules for these technologies, which, if left to progress on their own, could increase power asymmetries in society," said Brian Nord, Fermilab and University of Chicago scientist and member of the workshop organizing team. "Our research programs, for example, need to be crafted in a way that does not reinforce or exacerbate these asymmetries."

Workshop participants will continue the dialogue through online and in-person meetings to address key ethical and societal issues in the quantum and AI space. Potential future activities include writing proposals for joint research projects that consider ethical and societal implications, drafting white papers addressed to academic audiences and media editorials, and developing community action plans.

Organizers are planning to hold a panel next spring to engage the public, as well.

"The spring event will help us continue to spread awareness and engage a variety of groups on issues of ethics in AI and quantum computing," Nord said.

The workshop was sponsored by the Kavli Foundation in partnership with the Center for Data and Computing at the University of Chicago. Artificial intelligence and quantum information science are two of six initiatives identified as special priority by the Department of Energy Office of Science.

The Kavli Foundation is dedicated to advancing science for the benefit of humanity, promoting public understanding of scientific research, and supporting scientists and their work. The foundation's mission is implemented through an international program of research institutes, initiatives and symposia in the fields of astrophysics, nanoscience, neuroscience, and theoretical physics, as well as the Kavli Prize and a program in public engagement with science. Visit kavlifoundation.org.

The Chicago Quantum Exchange catalyzes research activity across disciplines and member institutions. It is anchored by the University of Chicago, Argonne National Laboratory, Fermi National Accelerator Laboratory, and the University of Illinois at Urbana-Champaign and includes the University of Wisconsin-Madison, Northwestern University and industry partners. Visit chicagoquantum.org.

Original post:

Shaping the technology transforming our society - Fermi National Accelerator Laboratory

Where will the first big gains in quantum computing be? – Quantaneo, the Quantum Computing Source

Current quantum computers are far from where we need them to be for practical applications due to their high level of noise (errors). If we cannot find a way to use these current and near-term quantum computers, we will need to wait for fully error-corrected universal machines to be developed to see real significant benefit (15-20 years by many estimates). This is where the software becomes much more than a necessary complement to the hardware. Quantum software has the potential to significantly accelerate our pathway to practically useful quantum computers.

Quantum algorithms

Most quantum algorithms developed to date cannot be run on near-term quantum computers; however, there are some that can. One particular class of algorithm, variational quantum algorithms, is a lead contender for being able to demonstrate near-term quantum advantage.

Variational quantum algorithms

These algorithms allow users to change control parameters of the quantum computer until results match a target property, such as the energy of a molecule, which is highly relevant to battery manufacturing, room-temperature superconductivity, drug discovery and fertilizer manufacturing. Variational quantum algorithms have already been used to successfully simulate small chemical systems on quantum computers over the last two years, by our team at Rahko and a small handful of teams across the globe.

Chemical simulation

Broadly speaking, in chemical simulation we look at two types of calculations:

1. Fast, low-cost, low-precision calculations that neglect exact quantum properties

2. High-precision, high-cost calculations

Typically, the first type of calculation is used to filter large pools of candidates, such as candidate drugs. Once a pool has been filtered to a much smaller pool, the second type of calculation is performed to verify exact candidate properties. This mix allows an optimal use of computational resources.

Quantum computing will likely not directly help with the first type of calculation (low-cost, low-precision), as quantum computing is inherently more expensive and slower. Machine learning (ML)-based approaches, however, do offer a speedup here. At Rahko, part of our work is in developing classical ML approaches to deliver faster classical solutions for this type of calculation. We can then use quantum computers to generate training data to improve classical ML algorithms.

For the second type of calculation (high-cost, high-precision), quantum computers will bring far greater accuracy at reduced cost. Most importantly, quantum computers will be able to produce accurate simulations where classical methods fail. This will be a game-changing improvement when working with strongly correlated materials, which play a huge role in batteries and room-temperature superconductivity. However, we still face the problem of noise in near-term machines. There is a solution: quantum machine learning.

Quantum machine learning (QML)

Over the past two years, ML-based approaches to running quantum algorithms have borne out powerful results. Several algorithms have been proposed that, when combined, allow QML approaches to be 10,000,000 times faster than traditional variational quantum algorithms. This means that QML approaches will enable practical gains months, even years, before other variational quantum methods succeed. Our team at Rahko is working hard to deliver these gains: for the past two years we have developed Hyrax, a QML platform that allows us to rapidly build, test and deploy QML algorithms. Hyrax relies heavily on variational quantum algorithms and powers all of our state-of-the-art research, helping us to push forward on a QML-enabled pathway to the first commercially valuable practical applications of quantum computing.

With Hyrax, we aim to follow in the footsteps of world-leading, UK-born quantum chemistry software, in the tradition of packages such as ONETEP and CASTEP.

The UK quantum future

I strongly believe that QML will play a key role in the UK's quantum future. Investment in QML talent and ventures will give the UK an opportunity to uphold its leading role in quantum chemistry, and a lead role in global quantum computing at large. This piece was first published as a guest blog post on techUK as part of the techUK Quantum Future Campaign week.
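The variational loop described above (a classical optimizer steering the parameters of a quantum trial state until its energy matches a target) can be sketched entirely classically. This is a toy illustration, not Rahko's actual method: the 2x2 Hamiltonian is invented, and a brute-force parameter sweep stands in for a real optimizer.

```python
import numpy as np

# Toy Hamiltonian: a 2x2 Hermitian matrix standing in for a molecular energy operator.
H = np.array([[1.0, 0.5],
              [0.5, -1.0]])

def ansatz(theta):
    # Parameterized single-qubit trial state: cos(theta)|0> + sin(theta)|1>.
    return np.array([np.cos(theta), np.sin(theta)])

def energy(theta):
    # Expectation value <psi|H|psi>, the quantity a real device would estimate.
    psi = ansatz(theta)
    return psi @ H @ psi

# Classical outer loop: sweep the control parameter and keep the minimum.
thetas = np.linspace(0, np.pi, 1000)
best = min(thetas, key=energy)

# The exact ground-state energy, for comparison.
exact = np.linalg.eigvalsh(H)[0]
print(energy(best), exact)  # both approximately -1.118
```

On hardware, the energy evaluation runs on the noisy quantum processor while the sweep (or a gradient-based optimizer) stays classical, which is why this family of algorithms tolerates near-term noise better than most.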

Visit link:

Where will the first big gains in quantum computing be? - Quantaneo, the Quantum Computing Source

Woodside joins IBM's quantum computing network and flags further AI advances – Which-50

Oil and gas giant Woodside Energy announced a new collaboration with IBM to continue to advance its AI efforts and explore use cases for quantum computing.

As part of the collaboration Woodside will become a member of the MIT-IBM Watson AI Lab, which is a collaborative industrial-academic laboratory focused on advancing fundamental AI research.

Woodside is also the first commercial Australian organisation to join the IBM Q Network, a community of Fortune 500 companies, academic institutions, start-ups and national research labs working with IBM to advance quantum computing.

Woodside and IBM will use quantum computing to conduct deep computational simulations across the value chain of Woodside's business, the companies said.

Speaking at IBM's Cloud Innovation Exchange in Sydney yesterday, Woodside CEO Peter Coleman explained quantum computing could help with cybersecurity efforts to protect critical infrastructure as well as with "the basic physics of what happens in our plant," particularly around flow assurance, leading to more accurate predictions for the business operations.

"We can see those things coming, and they're coming very, very rapidly. And I think those who are not already dealing with [it] are going to get left behind, very quickly," Coleman said.

The announcement builds on Woodside's five-year relationship with IBM, centred largely around cognitive projects.

Looking back to 2013, Coleman said the company saw promising results from its data and analytics practice and wanted to make a big bet on AI.

"Rather than do the easy stuff, which is generally put AI in a call centre, I said we've got to go holistically at this and we will go straight into it as a company," he said.

The first use case the company selected was an AI system which responds to staff queries to surface the most relevant information from the company's corpus. There are now 25 million documents loaded in Watson and 80 per cent of employees use Watson on a daily basis, Coleman said.

Coleman flagged further AI use cases as the company embarks on its next wave of mega projects. Woodside is planning to spend US$30 billion on projects over the next six years and will use AI to identify materials and check if they match what has been ordered.

The CEO also expects AI to cut Woodside's US$1 billion maintenance bill by as much as 30 per cent by using AI to identify insulated cladding which has corroded.

Woodside is also working to build a cognitive plant that is able to operate itself, with assistance from NASA.

Commenting on the partnership, IBM CEO Ginni Rometty said, "IBM is excited to join with Woodside, one of our first Watson clients globally, to help enable their pioneering vision of developing an intelligent plant."

"Together, Woodside and IBM will push the frontiers of innovation, working with the world's most advanced researchers in quantum computing and next generation AI."

Read more from the original source:

Woodside joins IBMs quantum computing network and flags further AI advances - Which-50

Blockchain Must Solve These 3 Issues to Avoid Quantum Threat: Expert – Cointelegraph

The blockchain community should immediately begin working on three issues to prevent being overtaken by quantum computers, a cryptography expert says.

Xinxin Fan, head of cryptography at privacy- and IoT-focused blockchain platform IoTeX, published an article in The International Business Times on Nov. 7, calling on the blockchain community to stay up to date about the progress being made on quantum computers.

While reiterating that short-term developments in quantum computing are modest, Fan argued that blockchains will have to keep pace to avoid being overtaken by quantum computers as the technology grows and improves.

As such, Fan outlined three major directions for the blockchain community to address as soon as possible, which are the standardization of quantum-resistant cryptography, cryptographic agility and blockchain governance.

According to the expert, the first direction is to standardize quantum-resistant cryptography as it develops. Fan noted that a standardization effort for quantum-resistant cryptography has already been initiated by the National Institute of Standards and Technology.

Stressing the need for such standardization, Fan wrote:

"Developing and implementing capabilities specifically designed to resist quantum computers will be key for the future of blockchains, as well as their survival. Blockchain supporters and developers should therefore closely monitor the standardization process and prepare to integrate the results into existing and future blockchain projects."

Next is cryptographic agility. Simply put, this concerns developers' ability to implement quantum-resistant upgrades to existing blockchain networks.

The expert cited the Ethereum network as an example, emphasizing the importance of such platforms being able to regularly upgrade their systems due to the large number of projects that depend on them.
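Cryptographic agility of the kind Fan describes is, at bottom, an engineering pattern: route all signing through a named scheme so the algorithm can be swapped without touching calling code. A minimal sketch of that pattern; the scheme names and the `sign` API are invented for illustration, and plain hashes stand in for real signature algorithms (a production system would plug in actual classical and post-quantum schemes).

```python
import hashlib

# Registry mapping scheme names to signing functions.
SCHEMES = {}

def register(name):
    def deco(fn):
        SCHEMES[name] = fn
        return fn
    return deco

@register("classical-demo")
def sign_classical(key: bytes, msg: bytes) -> bytes:
    # Toy stand-in for a classical signature scheme.
    return hashlib.sha256(key + msg).digest()

@register("pq-demo")
def sign_pq(key: bytes, msg: bytes) -> bytes:
    # Toy stand-in for a quantum-resistant scheme (e.g. a lattice-based signature).
    return hashlib.sha3_256(key + msg).digest()

def sign(scheme: str, key: bytes, msg: bytes) -> bytes:
    # Callers name the scheme instead of hard-coding an algorithm.
    return SCHEMES[scheme](key, msg)

# Upgrading to a quantum-safe scheme becomes a configuration change, not a rewrite:
sig = sign("pq-demo", b"key", b"block header")
```

The point of the indirection is exactly what Fan argues: when the NIST process settles on standard algorithms, a network built this way can deploy them without re-plumbing every component.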

The third important issue is blockchain governance. According to Fan, blockchain projects must set up procedures to clearly define when and how to deploy quantum-safe upgrades to their networks.

Given the difficulty blockchains have faced in establishing optimal governance structures, the expert argued that the blockchain community should start seriously thinking and experimenting with ways to ensure governance is not a hindrance to the improvement of technology.

He concluded:

"There is no doubt that quantum computing is coming, and it will have major effects across the technology space. But those who believe that its simple existence is a death knell for blockchain fail to consider that the latter will grow and evolve alongside quantum computing. There is much that can be done to make blockchains more dynamic and robust, and if we do those things, we will not have to worry about quantum supremacy any time soon."

On Oct. 25, Ethereum co-founder Vitalik Buterin delivered his opinion on the issue of quantum supremacy, saying:

"My one-sentence impression of recent quantum supremacy stuff so far is that it is to real quantum computing what hydrogen bombs are to nuclear fusion. Proof that a phenomenon and the capability to extract power from it exist, but still far from directed use toward useful things."

Previously, Bitcoin (BTC) educator Andreas Antonopoulos claimed that Google's latest developments in quantum computing have had no impact on Bitcoin.

Continued here:

Blockchain Must Solve These 3 Issues to Avoid Quantum Threat: Expert - Cointelegraph

ORNL’s Humble Tapped to Lead New ACM Quantum Computing Journal – Quantaneo, the Quantum Computing Source

The journal focuses on the theory and practice of quantum computing, a new discipline that applies the principles of quantum mechanics to computation and has enormous potential for innovation across the scientific spectrum. Quantum computers use units known as qubits to greatly increase the amount of information that can be processed at once. Whereas a traditional bit has a value of either 0 or 1, a qubit can exist in a superposition of 0 and 1, in any combination, at the same time, allowing for a vast number of possibilities for storing data.
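The "both 0 and 1" description is shorthand for superposition: a qubit's state is a pair of complex amplitudes, and describing n qubits takes 2^n amplitudes, which is where the classical simulation cost explodes. A small NumPy sketch of this (illustrative only, not how a real quantum device is programmed):

```python
import numpy as np

# A single qubit is a unit vector of two complex amplitudes:
# |psi> = a|0> + b|1>, with |a|^2 + |b|^2 = 1.
psi = np.array([1, 1]) / np.sqrt(2)  # equal superposition of 0 and 1

# Measurement probabilities come from the squared amplitudes.
p0, p1 = np.abs(psi) ** 2
print(p0, p1)  # 0.5 0.5: equally likely to read out 0 or 1

# n qubits need 2**n amplitudes, which is why classically simulating
# even ~50 qubits is so expensive.
n = 53  # Sycamore's qubit count
print(2 ** n)  # 9007199254740992 amplitudes
```

That exponential state space is the resource behind the supremacy result discussed below: storing the full state of the 53-qubit device would take roughly nine quadrillion amplitudes.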

This novel approach to computing is expected to produce systems exponentially more powerful than today's leading classical computing systems. This potential is underscored by the recent demonstration of a quantum processor exceeding the simulation power of ORNL's Summit supercomputer, the fastest and smartest in the world, when running a benchmark known as random circuit sampling.

The simulations took 200 seconds on the 53-qubit quantum computer, which was built by Google and dubbed Sycamore. After running the same simulations on Summit, the team extrapolated that the calculations would have taken the world's most powerful system more than 10,000 years to complete with current state-of-the-art algorithms, providing experimental evidence of quantum supremacy and critical information for the design of future quantum computers. "I am excited by the potential for quantum computers to provide new capabilities for scientific exploration and understanding. The new journal from ACM provides an important forum to discuss how advances in quantum computer science can accelerate the development and application of this exciting technology," said Humble.

Transactions on Quantum Computing will publish its first issue in 2020. According to ACM:

The journal focuses on the theory and practice of quantum computing including but not limited to: models of quantum computing, quantum algorithms and complexity, quantum computing architecture, principles and methods of fault-tolerant quantum computation, design automation for quantum computing, quantum programming languages and systems, distributed quantum computing, quantum networking, issues related to quantum hardware and NISQ implementation, quantum security and privacy, and applications (e.g. in machine learning and AI) of quantum computing.

Humble serves as the Director of ORNL's Quantum Computing Institute (QCI), which currently hosts a concerted effort to harness theory, computation, and experiment to test the capabilities of emerging quantum computing technologies, which can then be applied to the modeling and simulation of complex physical processes. This requires a multidisciplinary team of computer scientists, physicists, and engineers working in concert to advance the field.

The Lab's quantum computing effort is leveraging partnerships with academia, industry, and government to accelerate the understanding of how near-term quantum computing resources might benefit early applications. And in partnership with the laboratory's National Center for Computational Sciences, QCI's Quantum Computing User Program provides early access to existing, commercial quantum computing systems while supporting the development of future quantum programmers through educational outreach and internship programs.

Humble received his doctorate in theoretical chemistry from the University of Oregon before coming to ORNL in 2005. Dr. Humble leads the Quantum Computing Team in the Quantum Information Science Group. He is also an associate professor with the Bredesen Center for Interdisciplinary Research and Graduate Education at the University of Tennessee and an Associate Editor for the Quantum Information Processing journal.

Information about submissions and other aspects of the journal can be found on the journal's new website: https://tqc.acm.org.

Read the rest here:

ORNL's Humble Tapped to Lead New ACM Quantum Computing Journal - Quantaneo, the Quantum Computing Source

Microsoft continues tradition of ‘big and bold’ bets for future – Small Business

Josh Holmes, Microsoft, addresses TechX Dublin 2019 (Image: Microsoft)

TechX cloud conference hears of guiding principles for future development

Read More: 2019 cloud computing Microsoft Microsoft TechX R&D research and development

In association with Microsoft.

Cloud computing was very much the focus of Microsoft's TechX Summit in Dublin, in the context of a platform on which great things could be achieved.

Digital transformation, new business models, new applications, leveraging new technologies such as artificial intelligence (AI), machine learning (ML) and quantum computing, were all highlighted as examples of what can be done.

Cathriona Hallahan, managing director, Microsoft Ireland, talked about technology as a force for good, and co-creating value with partners and customers, highlighting how Microsoft has evolved as a company.

Josh Holmes, principal technical programme manager, Microsoft, in his keynote developed that evolution story.

Citing the founding myths of the likes of Hewlett Packard and other tech companies, he said they often started with a bold claim that was only later backed up. Microsoft, he said, was no different.

When Paul Allen and Bill Gates first pitched the idea of selling software independent of the then target machine, the Altair 8800, they actually had no code to show. Holmes said, despite not having a machine on which to develop their code, the pair simulated it, developed the BASIC interpreter and went ahead anyway. And the rest is now part of the legend of early computing.

"It was a big bet, a bold bet, and that is still infused in Microsoft culture today," said Holmes.

While the Gates-era mission statement of "a computer on every desk and in every home" served Microsoft well over the years, it is now firmly supplanted by the Satya Nadella-era one: "to empower every person and every organisation on the planet to achieve more."

Speaking to TechPro, Holmes said, "This is a mission statement I can believe in. This is one I can get behind."

That statement is already in effect, but it also has a strong future, thanks to a $12 billion investment by Nadella in R&D to make those bold bets, said Holmes.

Guiding those efforts are a few key principles.

The first is to be bold, regardless of size.

The example given to illustrate this was Microsoft's Project Natick, which saw it take a 12 metre pressure vessel, containing some 864 servers with a 27.6 petabyte storage capacity, and sink it 30 metres down off the north coast of Scotland.

The result of an open submission White Paper during a Think Week event, the Natick trial saw the vessel operate for 90 days submerged, enjoying free cooling via ambient sea water and current flows, to operate as a lights-out data centre. It was so successful, a second trial aimed for 18 months submersion.

"Half the world's population lives within 200 km of a coast," said Holmes, and 30 metres down in any ocean there is a consistent low temperature. Plus, renewable energy, via wind, wave and solar, is generally more available in such locations. Thus, Natick made sense on many levels and continues to be developed as a means to deploy fast-commission, close-to-the-customer data centre infrastructure that is of minimal impact to the environment in its operation.

A bold bet, says Holmes, even on a small scale. That scale, he asserts, is held back only by the infrastructure needed to deploy the vessels, not by the compute power, energy needs or endurance of the vessel itself.

Project Natick has applications in many areas, as cloud expands, demanding faster access to data, closer to where it is gathered and used, and edge computing develops apace to accommodate these and other needs.

Another theme for current and future developments is optimism and inclusion. All too often we hear of the obstacles faced by those who are differently abled, whether on a physical, sensory or mental level.

Holmes cited the example of a group called Wounded Warriors that was re-engineering Xbox controllers for wounded veterans who simply wanted to enjoy games. A Microsoft engineer, Matt Hite, heard of the efforts and wanted to help. The result was the Xbox Adaptive Controller. Enhanced with additional capabilities to handle more sensors and inputs, including a co-pilot feature that allows two controllers to be linked, the Adaptive Controller allows players to use whatever range of mobility and control they have to play.

Holmes quoted Bryce Johnson, a senior inclusive designer on the Xbox team: "We are not trying to design for all of us, we are trying to design for each of us."

Having built on the work of the Wounded Warrior efforts and co-created a whole new capability, the controller was priced to make it just as accessible: €89.99 or $99.

Grounded on trust was another key theme for development direction, said Holmes, and this was characterised by Project InnerEye.

This is a project that develops ML techniques for automatic delineation of tumours, as well as healthy anatomy, in 3D radiological images. The technology enables the extraction of targeted radiomics measurements for quantitative radiology, efficient contouring for radiotherapy planning, and precise surgery planning and navigation. This means that InnerEye turns multi-dimensional radiological images into measuring devices. They can map, measure and track the development of tumours for human decision support, allowing far more targeted, effective and timely interventions than was previously possible.

Microsoft has worked with partners on the project, such as Terarecon, to embed the technology in applications developed from long experience in the medical field, and radiology in particular.

"Rather than do all of this ourselves," said Holmes, "we opened it up to the partners to fully implement."

This key approach, allowing partners to fully leverage co-created technologies, said Holmes, has proven hugely beneficial.

According to figures based on financial reports and IDC estimates, in the context of the partner ecosystem, for every $1 Microsoft makes, its partners make $9.64.

The last development principle is that of execution at scale.

Under this heading, Holmes talked about Microsofts work in quantum computing.

Coming full circle to the Allen and Gates big bet, Microsoft is using simulation and emulation to allow the development of code and programming techniques that are preparing developers for the new paradigm of quantum computing.

Q# is a language for developing programs for quantum computers. Microsoft has already made available a quantum developer kit to prepare developers, teams and organisations for the availability of quantum platforms.

"The developer kit, simulation, and free training tools all allow people to think in that mindset," said Holmes. "They will be able to solve problems and be proficient by the time the real platforms become available."
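What such a simulator does under the hood can be sketched directly: multiply gate matrices into a classical state vector. This is a toy illustration in plain NumPy, not the actual Microsoft Quantum Development Kit API, but it is the same idea that lets developers write and test quantum programs before hardware exists.

```python
import numpy as np

# Standard gate matrices.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                 # controlled-NOT gate

state = np.array([1, 0, 0, 0], dtype=float)     # two qubits in |00>
state = np.kron(H, np.eye(2)) @ state           # Hadamard on the first qubit
state = CNOT @ state                            # entangle: Bell state (|00>+|11>)/sqrt(2)

probs = np.abs(state) ** 2
print(probs)  # [0.5 0.  0.  0.5]: only 00 and 11 are ever observed
```

Because the state vector doubles in size with each qubit, this approach tops out at a few dozen qubits on classical hardware, which is exactly why developers can practise on simulators today while waiting for larger physical machines.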

Quantum computing research is already impacting how development is directed today, Holmes confirmed. He said the often-cited example of today's levels of encryption being ineffective in a quantum computing environment is well known, but added it will also have profound effects on identity and access management.

Passwords will be entirely obsolete, with layered biometrics as the likely replacement. But people need to be prepared now for such eventualities, smoothing any transition, he said.

With the advent of efficient quantum machines, such as the topological quantum computer concept, it may become possible to produce realistic simulations of the human brain, and with it the potential for general AI.

This raises serious ethical questions, said Holmes, and requires the development of frameworks for informed and responsible development.

With these guiding principles of being bold, but inclusive and responsible, Microsoft aims to fulfil its mission statement of empowerment for everyone, everywhere, concluded Holmes.

TechCentral Reporters


See the article here:

Microsoft continues tradition of 'big and bold' bets for future - Small Business

5G networks and quantum computing: risks cyber insurers need to wrap their arms around – Insurance Business

The world is moving towards 5G mobile technology, the fifth generation of mobile networks. It's a new kind of network that will interconnect not only people, but also machines, devices and objects. 5G is expected to have a transformative impact on global industry, enabling firms to offer connected services with top performance and efficiency, and at a low cost.

According to Qualcomm, the US-based multinational semiconductor and telecommunications firm, the global 5G standard will advance mobile from largely a set of technologies connecting people-to-people and people-to-information to a unified connectivity fabric connecting people to everything. The firm expects 5G's full economic benefit to be realized worldwide by 2035, when it could produce up to US$12.3 trillion worth of goods and services enabled by 5G mobile technology.


It's a real force to be reckoned with, and something insurers need to be thinking about right now, according to Brad Gow, global cyber product leader at Sompo International. As 5G mobile technology spreads around the world, networks will move away from hardware-based switching to more distributed and software-defined digital routing, which involves many more nodes communicating with each other.

"What that does is it really opens up the surface area that's vulnerable to cyberattacks," said Gow. "And so, as we approach 5G in the next couple of years, we need to think about network security. The way that networks are secured today is going to need to be completely re-thought in order to incorporate all of this technology and all of this new bandwidth. That's going to really change the game for cyber insurers and it's going to be a real challenge for the insurance industry. It has certainly captured my attention because a lot of this technology will be coming online in the next two or three years."

It's not just the emergence of 5G technology that's caught the attention of Sompo International's global cyber product leader. There's also the growing prominence of quantum computing. Data scientists worldwide have tipped quantum computing to change the world in the near future. Sparing the technical details (which are incredibly complex), its benefits are clear: a quantum computer can process massive and complex datasets far more efficiently than classical computers.

"It's still early days in quantum computing, but once the power of computer processing expands exponentially, current encryption technology is going to be rendered obsolete," Gow commented. "That's significant because we depend upon encryption very, very heavily today."
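The exponential expansion Gow refers to can be made concrete: an n-qubit register is described by 2^n complex amplitudes, which is why classical machines cannot keep up with even modestly sized quantum systems. A back-of-the-envelope Python sketch (the byte figures are illustrative assumptions, not from the article) shows how quickly classical simulation becomes intractable:

```python
# Memory needed just to store the state vector of an n-qubit register,
# assuming 16 bytes per complex amplitude (two 64-bit floats).
BYTES_PER_AMPLITUDE = 16

def state_vector_bytes(n_qubits: int) -> int:
    """An n-qubit state has 2**n complex amplitudes."""
    return (2 ** n_qubits) * BYTES_PER_AMPLITUDE

for n in (10, 30, 50):
    gib = state_vector_bytes(n) / 2**30
    print(f"{n} qubits -> {gib:,.6g} GiB")
```

Storing 30 qubits already takes 16 GiB, and 50 qubits would need millions of gibibytes; a quantum machine holds that state natively. The same scaling is what threatens public-key encryption, whose security rests on problems that are slow for classical computers but tractable for large quantum ones.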


While quantum computers in the wrong hands could pose serious cybersecurity challenges, in the right hands they're doing a world of good. Quantum computers are already being used to reinvent aspects of cybersecurity because of their ability to break codes and encrypt electronic communications. What insurers need to do, according to Gow, is keep their fingers on the pulse of these advancements.

"We're in a period of rapid technological change," he told Insurance Business. "To some degree, it could be argued that the insurance industry has gotten a little bit over its skis in terms of the breadth of coverage it's offering, and the prices it's offering, especially when there are so many variables that have the ability to threaten corporate networks, not only today but over the next few years. It will be interesting to see how this all develops."

See the original post:

5G networks and quantum computing: risks cyber insurers need to wrap their arms around - Insurance Business