

Where will the first big gains in quantum computing be? – Quantaneo, the Quantum Computing Source

Current quantum computers are far from where we need them to be for practical applications because of their high level of noise (errors). If we cannot find a way to use these current and near-term machines, we will need to wait for fully error-corrected universal machines to be developed (15 to 20 years away by many estimates) before seeing really significant benefit. This is where software becomes much more than a necessary complement to the hardware: quantum software has the potential to significantly accelerate our pathway to practically useful quantum computers.

Quantum algorithms

Most quantum algorithms developed to date cannot be run on near-term quantum computers; however, there are some that can. One particular class, variational quantum algorithms, is a lead contender for demonstrating near-term quantum advantage.

Variational quantum algorithms

These algorithms allow users to change the control parameters of a quantum computer until its results match a target property, such as the energy of a molecule, which is highly relevant to battery manufacturing, room-temperature superconductivity, drug discovery and fertilizer manufacturing. Over the last two years, variational quantum algorithms have already been used to successfully simulate small chemical systems on quantum computers, by our team at Rahko and a small handful of teams across the globe.

Chemical simulation

Broadly speaking, in chemical simulation we look at two types of calculation:

1. Fast, low-cost, low-precision calculations that neglect exact quantum properties

2. High-precision, high-cost calculations

Typically, the first type of calculation is used to filter large pools of candidates, such as candidate drugs. Once a pool has been filtered down to a much smaller one, the second type of calculation is performed to verify exact candidate properties. This mix allows an optimal use of computational resources.

Quantum computing will likely not directly help with the first type of calculation (low-cost, low-precision), as quantum computing is inherently more expensive and slower. Machine learning (ML)-based approaches, however, do offer a speedup here. At Rahko, part of our work is in developing classical ML approaches that deliver faster classical solutions for this type of calculation. We can then use quantum computers to generate training data to improve the classical ML algorithms.

For the second type of calculation (high-cost, high-precision), quantum computers will bring far greater accuracy at reduced cost. Most importantly, quantum computers will be able to produce accurate simulations where classical methods fail. This will be a game-changing improvement when working with strongly correlated materials, which play a huge role in batteries and room-temperature superconductivity. However, we still face the problem of noise in near-term machines. There is a solution: quantum machine learning.

Quantum machine learning (QML)

Over the past two years, ML-based approaches to running quantum algorithms have produced powerful results. Several algorithms have been proposed that, when combined, allow QML approaches to run as much as 10,000,000 times faster than traditional variational quantum algorithms. This means that QML approaches could enable practical gains months, even years, before other variational quantum methods succeed. Our team at Rahko is working hard to deliver these gains: for the past two years we have been developing Hyrax, a QML platform that allows us to rapidly build, test and deploy QML algorithms.
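To make the variational idea above concrete, here is a minimal sketch of such a loop in Python. It is illustrative only: the 2x2 toy Hamiltonian, the single-parameter trial state and the grid search are hypothetical stand-ins for the molecular Hamiltonians, parameterised quantum circuits and optimisers a real variational algorithm would use.

```python
import numpy as np

# Toy Hamiltonian: a 2x2 Hermitian matrix standing in for a molecular Hamiltonian.
H = np.array([[1.0, 0.5],
              [0.5, -1.0]])

def ansatz(theta):
    """Single-parameter trial state |psi(theta)> = cos(theta)|0> + sin(theta)|1>."""
    return np.array([np.cos(theta), np.sin(theta)])

def energy(theta):
    """Expectation value <psi|H|psi>, the quantity the variational loop minimises."""
    psi = ansatz(theta)
    return float(psi @ H @ psi)

# Classical outer loop: sweep the control parameter and keep the best result.
thetas = np.linspace(0, np.pi, 200)
energies = [energy(t) for t in thetas]
best = int(np.argmin(energies))
print(f"estimated ground-state energy: {energies[best]:.4f} at theta={thetas[best]:.3f}")
print(f"exact ground-state energy:     {np.linalg.eigvalsh(H)[0]:.4f}")
```

The essential structure survives at full scale: a quantum device prepares and measures the trial state, while a classical optimiser adjusts the control parameters until the measured property matches the target.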
Hyrax relies heavily on variational quantum algorithms and powers all of our state-of-the-art research, helping us to push forward on a QML-enabled pathway to the first commercially valuable practical applications of quantum computing. With Hyrax, we aim to follow in the footsteps of world-leading, UK-born quantum chemistry software, in the tradition of packages such as ONETEP and CASTEP.

The UK quantum future

I strongly believe that QML will play a key role in the UK's quantum future. Investment in QML talent and ventures will give the UK an opportunity to uphold its leading role in quantum chemistry, and to take a lead role in global quantum computing at large.

This piece was first published as a guest blog post on techUK as part of the techUK Quantum Future Campaign week.


Woodside joins IBM's quantum computing network and flags further AI advances – Which-50

Oil and gas giant Woodside Energy announced a new collaboration with IBM to continue to advance its AI efforts and explore use cases for quantum computing.

As part of the collaboration Woodside will become a member of the MIT-IBM Watson AI Lab, which is a collaborative industrial-academic laboratory focused on advancing fundamental AI research.

Woodside is also the first commercial Australian organisation to join the IBM Q Network, a community of Fortune 500 companies, academic institutions, start-ups and national research labs working with IBM to advance quantum computing.

Woodside and IBM will use quantum computing to conduct deep computational simulations across the value chain of Woodside's business, the companies said.

Speaking at IBM's Cloud Innovation Exchange in Sydney yesterday, Woodside CEO Peter Coleman explained that quantum computing could help with cybersecurity efforts to protect critical infrastructure, as well as with "the basic physics of what happens in our plant", particularly around flow assurance, leading to more accurate predictions for business operations.

"We can see those things coming, and they're coming very, very rapidly. And I think those who are not already dealing with it are going to get left behind, very quickly," Coleman said.

The announcement builds on Woodside's five-year relationship with IBM, centred largely around cognitive projects.

Looking back to 2013, Coleman said the company saw promising results from its data and analytics practice and wanted to make a big bet on AI.

"Rather than do the easy stuff, which is generally put AI in a call centre, I said, we've got to go holistically at this and we will go straight into it as a company," he said.

The first use case the company selected was an AI system which responds to staff queries to surface the most relevant information from the company's corpus. There are now 25 million documents loaded in Watson and 80 per cent of employees use Watson on a daily basis, Coleman said.

Coleman flagged further AI use cases as the company embarks on its next wave of mega projects. Woodside is planning to spend US$30 billion on projects over the next six years and will use AI to identify materials and check if they match what has been ordered.

The CEO also expects AI to cut Woodside's US$1 billion maintenance bill by as much as 30 per cent, by using it to identify insulated cladding which has corroded.

Woodside is also working to build a cognitive plant that is able to operate itself, with assistance from NASA.

Commenting on the partnership, IBM CEO Ginni Rometty said: "IBM is excited to join with Woodside, one of our first Watson clients globally, to help enable their pioneering vision of developing an intelligent plant.

"Together, Woodside and IBM will push the frontiers of innovation, working with the world's most advanced researchers in quantum computing and next generation AI."


Blockchain Must Solve These 3 Issues to Avoid Quantum Threat: Expert – Cointelegraph

The blockchain community should immediately begin working on three issues to prevent being overtaken by quantum computers, a cryptography expert says.

Xinxin Fan, head of cryptography at privacy- and IoT-focused blockchain platform IoTeX, published an article in The International Business Times on Nov. 7, calling on the blockchain community to stay up to date about the progress being made on quantum computers.

While reiterating that short-term developments in quantum computing are modest, Fan argued that blockchains will have to keep pace to avoid being overtaken by quantum computers as the technology grows and improves.

As such, Fan outlined three major directions for the blockchain community to address as soon as possible, which are the standardization of quantum-resistant cryptography, cryptographic agility and blockchain governance.

According to the expert, the first direction is a process to standardize quantum-resistant cryptography as it develops. Fan noted that a standardization effort for quantum-resistant cryptography has already been initiated by the National Institute of Standards and Technology.

Stressing the need for such standardization, Fan wrote:

Developing and implementing capabilities specifically designed to resist quantum computers will be key for the future of blockchains, as well as their survival. Blockchain supporters and developers should therefore closely monitor the standardization process and prepare to integrate the results into existing and future blockchain projects.

Next is cryptographic agility. Simply put, this concerns developers' ability to implement quantum-resistant upgrades to existing blockchain networks.

The expert cited the Ethereum network as an example, emphasizing the importance of such platforms being able to regularly upgrade their systems due to the large number of projects that depend on them.
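To illustrate what that agility can look like in code, here is a minimal Python sketch (the interface and scheme names are hypothetical, not taken from Ethereum or any real project) in which signatures sit behind a common interface, so a quantum-vulnerable scheme can later be swapped for a post-quantum one without rewriting application code:

```python
from abc import ABC, abstractmethod

class SignatureScheme(ABC):
    """Common interface: application code depends on this, never on one scheme."""

    @abstractmethod
    def sign(self, message: bytes, private_key: bytes) -> bytes: ...

    @abstractmethod
    def verify(self, message: bytes, signature: bytes, public_key: bytes) -> bool: ...

class EllipticCurveScheme(SignatureScheme):
    """Stand-in for today's elliptic-curve signatures (quantum-vulnerable)."""
    def sign(self, message, private_key):
        raise NotImplementedError("placeholder for a real ECDSA implementation")
    def verify(self, message, signature, public_key):
        raise NotImplementedError("placeholder for a real ECDSA implementation")

class LatticeScheme(SignatureScheme):
    """Stand-in for a post-quantum scheme from the NIST standardization process."""
    def sign(self, message, private_key):
        raise NotImplementedError("placeholder for a real lattice-based implementation")
    def verify(self, message, signature, public_key):
        raise NotImplementedError("placeholder for a real lattice-based implementation")

# Agility: the active scheme is configuration, not hard-coded logic, so a
# network upgrade (e.g. a hard fork) can swap schemes without touching the
# transaction-validation code that calls sign() and verify().
active_scheme: SignatureScheme = EllipticCurveScheme()

def upgrade_to_post_quantum() -> None:
    global active_scheme
    active_scheme = LatticeScheme()
```

The design point is that the choice of scheme becomes configuration rather than hard-coded logic, which is exactly what a network needs if a quantum-safe upgrade must be deployed on short notice.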

The third important issue is blockchain governance. According to Fan, blockchain projects must set up procedures to clearly define when and how to deploy quantum-safe upgrades to their networks.

Given the difficulty blockchains have faced in establishing optimal governance structures, the expert argued that the blockchain community should start seriously thinking about, and experimenting with, ways to ensure governance is not a hindrance to the improvement of the technology.

He concluded:

There is no doubt that quantum computing is coming, and it will have major effects across the technology space. But those who believe that its simple existence is a death knell for blockchain fail to consider that the latter will grow and evolve alongside quantum computing. There is much that can be done to make blockchains more dynamic and robust and if we do those things, we will not have to worry about quantum supremacy any time soon.

On Oct. 25, Ethereum co-founder Vitalik Buterin delivered his opinion on the issue of quantum supremacy, saying:

My one-sentence impression of recent quantum supremacy stuff so far is that it is to real quantum computing what hydrogen bombs are to nuclear fusion. Proof that a phenomenon and the capability to extract power from it exist, but still far from directed use toward useful things.

Previously, Bitcoin (BTC) educator Andreas Antonopoulos claimed that Google's latest developments in quantum computing have had no impact on Bitcoin.


ORNL’s Humble Tapped to Lead New ACM Quantum Computing Journal – Quantaneo, the Quantum Computing Source

The journal focuses on the theory and practice of quantum computing, a new discipline that applies the principles of quantum mechanics to computation and has enormous potential for innovation across the scientific spectrum. Quantum computers use units known as qubits to greatly increase the threshold at which information can be transmitted and processed. Whereas traditional bits have a value of either 0 or 1, qubits are encoded with values of both 0 and 1, or any combination thereof, at the same time, allowing for a vast number of possibilities for storing data.
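To make the "both 0 and 1" description concrete, here is a minimal Python sketch (illustrative only, using numpy) of a single qubit as a pair of amplitudes: the superposition holds both outcomes, but each measurement collapses it to a definite 0 or 1.

```python
import numpy as np

rng = np.random.default_rng(7)

# A qubit state is a unit vector of two amplitudes (alpha, beta):
# |psi> = alpha|0> + beta|1>, with |alpha|^2 + |beta|^2 = 1.
state = np.array([1.0, 1.0]) / np.sqrt(2)  # equal superposition of 0 and 1

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(state) ** 2

# Measuring collapses the superposition to a definite classical bit.
samples = rng.choice([0, 1], size=1000, p=probs)
counts = np.bincount(samples, minlength=2)
print(f"P(0)={probs[0]:.2f}, P(1)={probs[1]:.2f}")
print(f"1000 measurements: {counts[0]} zeros, {counts[1]} ones")
```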

This novel approach to computing is expected to produce systems exponentially more powerful than today's leading classical computing systems. This potential is underscored by the recent demonstration of a quantum processor exceeding the simulation power of ORNL's Summit supercomputer, the fastest and smartest in the world, when running a benchmark known as random circuit sampling.

The simulations took 200 seconds on the 53-qubit quantum computer, which was built by Google and dubbed Sycamore; after running the same simulations on Summit, the team extrapolated that the calculations would have taken the world's most powerful system more than 10,000 years to complete with current state-of-the-art algorithms, providing experimental evidence of quantum supremacy and critical information for the design of future quantum computers.

"I am excited by the potential for quantum computers to provide new capabilities for scientific exploration and understanding. The new journal from ACM provides an important forum to discuss how advances in quantum computer science can accelerate the development and application of this exciting technology," said Humble.

Transactions on Quantum Computing will publish its first issue in 2020. According to ACM:

The journal focuses on the theory and practice of quantum computing, including but not limited to: models of quantum computing, quantum algorithms and complexity, quantum computing architecture, principles and methods of fault-tolerant quantum computation, design automation for quantum computing, quantum programming languages and systems, distributed quantum computing, quantum networking, issues related to quantum hardware and NISQ implementation, quantum security and privacy, and applications (e.g. in machine learning and AI) of quantum computing.

Humble serves as the Director of ORNL's Quantum Computing Institute (QCI), which currently hosts a concerted effort to harness theory, computation, and experiment to test the capabilities of emerging quantum computing technologies, which can then be applied to the modeling and simulation of complex physical processes. This requires a multidisciplinary team of computer scientists, physicists, and engineers working in concert to advance the field.

The Lab's quantum computing effort is leveraging partnerships with academia, industry, and government to accelerate the understanding of how near-term quantum computing resources might benefit early applications. And in partnership with the laboratory's National Center for Computational Sciences, QCI's Quantum Computing User Program provides early access to existing, commercial quantum computing systems while supporting the development of future quantum programmers through educational outreach and internship programs.

Humble received his doctorate in theoretical chemistry from the University of Oregon before coming to ORNL in 2005. Dr. Humble leads the Quantum Computing Team in the Quantum Information Science Group. He is also an associate professor with the Bredesen Center for Interdisciplinary Research and Graduate Education at the University of Tennessee and an Associate Editor for the Quantum Information Processing journal.

Information about submissions and other aspects of the journal can be found on the journal's new website: https://tqc.acm.org.


Microsoft continues tradition of ‘big and bold’ bets for future – Small Business

Josh Holmes, Microsoft, addresses TechX Dublin 2019 (Image: Microsoft)

TechX cloud conference hears of guiding principles for future development



In association with Microsoft.

Cloud computing was very much the focus of Microsoft's TechX Summit in Dublin, in the context of a platform on which great things could be achieved.

Digital transformation, new business models, new applications, leveraging new technologies such as artificial intelligence (AI), machine learning (ML) and quantum computing, were all highlighted as examples of what can be done.

Cathriona Hallahan, managing director, Microsoft Ireland, talked about technology as a force for good, and co-creating value with partners and customers, highlighting how Microsoft has evolved as a company.

Josh Holmes, principal technical programme manager, Microsoft, in his keynote developed that evolution story.

Citing the founding myths of the likes of Hewlett Packard and other tech companies, he said they often started with a bold claim that was only later backed up. Microsoft, he said, was no different.

When Paul Allen and Bill Gates first pitched the idea of selling software independent of the then target machine, the Altair 8800, they actually had no code to show. Holmes said, despite not having a machine on which to develop their code, the pair simulated it, developed the BASIC interpreter and went ahead anyway. And the rest is now part of the legend of early computing.

"It was a big bet, a bold bet, and that is still infused in Microsoft culture today," said Holmes.

While the Gates-era mission statement of "a computer on every desk and in every home" served Microsoft well over the years, it is now firmly supplanted by the Satya Nadella-era one: "to empower every person and every organisation on the planet to achieve more".

Speaking to TechPro, Holmes said: "This is a mission statement I can believe in. This is one I can get behind."

That statement is already in effect, but it also has a strong future, thanks to a $12 billion investment by Nadella in R&D to make those bold bets, said Holmes.

Guiding those efforts are a few key principles.

The first is to be bold, regardless of size.

The example given to illustrate this was Microsoft's Project Natick, which saw it take a 12 metre pressure vessel, containing some 864 servers with a 27.6 petabyte storage capacity, and sink it 30 metres down off the north coast of Scotland.

The result of an open submission White Paper during a Think Week event, the Natick trial saw the vessel operate for 90 days submerged, enjoying free cooling via ambient sea water and current flows, to operate as a lights-out data centre. It was so successful, a second trial aimed for 18 months submersion.

"Half the world's population lives within 200 km of a coast," said Holmes, and 30 metres down in any ocean there is a consistent low temperature. Plus, renewable energy, via wind, wave and solar, is generally more available in such locations. Thus, Natick made sense on many levels and continues to be developed as a means to deploy fast-commission, close-to-the-customer data centre infrastructure that has minimal environmental impact in its operation.

A bold bet, says Holmes, even on a small scale; though that scale, he asserts, is held back only by the infrastructure needed to deploy the vessels, not by the compute power, energy needs or endurance of the vessel itself.

Project Natick has applications in many areas, as cloud expands, demanding faster access to data, closer to where it is gathered and used, and edge computing develops apace to accommodate these and other needs.

Another theme for current and future developments is optimism and inclusion. All too often we hear of the obstacles faced by those who are differently abled, whether on a physical, sensory or mental level.

Holmes cited the example of a group called Wounded Warriors that was re-engineering Xbox controllers for wounded veterans who simply wanted to enjoy games. A Microsoft engineer, Matt Hite, heard of the efforts and wanted to help. The result was the Xbox Adaptive Controller. Enhanced with additional capabilities to handle more sensors and inputs, including a co-pilot feature that allows two controllers to be linked, the Adaptive Controller allows players to use whatever range of mobility and control they have to play.

Holmes quoted Bryce Johnson, a senior inclusive designer on the Xbox team: "We are not trying to design for all of us, we are trying to design for each of us."

Having built on the work of the Wounded Warrior efforts and co-created a whole new capability, the controller was priced to make it just as accessible: €89.99 or $99.

"Grounded on trust" was another key theme for development direction, said Holmes, and this was characterised by Project InnerEye.

This is a project that develops ML techniques for automatic delineation of tumours, as well as healthy anatomy, in 3D radiological images. The technology enables the extraction of targeted radiomics measurements for quantitative radiology, efficient contouring for radiotherapy planning, and precise surgery planning and navigation. This means that InnerEye turns multi-dimensional radiological images into measuring devices. They can map, measure and track the development of tumours for human decision support, allowing far more targeted, effective and timely interventions than was previously possible.

Microsoft has worked with partners on the project, such as Terarecon, to embed the technology in applications developed from long experience in the medical field, and radiology in particular.

"Rather than do all of this ourselves," said Holmes, "we opened it up to the partners to fully implement."

This key approach, allowing partners to fully leverage co-created technologies, said Holmes, has proven hugely beneficial.

According to figures based on financial reports and IDC estimates, in the context of the partner ecosystem, for every $1 Microsoft makes, its partners make $9.64.

The last development principle is that of execution at scale.

Under this heading, Holmes talked about Microsoft's work in quantum computing.

Coming full circle to the Allen and Gates big bet, Microsoft is using simulation and emulation to allow the development of code and programming techniques that are preparing developers for the new paradigm of quantum computing.

Q# is a language in which to develop for quantum computers. Microsoft has already made available a quantum developer kit to prepare developers, teams and organisations for the availability of quantum platforms.

"The developer kit, simulation, and free training tools all allow people to think in that mindset," said Holmes. "They will be able to solve problems and be proficient by the time the real platforms become available."

Quantum computing research is already impacting how development is directed today, Holmes confirmed. He said the often-cited example of today's levels of encryption being ineffective in a quantum computing environment is well known, but added that it will also have profound effects on identity and access management.

Passwords will be entirely obsolete, with layered biometrics as the likely replacement. But people need to be prepared now for such eventualities, smoothing any transition, he said.

With the advent of efficient quantum machines, such as the topological quantum computer concept, it may become possible to produce realistic simulations of the human brain, and with it the potential for general AI.

This raises serious ethical questions, said Holmes, and requires the development of frameworks for informed and responsible development.

With these guiding principles of being bold, but inclusive and responsible, Microsoft aims to fulfil its mission statement of empowerment for everyone, everywhere, concluded Holmes.

TechCentral Reporters



5G networks and quantum computing: risks cyber insurers need to wrap their arms around – Insurance Business

The world is moving towards 5G mobile technology, the fifth generation of mobile network. It's a new kind of network that will interconnect not only people, but also machines, devices and objects. 5G is expected to have a totally transformative impact on global industry, enabling firms to offer connected services with top performance and efficiency, and at a low cost.

According to multinational semiconductor and telecommunications firm Qualcomm, based in the US, the global 5G standard will advance mobile from largely a set of technologies connecting people-to-people and people-to-information to a unified connectivity fabric connecting people to everything. The firm expects 5G's full economic benefit to be realized worldwide by 2035, when it could produce up to US$12.3 trillion worth of goods and services enabled by 5G mobile technology.


It's a real force to be reckoned with, and something insurers need to be thinking about right now, according to Brad Gow (pictured), global cyber product leader at Sompo International. As 5G mobile technology spreads around the world, networks will move away from hardware-based switching to more distributed, software-defined digital routing, which involves many more nodes communicating with each other.

"What that does is it really opens up the surface area that's vulnerable to cyberattacks," said Gow. "And so, as we approach 5G in the next couple of years, we need to think about network security. The way that networks are secured today is going to need to be completely re-thought in order to incorporate all of this technology and all of this new bandwidth. That's going to really change the game for cyber insurers and it's going to be a real challenge for the insurance industry. It has certainly captured my attention because a lot of this technology will be coming online in the next two or three years."

It's not just the emergence of 5G technology that's caught the attention of Sompo International's global cyber product leader. There's also the growing prominence of quantum computing. Data scientists worldwide have tipped quantum computing to change the world in the near future. Sparing the technical details of quantum computing (which are incredibly complex), its benefits are clear: it can process massive and complex datasets much more efficiently than classical computers.

"It's still early days in quantum computing, but once the power of computer processing expands exponentially, current encryption technology is going to be rendered obsolete," Gow commented. "That's significant because we depend upon encryption very, very heavily today."


While quantum computers in the wrong hands could pose some serious cybersecurity challenges, in the right hands theyre doing the world of good. Quantum computers are already being used to reinvent aspects of cybersecurity because of their ability to break codes and encrypt electronic communications. What insurers need to do, according to Gow, is keep their fingers on the pulse of these advancements.

"We're in a period of rapid technological change," he told Insurance Business. "To some degree, it could be argued that the insurance industry has gotten a little bit over its skis in terms of the breadth of coverage it's offering, and the prices it's offering, especially when there are so many variables that have the ability to threaten corporate networks, not only today but over the next few years. It will be interesting to see how this all develops."


Microsoft CEO says Azure Quantum will address the big challenges in computing – GeekWire

Microsoft CEO Satya Nadella introduces the company's new initiatives in quantum computing at the Microsoft Ignite conference in Orlando, Fla. (Microsoft Video)

Microsoft CEO Satya Nadella today took the wraps off Azure Quantum, a full-stack, cloud-based approach to quantum computing that he said would play well with traditional computational architectures.

"With all the capacity we have around computing, we still have many unsolved problems, whether it's around food safety, or climate change, or the energy transition," Nadella said at the Microsoft Ignite conference in Orlando, Fla. "These are big challenges that need more computing. We need general-purpose quantum."

While classical computers deal in binary bits of ones and zeroes, quantum computers can take advantage of spooky physics to process quantum bits or qubits that can represent multiple values simultaneously.

For years, Microsoft and its rivals have been laying the groundwork for general-purpose quantum computing hardware and software. Microsoft has previously announced some elements of its strategy, including its Q# programming language and Quantum Development Kit, but today Nadella put all the pieces together.

A private preview of Azure Quantum is due to be launched in the coming months, with the Microsoft Quantum Network serving as the primary point of contact for developers and startups. "We'll have a variety of hardware solutions that are all going to be open in Azure," Nadella said.

Microsoft's hardware partners include IonQ and Honeywell, which are working on quantum computing systems based on trapped ions; as well as Quantum Circuits Inc., which uses Lego-like assemblies of superconducting circuits.

Nadella said Azure Quantum will offer a complete toolkit of open-source software, including Microsoft's Q# and QDK, as well as 1QBit's software platform and services.

End-to-end quantum computing based on Microsoft's topological qubit architecture may not yet be ready for prime time, but Nadella highlighted a "quantum on classical" approach, in which quantum tools are used alongside classical computation to optimize the algorithms for simulating complex phenomena.

"We've seen, in fact, many use cases already, across health care, across finance and the electrical grid as well," Nadella said.

He threw a video spotlight on a medical diagnostic technique called magnetic resonance fingerprinting, which is being pioneered by Case Western Reserve University and the Cleveland Clinic.

The technique uses quantum-inspired algorithms to optimize MRI scans, based on the patient's precise position inside the scanner. Once the scan is done, the 3-D visualization can be viewed using Microsoft's HoloLens augmented-reality headset.

"Working with Azure has given us improvement in speed and about a 30% improvement in precision," Mark Griswold, a professor of radiology at CWRU, said on the video. "The results we're getting are allowing us to see diseases earlier than before, and to quantify the treatments that we're giving."

Other early users include:

Microsoft isn't alone in efforts to explore the frontiers of quantum computing. D-Wave Systems, which is headquartered in Burnaby, B.C., has been developing a cloud-based service that takes advantage of a special-purpose optimization technology known as quantum annealing.

Meanwhile, IBM, Google and other heavyweights of the computer industry are neck-and-neck with Microsoft in the race to create general-purpose quantum devices. Just a couple of weeks ago, Google researchers and their partners published a research paper claiming that they had achieved quantum supremacy over classical computation for a specific algorithm that generates random numbers.


Quantum computers: why Google, NASA and others are putting their chips on these dream machines – World Economic Forum

In 1936, Alan Turing proposed the Turing machine, which became the foundational reference point for theories about computing and computers. Around the same time, Konrad Zuse invented the Z1 computer, considered to be the first electromagnetic binary computer.

What happened next is history, and in our world today, computers are everywhere. Our lives are dramatically different from how they were even at the end of the 20th century, and our mobile phones have far more powerful CPUs than desktop computers did only few years ago. The advent of the Internet of Things brings computer power into every minute detail of our lives. The world wide web has had such a transformative effect on society that many people can't even remember a life before they were online.

The major catalyst behind this transformation was the discovery of the semiconductor properties of silicon, and its use in the production of good transistors. This occurred over a period of more than 100 years, dating from when Michael Faraday first recorded the semiconductor effect in 1833, via Morris Tanenbaum, who built the first silicon transistor at Bell Labs in 1954, to the first integrated circuit in 1960.

We are about to embark on a similar journey in our quest for building the next-generation computer. Quantum physics, which emerged in the early 20th century, is so powerful and yet so unlike anything known before that even the inventors had a hard time understanding it in detail.

In the early 1980s, Richard Feynman, Paul Benioff and Yuri Manin provided the groundwork for a completely new paradigm of quantum computing, introducing the idea that quantum computing had the potential to solve problems that classical computing could not. And so quantum computing came into its own.

Peter Shor published an algorithm in 1994 capable of efficiently solving problems in cryptography that are hard for classical computers, that is, the vast majority of computers used today. In fact, Shor's algorithm continues to threaten the foundations of most encryption deployed across the globe.

The problem was that, in 1994, there was no quantum computer in sight. In 1997, the first tiny quantum computer was built, but the field really took off only when the Canadian startup D-Wave revealed its 28-qubit quantum computer in 2007.

Similar to the trajectory of non-quantum communication, which took more than 100 years from discovery to mass use, quantum computers are now maturing very quickly. Today, many players are engaged in a battle over who can build the first powerful quantum computer. These include commercial entities such as IonQ, Rigetti, IBM, Google, Alibaba, Microsoft and Intel, while virtually all major nation states are spending billions of dollars on quantum computing development and research.

Quantum computers are powerful yet so difficult to build that whoever can crack the code will have a lasting, powerful advantage. This cannot be overstated. Here's a striking example of the power of quantum computing.

Quantum leaps: growth over the years

Image: Statista

To break a widely used RSA 2048-bit encryption, a classical computer with one trillion operations per second would need around 300 trillion years. This is such a long time that we all feel very safe.

A quantum computer using Shor's algorithm could achieve the same feat in just 10 seconds, with a modest 1 million operations per second. That's the power of quantum computers: 300 trillion years versus 10 seconds.
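Taking the article's figures at face value, a short back-of-envelope script makes the comparison explicit: the gap comes not from clock speed (the quantum machine here is assumed to be a million times slower) but from Shor's algorithm needing astronomically fewer operations. The numbers below are the ones quoted above, not independent estimates.

```python
SECONDS_PER_YEAR = 3.156e7

# Classical figure quoted above: 300 trillion years at 1e12 operations/second.
classical_ops = 300e12 * SECONDS_PER_YEAR * 1e12
print(f"implied classical workload: {classical_ops:.1e} operations")  # ~9.5e33

# Quantum figure quoted above: 10 seconds at 1e6 operations/second.
quantum_ops = 10 * 1e6
print(f"implied quantum workload:   {quantum_ops:.1e} operations")    # 1.0e7

# The speedup is algorithmic: ~1e27 times fewer operations on a far slower machine.
print(f"ratio of workloads: {classical_ops / quantum_ops:.1e}")
```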

Another reason why nation states pour so much money into the field is precisely because, with it being so difficult, any achievement will directly yield a lasting advantage.

So where are quantum computers today, and where are they headed?

Considering the immense challenges to building quantum computers, I'd say we are roughly where we were in around 1970 with classical computers. We have some quantum computers, but they are still pretty unreliable compared to today's standard. We call them NISQ devices - Noisy Intermediate-Scale Quantum devices. Noisy because they are pretty bad, and intermediate-scale because of their small qubit number. But they work. There are a few public quantum computers available for anyone to programme on. IBM, Rigetti, Google and IonQ all provide public access with open-source tools to real quantum computing hardware. IBM even sells a quantum computer that you can put in your own data centre (the IBM Q System One).
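As a concrete taste of that public access, here is a minimal example using IBM's open-source Qiskit library to build and simulate a two-qubit entangled circuit. It follows the Qiskit API roughly as it stood in 2019 (later releases changed the execution interface), and the local simulator stands in for a real cloud-hosted device:

```python
from qiskit import QuantumCircuit, Aer, execute

# Build a two-qubit Bell-state circuit: Hadamard, then CNOT, then measure.
qc = QuantumCircuit(2, 2)
qc.h(0)          # put qubit 0 into an equal superposition
qc.cx(0, 1)      # entangle qubit 1 with qubit 0
qc.measure([0, 1], [0, 1])

# Run on the local simulator; a real IBM Q backend is addressed the same way.
backend = Aer.get_backend("qasm_simulator")
counts = execute(qc, backend, shots=1024).result().get_counts()
print(counts)    # roughly half '00' and half '11': the qubits are correlated
```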

But these are not yet powerful enough to break RSA 2048-bit keys, and probably won't be for another 10 to 20 years.

The comparison date of 1970 works from another angle, too. In October 1969, researchers sent the first message over the internet (it was called ARPANET then). When they tried to send the one word "login", the system crashed after sending "l" and "o". It later recovered and the message was successfully sent.

Today, we are also building a quantum communication system that doesn't communicate bits and bytes, but quantum states that quantum computers can understand. This is important so that we can build up a quantum version of the internet.

D-Wave, NASA, Google and the Universities Space Research Association created the D-Wave 1,097-qubit quantum computer.

Image: Reuters/Stephen Lam

It is also important as a way of encrypting communication, since the quantum channel provides some inherent physical guarantees about a transmission. Without going into too much detail, there is a fundamental property whereby the simple act of wiretapping or listening into a communication will be made detectable to the parties communicating. Not because they have a fancy system setup, but because of fundamental properties of the quantum channel.
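A minimal simulation sketch of that detectability property, loosely following the well-known BB84 quantum key distribution protocol (the article does not name a specific protocol, and all names here are illustrative): an eavesdropper who measures qubits in transit unavoidably disturbs some of them, so the legitimate parties see an elevated error rate when they compare notes.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000

# Alice encodes random bits in randomly chosen bases (0 = rectilinear, 1 = diagonal).
alice_bits = rng.integers(0, 2, n)
alice_bases = rng.integers(0, 2, n)

def measure(bits, encode_bases, measure_bases):
    """Measuring in the wrong basis yields a random bit; in the right basis, the true bit."""
    random_bits = rng.integers(0, 2, len(bits))
    return np.where(encode_bases == measure_bases, bits, random_bits)

# An eavesdropper intercepts, measures in random bases, and re-sends what she saw.
eve_bases = rng.integers(0, 2, n)
eve_bits = measure(alice_bits, alice_bases, eve_bases)

# Bob measures the (now disturbed) qubits in his own random bases.
bob_bases = rng.integers(0, 2, n)
bob_bits = measure(eve_bits, eve_bases, bob_bases)

# Alice and Bob keep positions where their bases matched, then compare a sample.
matched = alice_bases == bob_bases
error_rate = np.mean(alice_bits[matched] != bob_bits[matched])
print(f"error rate on matched bases: {error_rate:.1%}")  # ~25% with Eve, ~0% without
```

Without the interception step, the error rate on matched bases is zero; with it, roughly a quarter of the compared bits disagree, which is the statistical alarm bell the text describes.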

But quantum computers are not just useful for cryptography applications and communication. One of the most immediate applications is in machine learning, where we are already on the cusp of a quantum advantage, meaning that the quantum algorithm will outperform any classical algorithm. It is believed that quantum advantage for machine learning can be achieved within the next 6-12 months. The near-term applications for quantum computing are endless: cryptography, machine learning, chemistry, optimization, communication and many more. And this is just the start, with research increasingly extending to other areas.

Google and NASA have just announced that they have achieved 'quantum supremacy'. That is the ability of quantum computers to perform certain tasks that a classical computer simply cannot do in a reasonable timeframe. Their quantum computer solved a problem in 200 seconds that would take the world's fastest supercomputer 10,000 years.

The problem that was solved is without any practical merits or implications, yet it demonstrates the huge potential quantum computers have and the ability to unlock that potential in the coming years.

This opens up a completely new era in which we can now focus on building quantum computers with practical benefits, and while this will still be many years away, it will be the new frontier in computation.


Written by

Andreas Baumhof, Vice President Quantum Technologies, QuintessenceLabs

The views expressed in this article are those of the author alone and not the World Economic Forum.


Opinion | Quantum supremacy and the cat that's neither alive nor dead – Livemint

There is this joke about a cat that belonged to a gentleman called Schrödinger: "Schrödinger's cat walks into a bar. And doesn't."

If you chuckled, you must have been a student of quantum physics. Austrian physicist Erwin Schrödinger's cat paradox explains the seeming contradiction between what we see with our naked eye and what quantum theory says actually is in its microscopic state. He used it to argue against something called the "Copenhagen Interpretation" of quantum mechanics, which states that a particle "exists in all states at once until observed". Schrödinger's cat is in a box and could be alive or dead. But, till the box is opened, you won't know its state. This would mean that the cat could be both alive and dead at the same time.

Now, hold that thought while we leap from cats to computers. The ones that we use now follow the principles of a Turing machine. Here, information is encoded into bits (either 1s or 0s) and one can apply a series of operations (and, or, not) to those bits to perform any computation. A quantum computer is different: it uses qubits, the quantum analogue of bits. Now, jump back to the cat. Much like the feline in Schrödinger's box, a qubit is not always 0 or 1, but can be both at the same time. Only at the end of the computation, or when the box is opened, would you know which; during the computation process, its exact state is indeterminate.

If this leaves you scratching your head, do not fret. In a 2017 Wall Street Journal interview, here is what Bill Gates said: "I know a lot of physics and a lot of math. But the one place where they put up slides and it is hieroglyphics, it's quantum." Even Einstein had some difficulty grasping the concept and famously dismissed it with, "God does not play dice with the universe."

What makes a quantum computer exciting is its ability to exploit these properties of quantum physics to perform certain calculations far more efficiently and faster than any supercomputer. Thus, megacorps such as Microsoft, IBM, and Google have been working on quantum computers. Last week, Google claimed to have achieved quantum supremacy, or the point when such a computer can perform a calculation that a traditional one cannot complete within its lifetime. Google's quantum computer took 200 seconds for a calculation that would take a supercomputer 10,000 years.

While all this is impressive, what does it mean for us? It's hard to fully answer this, as we are venturing into an entirely new area, and the future will reveal applications we have not even imagined yet. It's a bit like classical computing: we did not know how it would totally revolutionize our world. In the same manner, quantum computing could be a game-changer for many industries.

Take big data and analytics. We produce 3 exabits of data every day, equivalent to 300,000 Libraries of Congress. Classical computers are reaching their limits of processing power. However, with exponentially more powerful quantum computers, we could spot unseen patterns in large data sets, integrate data from different data sets, and tackle the whole problem at once. This would be rocket fuel for artificial intelligence (AI), with quantum computing offering quick feedback and collapsing the learning curve of machines. This would make AI more intuitive, expand its reach into various industries and help build artificial general intelligence.

Online security will be impacted, with our current data encryption strategies wilting under the assault of quantum power. On the other hand, there will be formidable new cryptographic methods like quantum key distribution, where even if the message gets intercepted, no one can read it (the cat, again). On a side note, the security of every public blockchain will be under threat from quantum hacks. It was no coincidence that Bitcoin's price slumped the day Google announced its breakthrough. Quantum computing could speed up drug development by reviewing multiple molecules simultaneously and quickly sequencing individual DNAs for personalized drugs. Another application lies in weather forecasting and, more importantly, climate-change predictions. It will require the tremendous power of quantum computing to create complex, ever-changing weather models to properly predict and respond to the climate cataclysm that awaits us.

It's a brave new world of quantum computing we're entering, and we will discover its possibilities as we go along. If you feel you've got it but are still confused, that's okay; it is the nature of this beast. Just step out of the box.

Jaspreet Bindra is a digital transformation and technology expert, and the author of the book The Tech Whisperer


Other voices: Welcome to the age of Quantum computing – St. Paul Pioneer Press

Has the era of quantum computing finally dawned? In a field long plagued by hype and hubris, there's reason for some cautious optimism.

A team of scientists at Google's research lab announced last week in the journal Nature that they had built a quantum computer that could perform calculations in about 200 seconds that would take a classical supercomputer some 10,000 years to do. An age of quantum supremacy was duly declared.

Rather uncharitably, IBM researchers were quick to point out that the feat was less than advertised. They estimated that by using all of the hard disk space at the world's most powerful classical computer, the Summit OLCF-4 at Oak Ridge National Laboratory, they could do the same calculation in 2.5 days, not 10,000 years. Google's claim to have achieved quantum supremacy, that is, to have accomplished a task that traditional computers can't, was premature.

This was to miss the bigger picture: a rudimentary quantum machine has improved on the fastest supercomputer ever built by a factor of 1,080 (IBM's 2.5-day estimate against Google's 200 seconds), an immense achievement by any measure. Although the specific problem that Google's computer solved won't have much practical significance, simply getting the technology to work was a triumph; comparisons to the Wright brothers' early flights aren't far off the mark.

So is the world prepared for what comes next?

Quantum computers, to put it mildly, defy human intuition. They take advantage of the strange ways that matter behaves at the subatomic level to make calculations at extraordinary speed. In theory, they could one day lead to substantial advances in materials science, artificial intelligence, medicine, finance, communications, logistics and more. In all likelihood, no one has thought up the best uses for them yet.

They also pose some risks worth paying attention to. One is that the global race to master quantum computing is heating up, with unpredictable consequences. Last year, President Donald Trump's administration signed a $1.1 billion bill to prioritize the technology, which is a decent start. But the U.S. will need to do more to retain its global leadership. Congress should fund basic research at labs and universities, ensure the U.S. welcomes immigrants with relevant skills, invest in cutting-edge infrastructure, and use the government's vast leverage as a consumer to support promising quantum technologies.

A more distant worry is that advanced quantum computers could one day threaten the public-key cryptography that protects information across the digital world. Those systems are based on hard math problems that quantum computers might theoretically be able to crack with ease. Security researchers are well aware of the problem, and are at work on creating post-quantum systems and standards. But vigilance and serious investment are nonetheless called for.

No doubt, the quantum-computing era will have its share of false starts, dashed hopes and fiendishly difficult problems to overcome. As Google is showing, though, that's how technology advances: bit by bit, into a very strange future.

Bloomberg Opinion


Volkswagen: optimizing traffic flow with quantum computers – Quantaneo, the Quantum Computing Source

Volkswagen is launching in Lisbon the world's first pilot project for traffic optimization using a quantum computer. For this purpose, the Group is equipping MAN buses of the city of Lisbon with a traffic management system developed in-house. This system uses a D-Wave quantum computer and calculates the fastest route for each of the nine participating buses individually and almost in real-time. This way, passengers' travel times will be significantly reduced, even during peak traffic periods, and traffic flow will be improved. Volkswagen is testing its traffic optimization system during the WebSummit technology conference in Lisbon from November 4 to 8 - during the conference, buses will carry thousands of passengers through the city traffic in Lisbon.

Martin Hofmann, Volkswagen Group CIO, says: 'At Volkswagen, we want to further expand our expert knowledge in the field of quantum computing and to develop an in-depth understanding of the way this technology can be put to meaningful use within the company. Traffic optimization is one of the potential applications. Smart traffic management based on the performance capabilities of a quantum computer can provide effective support for cities and commuters.'

Vern Brownell, CEO of D-Wave, says: 'Volkswagen's use of quantum computing to tackle pervasive global problems like smart traffic management is an example of the real-world impact quantum applications will soon have on our cities, communities, and everyday lives. Since we built the first commercial quantum computer, D-Wave has been focused on designing systems that enable quantum application development and deliver business value. Volkswagen's pilot project is among the first that we know of to make production use of a quantum computer, and their ongoing innovation brings us closer than ever to realizing true, practical quantum computing.'

System includes two components: passenger number prediction and route optimization

The Volkswagen traffic management system includes two components - passenger number prediction and route optimization by quantum computing. For predictions, the development team from Volkswagen is using data analytics tools to identify stops with especially high passenger numbers at certain times. For this purpose, anonymized geo-coordinates and passenger flow data are used. The objective is to offer as many people as possible tailor-made transport possibilities and to ensure optimum utilization of the bus fleet.

For the pilot project in Lisbon, 26 stops were selected and connected to form four bus links. For example, one of these runs from the WebSummit conference facility to the Marquês de Pombal traffic node in the city center.

The Volkswagen team intends to continue the development of this prediction component. The idea is that bus operators should add temporary links to their scheduled services to serve stops with the largest passenger numbers. This would be a meaningful approach for major events in the city area, for example.

The Volkswagen experts have developed a quantum algorithm for route optimization between the stops. This algorithm calculates the fastest route for each individual bus in the fleet and optimizes it almost on a real-time basis. In contrast to conventional navigation services, the quantum algorithm assigns each bus an individual route. This way, each bus can drive around traffic bottlenecks along the route at an early stage and avoid traffic jams before they even arise.
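As a hedged sketch of the underlying idea (not Volkswagen's actual formulation): route-assignment problems of this shape are typically cast as quadratic binary optimization, the form D-Wave's annealers accept. The toy below, with invented data, assigns each of three buses one of two candidate routes so that shared road segments are penalized quadratically, and brute-forces the tiny search space where an annealer would instead sample low-energy assignments.

```python
import itertools

# Candidate routes per bus, as sets of road-segment ids (toy data).
routes = {
    "bus0": [{1, 2, 3}, {1, 4, 5}],
    "bus1": [{2, 3, 6}, {4, 6, 7}],
    "bus2": [{3, 5, 8}, {7, 8, 9}],
}

buses = list(routes)

def congestion(assignment):
    """Cost = sum over segments of (number of buses on that segment) squared.
    Squaring penalizes overlap, which the quadratic terms of a QUBO encode."""
    load = {}
    for bus, choice in zip(buses, assignment):
        for seg in routes[bus][choice]:
            load[seg] = load.get(seg, 0) + 1
    return sum(v * v for v in load.values())

# Brute-force search over all binary assignments (2^3 here); a quantum annealer
# would instead sample low-energy assignments of the equivalent QUBO.
best = min(itertools.product([0, 1], repeat=len(buses)), key=congestion)
print(f"best route choices: {dict(zip(buses, best))}, cost={congestion(best)}")
```

Spreading the buses across disjoint segments is exactly the "never cause congestion themselves" behaviour described in the next paragraph, but at city scale the search space is far too large to brute-force, which is where the annealer comes in.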

The experts from Volkswagen expect this development to have a further positive effect. As the buses travel along individually optimized routes which are calculated to ensure that they can never cause congestion themselves, there will be a general improvement in traffic flow within the city.

Volkswagen intends to develop the system to market maturity

In the future, Volkswagen plans to develop its traffic optimization system to market maturity. For this reason, the Volkswagen developers have designed the system so that it can generally be applied to any city and to vehicle fleets of any size. Further pilot projects for cities in Germany and other European countries are already being considered. Volkswagen believes that such a traffic optimization system could be offered to public transport companies, taxi companies or fleet operators.

Volkswagen and quantum computing

Volkswagen is cooperating with its technology partners D-Wave and Google, who provide the experts with access to their computer systems. In 2016, the Volkswagen team already successfully demonstrated congestion-free route optimization for taxis in the Chinese capital Beijing. Since then, the development of the algorithm has been steadily continued and it has been protected by patents in the USA.


IBM picked a fight with Google over its claims of ‘quantum supremacy.’ Here’s why experts say the feud could shake up the tech industry’s balance of…

Most people probably couldn't tell you what quantum computing is. And, as we learned last week from an unusual public spat between tech companies, it turns out that the top quantum-computing engineers aren't so sure either.

It all started when Google researchers published a paper in the journal Nature declaring that they achieved "quantum supremacy" a breakthrough in computing speed so radical that, to use a fictional analogy, it might be akin to attaining hyperspace travel speed.

But before the champagne had even been poured, IBM was disputing Google's claims with a blog post, insisting that, technically, "quantum supremacy" hadn't really been reached.

Quantum computers have special properties that allow them to solve problems exponentially faster than even the most powerful computers today. Google researchers said their quantum computer solved a problem in 200 seconds that would take a powerful supercomputer 10,000 years to solve, a potential game changer for fighting climate change, discovering drugs, predicting the stock market, and cracking the toughest encryption.

Quantum computing is still in its infant stages, and you won't find it in your office anytime soon, but investors and researchers see huge potential in it. Already, companies like Google, IBM, Microsoft, and Intel are racing to build quantum computers, while venture capitalists are pouring money into startups like IonQ, Rigetti Computing, Aliro, and D-Wave.

The feud between IBM and Google is in many ways academic. But it also highlights the prominence and importance within the industry of a technology considered science fiction just a decade ago. As computing technology gets pushed to its limits, new technology like quantum computing has the potential to open entirely new markets and shake up the balance of powers in the tech industry.

And while Google and IBM are taking different approaches to quantum, the rival claims underscore the seriousness with which each company views the technology.

"Google is doing things as a research project," Brian Hopkins, the vice president and principal analyst at Forrester, told Business Insider. "IBM has a commercial strategy, pouring money in to get money out. They want to get to a point where quantum computers are powerful enough so people are willing to pay money to solve problems."

At the same time, rivals like Microsoft, Intel, and quantum-computing startups are lauding Google's experiment and see it as a good sign for quantum computing.

Jim Clarke, Intel's director of quantum hardware, with one of the company's quantum processors. Intel

"We're beginning to have a discussion that a quantum computer can do something that a supercomputer does not," Jim Clarke, the director of quantum hardware at Intel, told Business Insider. "It motivates us that we're on the right path. There's still a long way to go to get to a useful quantum computer. I think this is a positive step along the way."

Computer experts told Business Insider it would take time to prove whether Google did, in fact, reach this benchmark and whether IBM's disputes were correct.

IBM, which built Summit, the most powerful supercomputer, said the experiment could be run by a supercomputer in 2 1/2 days, as opposed to the 10,000 years Google said would be required with a traditional computing technology.

In other words, even though Google's quantum computer is faster, if it were true that the supercomputer could run that same problem in 2 1/2 days, it would not be that large of a difference. Running a problem that takes 10,000 years to solve is impractical, but if it took 2 1/2 days to solve, it would not be that big of a deal.

"The conflict between Google and IBM highlights that there's some ambiguity in the definition of quantum supremacy," Bill Fefferman, an assistant professor of computer science at the University of Chicago, told Business Insider.

Still, Google's work shows the progress of quantum computing, and people shouldn't lose sight of that, despite the arguments about it, Martin Reynolds, the distinguished vice president at Gartner, said.

That being said, since quantum computing is still in its early days, Google's milestone is "a bit like being the record holder in the 3-yard sprint," Reynolds said.

Fefferman added that the "jury is still out" on whether Google has actually reached quantum supremacy, but not because of anything IBM said.

"While it's not completely clear to me that there's currently enough evidence to conclude that we've reached quantum supremacy, Google is certainly breaking new ground and going places people have not gone before," Fefferman said.

And though Google's experiment is a "major scientific breakthrough," it has little influence on commercial users today, Matthew Brisse, the research vice president at Gartner, said.

"It demonstrates progress in the quantum community, but from an end-user perspective, it doesn't change anyone's plans or anyone's project initiatives because we're still many years away," Brisse told Business Insider. "We're literally five to 10 years away from using this in a commercial production environment."

In general, IBM and Google's competitors told Business Insider they saw the experiment as a step forward.

"This is an exciting scientific achievement for the quantum industry and another step on a long journey towards a scalable, viable quantum future," a Microsoft spokesperson said in a statement.

Rigetti Computing CEO Chad Rigetti. YouTube/Y Combinator

Chad Rigetti, the founder and CEO of the startup Rigetti Computing, called Google's experiment a "remarkable achievement" that should give researchers, policymakers, investors, and other users more confidence in quantum computing.

He added that IBM's claims haven't been tested on actual hardware yet, and that even if they were borne out, the simulation would still be slower and more expensive to run than on Google's quantum computer.

"The Google experiment is a landmark scientific achievement and the most important milestone to date in quantum computing," Rigetti told Business Insider. "It shows that real commercial applications are now within sight for superconducting qubit systems."

Clarke, of Intel, agreed that it was a positive for the quantum community overall, though he said that calling it "quantum supremacy" might be debatable. Clarke also said that it could show that quantum computers could be more efficient, as he suspects that Google's quantum computer uses much less power than running a Summit supercomputer for over two days.

"What's been interesting to me is seeing some of the negative reactions to this announcement," Clarke told Business Insider. "If you're in the quantum community, any good experiment that suggests there's a long future in quantum computing should be appreciated. I haven't quite understood some of the negative response at this point."

What happens next is that other scientists will review the paper, work to prove or disprove it, and debate whether quantum supremacy has been reached. Ines Montano, an associate professor of applied physics at Northern Arizona University, said IBM would likely work to prove that its supercomputer could run that experiment in a shorter time frame.

"IBM will have to figure out something to put some data to their claim," Montano told Business Insider. "That will be a very public discussion for a while. In the meantime, there's the quest is to find problems that may be more applicable to current things ... We're not as far away as we were thinking 10 years ago."

This will likely take some time, as quantum supremacy is difficult to prove. Quantum computing remains in its early stages, experts say, and while they expect more advancements in the coming years, they predict the industry is still at least 10 years away from useful quantum computers.

"Google's managed to find a complex problem that they can solve on this system," Reynolds told Business Insider. "It isn't a useful solution, but it is a big step forwards. IBM offers a way to solve the problem with classical hardware in a couple of days. That's also impressive and shows the caliber of thinking that we find in these early quantum programs."


What Are the Biggest Challenges Technology Must Overcome in the Next 10 Years? – Gizmodo

Technology's fine (I definitely like texting, and some of the shows on Netflix are tolerable), but the field's got some serious kinks to work out. Some of these are hardware-related: when, for instance, will quantum computing become practical? Others are of more immediate concern. Is there some way to stop latently homicidal weirdos from getting radicalized online? Can social networks be tweaked in such a way as to not nearly guarantee the outbreak of the second Civil War? As AI advances and proliferates, how can we stop it from perpetuating, or worsening, injustice and discrimination?

For this week's Giz Asks, we've assembled a wide-ranging panel of futurists, engineers, anthropologists, and experts in privacy and AI to address these and many other hurdles.

Professor of Electrical Engineering and Computer Science and Director of the Computer Science and Artificial Intelligence Laboratory (CSAIL) at MIT

Here are some broad societal-impact challenges for AI. There are so many important and exciting challenges in front of us; I include a few I have been thinking about:

1) virtual 1-1 student-teacher ratios for all children: this will enable personalized education and growth for all children

2) individualized healthcare: this will deliver medical attention to patients that is customized to their own bodies

3) reversing climate change: this will take us beyond mapping climate change into identifying ways to repair the damage; one example is to reverse-engineer photosynthesis and incorporate such processes into smart cities to ameliorate pollution

4) interspecies communication: this will enable us to understand and communicate with members of other species, for example to understand what whales are communicating through their song

5) intelligent clothing that will monitor our bodies (1) to ensure we live well and (2) to detect the emergence of a disease before it manifests

And here are some technical challenges:

1) interpretability and explainability of machine learning systems

2) robustness of machine learning systems

3) learning from small data

4) symbolic decision making with provable guarantees

5) generalizability

6) machine learning with provable guarantees

7) unsupervised machine learning

8) new models of machine learning that are closer to nature

Anthropologist and Research Director at the Centre National de la Recherche Scientifique, Institut Jean Nicod, Paris; Co-Founder of the Centre for the Resolution of Intractable Conflict, University of Oxford, and author of Talking to the Enemy: Faith, Brotherhood and the (Un)Making of Terrorists

How can we tell the difference between real and fake, and between good and harmful, so that we can prevent harmful fake (malign) activity and promote what is real and good?

Malign social media ecologies (hate speech, disinformation, polarizing and radicalizing campaigns, etc.) have both bottom-up and top-down aspects, each of which is difficult to deal with, but which together stump most counter-efforts. These problems are severely compounded by exploitation of cognitive biases (e.g., the tendency to believe messages that conform to one's prior beliefs and to disbelieve messages that don't), and also by exploitation of cultural belief systems (e.g., gaining trust, as in the West, based on accuracy, objectivity, validation and competence vs. gaining trust, as in most of the rest of the world, based on respect, recognition, honor, and dignity) and preferences (e.g., values associated with family, communitarian, nationalist, traditional mores vs. universal, multicultural, consensual, progressive values).

Malign campaigns exploit psychological biases and political vulnerabilities in the socio-cultural landscape of nations, and among transnational and substate actors, which has already led to new ways of resisting, reinforcing and remaking political authority and alliances. Such campaigns can also be powerful force multipliers for kinetic warfare and can affect economies. Although pioneered by state actors, disinformation tools are now readily available to anyone or any group with internet access to deploy at low cost. This democratization of influence operations, coupled with democracies' vulnerabilities owing to political tolerance and free speech, requires our societies to create new forms of resilience as well as deterrence. Compounding matters, a significant portion of malign campaigns involve self-organizing, bottom-up phenomena that self-repair. Policing and banning on any single platform (Twitter, Facebook, Instagram, VKontakte, etc.) can be downright counterproductive, with users moving to back doors even after being banned, jumping between countries, continents and languages, and eventually producing global dark pools in which illicit and malign online behaviors will flourish.

Because large clusters that carry hate speech or disinformation arise from small, organic clusters, large clusters can in principle be reduced by first banning small clusters. In addition, random banning of a small fraction of the entire user population (say, 10 percent) would serve the dual role of lowering the risk both of banning many users from the same cluster and of inciting a large crowd. But if States and criminal organizations with deep offline presence can indeed create small clusters almost at will, then the problem becomes not one of simply banning small clusters or a small fraction of randomly chosen individuals. Rather, the key is identifying small clusters that initiate a viral cascade propagating hate or malign influence. Information cascades follow a heavy-tailed distribution: large-scale cascades are relatively rare (only 2 percent exceed 100 re-shares), and 50 percent of the shares in a cascade occur within an hour. The problem, then, is to find an appropriate strategy to identify an incipient malign viral cascade and apply countermeasures well within that first hour.
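To make that one-hour window concrete, here is a minimal Python sketch of first-hour cascade triage; the data model and the threshold of 50 first-hour shares are invented for illustration, not taken from the research described above:

```python
# Minimal sketch of first-hour cascade triage (hypothetical data model).
from dataclasses import dataclass

@dataclass
class Cascade:
    cascade_id: str
    share_times: list  # seconds since the original post

def flag_incipient(cascades, first_hour_threshold=50):
    """Flag cascades whose first-hour share count suggests a viral trajectory.

    If roughly half of all shares in a cascade land within the first hour,
    early volume is a crude but usable predictor of final size.
    """
    flagged = []
    for c in cascades:
        first_hour = sum(1 for t in c.share_times if t <= 3600)
        if first_hour >= first_hour_threshold:
            flagged.append((c.cascade_id, first_hour))
    return flagged

cascades = [
    Cascade("benign", list(range(0, 3600 * 24, 1800))),  # slow trickle: 2/hour
    Cascade("viral", list(range(0, 3600, 30))),          # 120 shares in hour one
]
print(flag_incipient(cascades))  # [('viral', 120)]
```

In practice the hard part is not the counting but attributing shares to a single cascade in real time; the sketch assumes that attribution has already been done.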

There is also a layering strategy evident in State-sponsored and criminally organized illicit online networks. Layering is a technique whereby links to disinformation sources are embedded in popular blogs, forums and websites of activists (e.g., environment, guns, healthcare, immigration, etc.) and enthusiasts (e.g., automobiles, music, sports, food and drink, etc.). These layering-networks, masquerading as alternative news and media sources, regularly seek bitcoin donations. Their blockchains show contributions made by anonymous donors in orders of tens of thousands of dollars at a time, and hundreds of thousands of dollars over time. We find that these layering-networks often form clusters linking to the same Google Ad accounts, earning advertising dollars for their owners and operators. Social media and advertising companies often have difficulty identifying account owners linked with illicit and malign activity, in part because the accounts often appear to be organic and regularly pass messages containing a kernel of truth. How, then, to flag layering-networks (Breitbart, One America News Network, etc.), symbols (logos, flags), faces (politicians, leaders), suspicious objects (weapons), hate speech and anti-democracy framing as suspicious?

Finally, knowledge of psychology and cultural belief systems is needed to train the data that technology uses to mine, monitor, and manipulate information. Overcoming malign social media campaigns ultimately relies on human appraisal of strategic aspects, such as the importance of core values and the stakes at play (political, social, economic), and the relative strengths of players in those stakes. The critical role of social science goes beyond the expertise of the engineers, analysts, and data scientists that platforms like Twitter, Instagram, and Facebook use to moderate propaganda, disinformation, and hateful content.

Yet an acute problem concerns overwhelming evidence from cognitive and social psychology and anthropology that truth and evidence, no matter how logically consistent or factually correct, do not sway public opinion or popular allegiance as much as appeals to basic cognitive biases that confirm deep beliefs and core cultural values. Indeed, many so-called biases used in argument do not reflect sub-optimal or deficient reasoning but rather suggest their efficient (even optimal) use for persuasion, an evolutionarily privileged form of reasoning that socially recruits others to one's circle of beliefs for cooperation and mutual defense. Thus, to combat false or faulty reasoning, as in noxious messaging, it is not enough to target an argument's empirical and logical deficiencies with a counterargument's logical and empirical coherence. Moreover, recent evidence suggests that warning about misinformation has little effect (e.g., despite advance warning, "yes" voters are more likely than "no" voters to remember a fabricated scandal about a "vote no" campaign, and "no" voters are more likely to remember a fabricated scandal about a "vote yes" campaign). Evidence is also mounting that value-driven, morally focused information in general, and social media in particular, not only drives readiness to believe but also drives concerted action on behalf of beliefs.

One counter-strategy involves compromising one's own truth and honesty, and ultimately moral legitimacy, in a disinformation arms race. Another is to remain true to the democratic values upon which our society is based (in principle if not in practice), never denying or contradicting them, nor threatening to impose them on others.

But how can we consistently expose misleading, false, and malicious information while advancing truthful, evidence-based information that never contradicts our core values or threatens the core values of others (to the extent tolerable)? And how can we encourage people to exit echo chambers of the like-minded and engage in free and open public deliberation on ideas that challenge their preconceived or fed attitudes, so that a broader awareness of what is on offer, and a susceptibility to alternatives, may be gained, however strong one's initial preconceptions or fed history?

Professor, Mechanical Engineering, MIT, whose research focuses on quantum information and control theory

The two greatest technological challenges of our current time are

(a) good cellphone service, and

(b) a battery with the energy density of extra virgin olive oil

I need say no more about (a). For (b), I could have used diesel fuel instead of olive oil (they have similar energy densities), but I like the thought of giving my computer a squirt of extra virgin olive oil every time it runs out of juice.
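A quick calculation shows why (b) is such a tall order. The figures below are approximate literature values, not anything from the panelist:

```python
# Rough energy-density comparison (approximate literature values, MJ per kg).
MJ_PER_KG = {
    "diesel": 45.0,         # ~45 MJ/kg
    "olive_oil": 37.0,      # ~37 MJ/kg: the same order as diesel
    "li_ion_battery": 0.9,  # ~250 Wh/kg, a strong 2019-era pack
}

gap = MJ_PER_KG["olive_oil"] / MJ_PER_KG["li_ion_battery"]
print(f"Olive oil stores roughly {gap:.0f}x more energy per kg than a Li-ion cell.")
```

A factor of roughly 40 per kilogram is the scale of the challenge behind the joke.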

Since you are also interested in quantum computing, I'll comment there too.

Quantum computing is at a particularly exciting and maybe scary moment. If we can build large-scale quantum computers, they would be highly useful for a variety of problems, from code-breaking (Shor's algorithm), to drug discovery (quantum simulation), to machine learning (quantum computers could find patterns in data that can't be found by classical computers).

Over the past two decades, quantum computers have progressed from relatively feeble devices capable of performing a few hundred quantum logic operations on a few quantum bits, to devices with hundreds or thousands of qubits capable of performing thousands to tens of thousands of quantum ops.

That is, we are just at the stage where quantum computers may actually be able to do something useful. Will they do it? Or will the whole project fail?

The primary technological challenge over the next few years is to get complex superconducting quantum circuits, or extended quantum systems such as ion traps or quantum optical devices, to the point where they can be controlled precisely enough to perform computations that classical computers can't. Although there are technological challenges of fabrication and control involved, there are well-defined paths and strategies for overcoming those challenges. In the longer run, building scalable quantum computers will require devices with hundreds of thousands of physical qubits, capable of implementing quantum error-correcting codes.

Here the technological challenges are daunting, and in my opinion, we do not yet possess a clear path to overcoming them.
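Those qubit counts follow from the overhead of quantum error correction. Here is a rough illustration, assuming the commonly quoted surface-code rule of thumb of about 2*d^2 physical qubits per logical qubit at code distance d; real overheads depend on hardware error rates and the code chosen:

```python
# Rough surface-code overhead estimate (rule-of-thumb only: actual counts
# depend on physical error rates, the target logical error rate, and the code).
def physical_qubits(logical_qubits, code_distance):
    """Approximate physical-qubit count at ~2*d^2 per logical qubit."""
    return logical_qubits * 2 * code_distance ** 2

# A modest fault-tolerant machine: 100 logical qubits at code distance 27.
print(physical_qubits(100, 27))  # 145,800 physical qubits
```

Even a modest 100-logical-qubit machine lands in the hundreds of thousands of physical qubits, which is why the gap between today's devices and fault tolerance is so wide.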

Quantitative futurist, Founder of the Future Today Institute, Professor of Strategic Foresight at New York University Stern School of Business, and the author, most recently, of The Big Nine: How the Tech Titans and Their Thinking Could Warp Humanity

The short answer is this: We continue to create new technologies without actively planning for their downstream implications. Again and again, we prioritize short-term solutions that simply never address long-term risk. We are nowists. We're not engaged in strategic thinking about the future.

The best example of our collective nowist culture can be seen in the development of artificial intelligence. We've prioritized speed over safety, and short-term commercial gains over longer-term strategy. But we're not asking important questions, like: what happens to society when we transfer power to a system built by a small group of people that is designed to make decisions for everyone? The answer isn't as simple as it may seem, because we now rely on just a few companies to investigate, develop, produce, sell, and maintain the technology we use each and every day. There is tremendous pressure for these companies to build practical and commercial applications for AI as quickly as possible. Paradoxically, systems intended to augment our work and optimize our personal lives are learning to make decisions that we, ourselves, wouldn't. In other cases, like warehouses and logistics, AI systems are doing much of the cognitive work on their own and relegating the physical labor to human workers.

There are new regulatory frameworks for AI being developed by the governments of the US, Canada, the EU, Japan, China, and elsewhere. Agencies like the U.S.-based National Institute of Standards and Technology are working on technical standards for AI, but that isn't being done in concert with similar agencies in other countries. Meanwhile, China is forging ahead with various AI initiatives and partnerships that are linking emerging markets around the world into a formidable global network. Universities aren't making fast, meaningful changes to their curricula to address ethics, values and bias throughout all of the courses in their AI programs. Everyday people aren't developing the digital street smarts needed to confront this new era of technology. So they are tempted to download fun-looking but ultimately suspicious apps. They're unwittingly training machine learning systems. Too often, they are outright tricked into allowing others to access untold amounts of their social, location, financial, and biometric data.

This is a systemic problem, one that involves our governments, financiers, universities, tech companies and even you, dear Gizmodo readers. We must actively work to create better futures. That will only happen through meaningful collaboration and global coordination to shape AI in a way that benefits companies and shareholders but also prioritizes transparency, accountability and our personal data and privacy. The best way to engineer systemic change is to treat AI as a public good.

University Distinguished Professor, Chicago-Kent College of Law, Illinois Institute of Technology, whose work focuses on the impact of technologies on individuals, relationships, communities, and social institutions

Technologies from medicine to transportation to workplace tools are overwhelmingly designed by men and tested on men. Rather than being neutral, technologies developed with male-oriented specs can cause physical harm and financial risk to women. Pacemakers are unsuited to many women, since women's hearts beat faster than men's and that was not figured into the design. Because only male crash-test dummies were used in safety ratings until 2011, seat-belted women are 47% more likely to be seriously harmed in car accidents. When men and women visit help-wanted websites, the algorithms direct men to higher-paying jobs. Machine learning algorithms designed to screen resumes so that companies can hire people like their current top workers erroneously discriminate against women when those current workers are men.

Women's hormones differ from men's, causing some drugs to have enhanced effects in women and others diminished effects. Even though 80% of medications are prescribed to women, drug research is still predominantly conducted on men. Between 1997 and 2000, the FDA pulled ten prescription drugs from the market, eight of which were recalled because of the health risks they posed to women.

On the other hand, some treatments may be beneficial to women but never brought to market if the testing is done primarily on men. Let's say that a drug study enrolls 1,000 people, 100 of whom are women. What if it offers no benefit to the 900 men, but all 100 women are cured? The researchers will abandon the drug, judging it only 10% effective. If a follow-up study focused on women, it could lead to a new drug, to the benefit of women and the economy.
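The arithmetic of that hypothetical trial is worth spelling out, since it shows how a pooled analysis can erase a subgroup effect entirely:

```python
# The hypothetical trial from the paragraph above: pooled analysis vs. subgroups.
men_enrolled, men_cured = 900, 0
women_enrolled, women_cured = 100, 100

pooled = (men_cured + women_cured) / (men_enrolled + women_enrolled)
women_only = women_cured / women_enrolled

print(f"Pooled efficacy:      {pooled:.0%}")      # 10%: drug gets abandoned
print(f"Efficacy among women: {women_only:.0%}")  # 100%: effect missed entirely
```

The fix is procedural rather than computational: report efficacy by subgroup, and power the study so each subgroup is large enough to analyze.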

Workplace technologies also follow a male model. Female surgeons even in elite hospitals have to stack stools on top of one another to stand high enough to perform laparoscopic surgeries. Their lesser average hand strength forces them to use both hands to operate tools that male surgeons operate with one, leading female surgeons to have more back, neck and hand problems than men. Nonetheless, the patients of female surgeons do better than those of men. Imagine the health gain to patients (and their female surgeons) if technologies were designed to accommodate women as well as men.

Female fighter pilots wear g-suits designed in the 1960s to fit men. These too-large suits do not provide adequate protection against g-forces, which can cause a sudden loss of color vision or a full blackout as blood rushes from the brain. The zippers generally don't unzip far enough to comfortably fit the female bladder device, which leads some female pilots not to drink before missions, potentially causing blackouts from dehydration. Other military equipment poses safety and efficacy risks to women. Designing with women in mind, such as the current work on exoskeletons, can benefit both female and male soldiers by providing protection and increasing strength and endurance.

I'd like to see the equivalent of a Moon Shot, a focused technology research program, that tackles the issue of women and technology. Innovation for and by women can grow the economy and create better products for everyone.



Quantum Computing: The Why and How – insideHPC

In this video from the Argonne Training Program on Extreme-Scale Computing 2019, Jonathan Baker from the University of Chicago presents: Quantum Computing: The Why and How.

The Argonne Training Program on Extreme-Scale Computing (ATPESC) provides two weeks of intensive training on the key skills, approaches, and tools needed to design, implement, and execute computational science and engineering applications on current high-end computing systems and the leadership-class computing systems of the future. As a bridge to that future, this two-week program fills a gap in the training computational scientists typically receive through formal education or other, shorter courses. With around 70 participants accepted each year, admission to the ATPESC program is highly competitive. ATPESC is part of the Exascale Computing Project, a collaborative effort of the DOE Office of Science and the National Nuclear Security Administration.

Jonathan Baker is a second-year Ph.D. student at the University of Chicago advised by Fred Chong. He studies quantum architectures, specifically how to map quantum algorithms more efficiently to near-term devices. Additionally, he is interested in multivalued logic and in taking advantage of quantum computing's natural access to higher-order states, using these states to make computation more efficient. Prior to beginning his Ph.D., he studied at the University of Notre Dame, where he obtained a B.S. of Engineering in computer science and a B.S. in chemistry and mathematics.



Editorial: Quantum computing is a competition we can’t afford to lose – The Winchester Star

We Americans have a habit of bragging about our feats of technology. Our chief economic and military rivals, namely Russia and China, seldom do. They prefer to keep their secrets.

No one in this country is certain, then, how far the state-controlled economies of those nations have gone in developing quantum computing.

What is certain is that our national security, both militarily and economically, demands that the United States be first to perfect the technology. The reason for that was demonstrated in an announcement Wednesday by technology giant Google.

Google officials claim to have achieved a breakthrough in quantum computing. They say they have developed an experimental quantum computing processor capable of completing a complex mathematical calculation in less than four minutes.

Google says it would take the most advanced conventional supercomputer in existence about 10,000 years to do that.

Wrap your mind around that, if you can.

Other companies working with quantum computing, including IBM, Intel and Microsoft, say Google is exaggerating. IBM researchers told The Associated Press the test calculation used by Google actually could be handled by certain supercomputers in two and one-half days.

Still, you get the idea: Quantum computing will give the nation that gets there first, including its armed forces and industries, an enormous advantage over everyone else. The possibilities, ranging from near-perfect missile defense systems to vastly accelerated research on curing diseases, are virtually endless.

U.S. officials are cognizant of the ramifications of quantum computing, to the point that Washington has allocated $1.2 billion to support research during the next five years.

If that is not enough to ensure the United States stays in the lead in the quantum computing race, more should be provided. This is a competition we cannot afford to lose.


Quantum investment soars in the UK to more than £1bn – Management Today

What's very small but set to be very big? Quantum technology, according to the UK government, which took the decision in June to reinvest in a scheme designed to move the science beyond academia and research laboratories and into commercial and practical use.

Some £1bn has already been invested in the UK's National Quantum Technologies Programme, which was set up in 2013. The government recently announced a further £153m of funding through the Industrial Strategy Challenge Fund (which aims to ensure that 2.4 per cent of GDP is invested in R&D by 2027), plus £200m of investment from the private sector.

This means spending by industry is outstripping government investment for the first time, a good indication that the technology has stepped beyond an initial, broadly speculative stage. "Quantum is no longer an experimental science for the UK," says former science minister Chris Skidmore. "Investment by government and businesses is paying off as we become one of the world's leading nations for quantum science and technologies."

Whereas "classical" computers are based on a structure of binary choices yes or no; on or off quantum computing is a lot more complicated. Classical chips rely on whether or not an electron is conducted from one atom to another around a circuit, but super-cooled quantum chips allow us to interface with the world at a much deeper level, taking into account properties such as superposition, entanglement or interference.

Confused? Think of a simple coin toss. Rather than being able to simply call heads or tails, superposition allows us to take into account when a coin spins, while entanglement is whether its properties are intrinsically linked with those of another coin.
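For readers who prefer the coin analogy in symbols, here is a minimal state-vector sketch in Python with numpy; the states are standard textbook examples, not anything specific to the UK programme:

```python
import numpy as np

# Superposition: a qubit can be a weighted blend of |0> ("heads") and |1> ("tails").
plus = np.array([1, 1]) / np.sqrt(2)        # the equal superposition |+>
print(np.abs(plus) ** 2)                    # [0.5 0.5]: 50/50 odds on measurement

# Entanglement: two qubits whose measurement outcomes are intrinsically linked.
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)  # the Bell state (|00> + |11>)/sqrt(2)
print(np.abs(bell) ** 2)                    # [0.5 0 0 0.5]: the "coins" always agree
```

Measuring the Bell state yields 00 or 11 with equal probability, never 01 or 10: the two "coins" always land the same way, which is the linkage the entanglement analogy is pointing at.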

To help harness this new potential in different areas, the government's programme works across four hubs: sensing and timing; imaging; computing and simulation; and communications.

One of the key advances that quantum computing is expected to bring is not just substantially greater processing speed but the ability to mimic, and therefore understand and predict, the ways that nature works.

For example, this could allow us to look directly inside the human body, see through smoke or mist, develop new drugs much more quickly and reliably by reviewing the effect on many molecules at the same time, or even make our traffic run smoothly. Meanwhile, the Met Office has already invested in this technology to improve weather forecasting.

Image: IBM Q System One quantum computer, photo by Misha Friedman/Getty Images


What one member of Trump’s new science advisory council wants it to tackle – Science Magazine

Dario Gil with IBM's System One quantum computer

By Jeffrey Mervis, Nov. 1, 2019, 2:10 PM

The President's Council of Advisors on Science and Technology (PCAST) has yet to hold its first meeting, and the White House hasn't even announced its full 16-person roster. But one newly appointed member, Dario Gil, the director of IBM Research in Yorktown Heights, New York, already has a wish list of issues he'd like it to tackle.

His list includes promoting scientific inquiry and its value to policymakers, ensuring that researchers have the computational tools they need in an era of big data, retraining the U.S. workforce to be more technically literate, and updating the partnership between the federal government, academia, and industry spelled out by Vannevar Bush at the end of World War II. Gil also thinks the government must strike the right balance between protecting national security and fostering international scientific collaboration, using a scalpel instead of a blanket policy to monitor and prevent undue foreign influence on U.S. research.

"I am passionate about the need for continued investment in science," says Gil, 43, who joined IBM immediately after earning his Ph.D. in 2003 and has been rising quickly through its management ranks. "I want to be an advocate of its critical importance."

That full-throated endorsement of the U.S. research enterprise may hearten some scientists who feel President Donald Trump and his administration have been scornful of their contributions and the role of science in policymaking. Gil declined to characterize his political leanings and deflected a question about whether he agrees with that criticism.

Henry Smith, a professor emeritus of electrical engineering at the Massachusetts Institute of Technology in Cambridge and Gil's graduate adviser, thinks Trump has been a disaster for science, and that Gil shares those views. However, Smith says Gil also understands that tact is important in trying to influence government policy.

"I think that Dario will express his views as clearly and diplomatically as possible," says Smith, who has remained in touch with Gil. "He knows that if he goes in and says something too radical, it would be ignored. But that doesn't mean he's not going to stand up for what he believes."

PCAST will be chaired by Kelvin Droegemeier, the president's science adviser, who filled a 2-year vacancy when he became director of the White House Office of Science and Technology Policy in January. Gil spoke with ScienceInsider shortly after the White House announced PCAST's first cohort of seven scientists and industry leaders on 22 October. The interview has been edited for clarity and brevity.

Q: What has kept you at IBM?

A: I've been having too much fun. When I graduated from [Smith's] nanostructures laboratory, I joined a group that was a natural extension of what I had been doing, pushing the limits of nanofabrication. But then I got involved in modeling, and in high-performance computing, and in applying models more broadly across different industries. And then I became responsible for all the core scientific research at IBM in the physical sciences. Then I led the AI [artificial intelligence] organization, and then I got really involved in quantum computing.

Q: What are some key issues in applying quantum science to real-world problems?

A: People are fascinated by the word "quantum"; there's something about the word itself and the underlying physics that fascinates people. Beyond that, they ask about the implications, and whether it will replace personal computing.

And I say, no, it's not. It's bits, neurons, and qubits coming together. It's not about qubits replacing bits or taking over the world. So, let's not think about one replacing the other.

As for what it is good for, it will have a profound effect on how we discover new materials and how materials science is practiced. In agriculture it could lead to a new generation of fertilizers requiring totally different energy consumption to create, or to better batteries. And it has profound implications for encryption and cybersecurity, and for U.S. economic competitiveness. There are also implications for the workforce, and for what we need to do to train a new generation of scientists working in this environment.

Q: Are you worried that government policies to protect research in the name of national security could go too far and hamper progress and international collaboration?

A: I think that balance in that equation is indispensable. Science itself is a very open endeavor, and it's very important to maintain an open attitude toward how we do basic science.

Now, as we start adding the words "technology" and "products," it is reasonable to discuss, for each technology, the right balance among capability, national security, openness, and so on. That is not new. We've been doing that as a country for many decades. And AI and quantum will be no different. We're going to have to find that balance. And we need multiple voices to get that right.

Painting with a broad brush to include everything does not make for sophisticated policy. So, I am in favor of being more nuanced, and more precise, in each area, and of approaching it with a scalpel, not a blanket policy.

Q: Does the government need to do more to strengthen science, technology, engineering, and math education and improve diversity? And what have you seen that works?

A: There are two categories that we need to pay more attention to. One is formal education, and the other is once people join the workforce.

If you look at formal education, there have been some trends that have been really disturbing. For example, if you look at the percentage of women studying computer science, we are worse off than we were 35 years ago. So that is really sad, and we have to be able to reverse that trend.

And then once you've finished with formal education and you enter the workforce, there's very little continuing education to cope with technological shifts. We need to pay more attention to the mechanisms for investing in those new technological skills. How are we going to do it, what are the incentives, and what is the role of the private sector and academia?


Riding the Third Wave of AI without Quantum Computing – UC San Diego Health

Rapid changes are occurring in the field of artificial intelligence (AI) as many computer scientists explore new ways to make systems faster and more efficient. One anticipated capability is quantum computing: technology that follows the laws of quantum physics, enabling processing power to exist in multiple states and perform multiple tasks at the same time. If realized in hardware, it would speed up some computational problem-solving exponentially. UC San Diego theoretical physicist Max Di Ventra is catching this next wave of cutting-edge AI with an alternative and fundamentally different platform he calls memcomputing, which doesn't require quantum capabilities.

Sketch of a memcomputing architecture. Apart from the input/output and a control unit, which directs the machine on what problem to solve, all computation is done by a memory unit, a computational memory. From F.L. Traversa and M. Di Ventra, IEEE Trans. Neural Networks Learn. Sys. 26, 2702 (2015). © 2015 IEEE.

Using a physics-based approach, this novel computing paradigm "employs memory to both process and store information on the same physical location, a property that somewhat mimics the computational principles of the human brain," said the UC San Diego physics professor and author of The Scientific Method: Reflections from a Practitioner (Oxford University Press, 2018).

After years of trial and error, Di Ventra and his group developed all of the mathematics required for this new, simple architecture, which combines memory and compute and is driven by a specialized computational memory unit, with performance that resembles quantum computing, but without the overwhelming computational overhead. Now, with half a million dollars over 18 months from the Defense Advanced Research Projects Agency (DARPA), Di Ventra and his students are working to apply this new physics-based approach to AI.

"Our project, if successful, would have a large impact in the field of machine learning and artificial intelligence by showing that physics approaches can be of great help in fields of research that are traditionally dominated by computer scientists," said Di Ventra.

With the DARPA funds, the team will apply memcomputing to the unsupervised learning, or pre-training, of Deep Belief Networks. These are systems of multi-layer neural networks (NNs) used to recognize, generate and group data. Di Ventra will also propose a hardware architecture, using current technologies, to perform this task. Pre-training of NNs is a notoriously difficult problem, and researchers have all but abandoned it in favor of supervised learning. However, in order to have machines that adapt to external stimuli in real time and make decisions according to the context in which they operate, the goal of the third wave of AI, powerful new methods to train NNs in an unsupervised manner are required.

Demonstration that a memcomputing solver (named Falcon in the figure) outperforms, by orders of magnitude, state-of-the-art algorithms in solving difficult computational problems. From F. Sheldon, P. Cicotti, F.L. Traversa and M. Di Ventra, IEEE Trans. Neural Networks Learn. Sys. (2019). © 2019 IEEE.

Di Ventra explained that memcomputing accelerates the time to find feasible solutions to the most complex optimization problems in all industries.

"We have applied these emulations to a wide variety of difficult computational problems that are of interest to both academia and industry, and solved them orders of magnitude faster than traditional algorithms," noted Di Ventra.

Unlike quantum computing, memcomputing employs non-quantum units, so it can be realized in hardware with available technology and emulated in software on traditional computers. Current computing capabilities began with the work of Alan Turing, who helped decrypt German codes during WWII with his Bombe machine. He also developed the Turing Machine, which became the basis for modern computers. John von Neumann later devised the stored-program architecture used by modern computers, whereby the central processing unit (CPU) is separate from the memory unit. The so-called von Neumann bottleneck in today's computing arises precisely from that physical separation of the CPU and the memory unit: the CPU has to constantly insert and extract information from the memory, significantly slowing processing time.
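The bottleneck is easy to glimpse even from a high-level language. The following rough, machine-dependent sketch (not from Di Ventra's work) performs the same number of additions twice: once on data that must stream in from main memory, and once on data small enough to stay in cache:

```python
import time
import numpy as np

# Crude illustration of the von Neumann bottleneck: identical arithmetic,
# different data-movement costs.
big = np.random.rand(50_000_000)    # ~400 MB: must stream from DRAM
small = np.random.rand(50_000)      # ~400 KB: fits in on-chip cache

t0 = time.perf_counter()
big.sum()                           # one pass over 50M elements in DRAM
t1 = time.perf_counter()

t2 = time.perf_counter()
for _ in range(1000):               # the same 50M additions, cache-resident data
    small.sum()
t3 = time.perf_counter()

print(f"DRAM-bound sum:  {t1 - t0:.3f} s")
print(f"Cache-bound sum: {t3 - t2:.3f} s")  # usually the faster of the two
```

The exact ratio varies by machine, but the gap that typically appears is pure data movement, which is the cost memcomputing aims to eliminate by computing in the memory itself.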

"Memcomputing represents a radical departure from both our traditional computers, and the algorithms that run on them, and from quantum computers," said Di Ventra. "It provides the necessary tools for the realization of an adaptable computational platform deployable in the field of artificial intelligence, and offers strategic advantages to the Department of Defense in numerous applications."

In view of the preliminary successes of memcomputing, Di Ventra has co-founded the company MemComputing, Inc., which is developing software as a service, based on this technology, to solve the most challenging problems in academia and industry.



Airbus announces the names of the jury members for its Quantum Computing Challenge – Quantaneo, the Quantum Computing Source

A jury of world-leading quantum computing experts is teaming up with Airbus to evaluate submitted proposals to the Airbus Quantum Computing Challenge. Challenge winners will be announced at the beginning of 2020.

Four decades ago, quantum computing was a little-known, obscure theory relating to classical representations of computational memory. Today, it is a red-hot topic in the tech world, as major digital players such as Intel, Google, IBM and Microsoft invest massive sums to push the technology forward. At the same time, academic centres of excellence have been popping up worldwide, demonstrating how ideas, talent and investment have been flowing from multiple directions.

At Airbus, quantum computing has been identified as a potential game-changing future technology for aerospace. Launched in 2019, the Airbus Quantum Computing Challenge aims to challenge experts and enthusiasts in the field to tackle complex aerospace computational problems.

To help evaluate the submitted proposals, Airbus is bringing together top-notch international quantum computing experts to serve as jury members. The experts reflect a diverse array of academics and industry professionals, from computer scientists to founders of start-ups. Each expert has significant and deep experience in quantum computing, and a level of expertise that is recognised on an international scale. The jury is tasked with assessing the submitted proposals to identify the winners of the first edition of the challenge.

The deadline for submissions is October 31, 2019.

Meet the jury of quantum computing experts

Harry Buhrman, QuSoft / University of Amsterdam (Netherlands): Harry Buhrman is a computer scientist and professor of algorithms, complexity theory and quantum computing at the University of Amsterdam. He is group leader of the Quantum Computing Group at the Centre for Mathematics and Informatics (CWI), as well as co-founder and executive director of QuSoft, a research centre dedicated to quantum software.

Wim van Dam, QC Ware & University of California (Palo Alto, USA): Wim van Dam is a quantum computer scientist with expertise in developing and analysing quantum algorithms that significantly outperform classical algorithms. He is also a computer science and physics professor at the University of California, Santa Barbara. At QC Ware, he oversees the design and development of quantum algorithms for customer applications in optimisation and finance.

Joe Fitzsimons, Horizon Quantum Computing (Singapore): Joe Fitzsimons is the founder and CEO of Horizon Quantum Computing, a venture-backed start-up focused on the automatic synthesis of quantum algorithms. Prior to this role, he was a principal investigator at the Centre for Quantum Technologies at the National University of Singapore.

Elham Kashefi, Sorbonne University & University of Edinburgh (Paris, France / Edinburgh, Scotland): Elham Kashefi is a research director at Sorbonne University's CNRS LIP6 and a quantum computing professor at the University of Edinburgh's School of Informatics. She is also the associate director of the EPSRC Networked Quantum Information Technologies Hub and co-founder of the quantum tech start-up VeriQloud.

Iordanis Kerenidis, QC Ware / CNRS / Paris Centre for Quantum Computing (Palo Alto, USA / Paris, France): Iordanis Kerenidis is a research director at CNRS and the director of the Paris Centre for Quantum Computing. He focuses on designing quantum algorithms for machine learning and optimisation with provable speed-ups. At QC Ware, he oversees prototype development and algorithmic design for customers.

Michele Mosca, University of Waterloo (Canada): Michele Mosca is the co-founder of the University of Waterloo's Institute for Quantum Computing and a founding member of the Perimeter Institute for Theoretical Physics. He co-founded evolutionQ Inc., a start-up that supports organisations as they evolve their quantum-vulnerable systems to quantum-safe ones.

Troy Lee, University of Technology Sydney (Australia): Troy Lee is an associate professor at the University of Technology Sydney's Centre for Quantum Software and Information. His research focuses on quantum algorithms, the limitations of quantum computers and complexity theory.

Jingbo Wang, University of Western Australia: Jingbo Wang leads an active research group at the University of Western Australia (UWA) in the areas of quantum simulation, quantum walks and quantum algorithm development. At UWA, she is also the head of the Physics Department and chair of a cross-disciplinary research cluster named "Quantum information, simulation and algorithms."


IT sees the Emergence of Quantum Computing as a Looming Threat to Keeping Valuable Information Confidential – Quantaneo, the Quantum Computing Source

A new study from DigiCert, Inc., the world's leading provider of TLS/SSL, IoT and PKI solutions, reveals that 71 percent of global organizations see the emergence of quantum computers as a large threat to security. Most anticipate tangible quantum computer threats will begin arriving within three years. The survey was conducted by ReRez Research in August 2019, within 400 enterprise organizations in the U.S., Germany and Japan from across critical infrastructure industries.

Quantum Computing Threat is Real and Quickly Approaching

Quantum computing is on the minds of many and is impacting their current and future thinking. Slightly more than half (55 percent) of respondents say quantum computing is a "somewhat" to "extremely" large security threat today, with 71 percent saying it will be a "somewhat" to "extremely" large threat in the future. The median prediction for when post-quantum cryptography (PQC) would be required to combat the security threat posed by quantum computers was 2022, which means the time available to prepare for quantum threats is shorter than some analysts have predicted.

Top Challenges

With the threat so clearly felt, 83 percent of respondents say it is important for IT to learn about quantum-safe security practices. Following are the top three worries reported for implementing PQC:

1. High costs to battle and mitigate quantum threats
2. Data stolen today is safe if encrypted, but quantum attacks will make this data vulnerable in the future
3. Encryption on devices and applications embedded in products will be susceptible

95 percent of respondents reported they are discussing at least one tactic to prepare for quantum computing, but two in five see this as a difficult challenge. The top challenges reported include:

1. Cost
2. Lack of staff knowledge
3. Worries that TLS vendors won't have upgraded certificates in time

"It is encouraging to see that so many companies understand the risk and challenges that quantum computing poses to enterprise encryption," said Tim Hollebeek, Industry and Standards Technical Strategist at DigiCert. "With the excitement and potential of quantum technologies to impact our world, it's clear that security professionals are at least somewhat aware of the threats that quantum computers pose to encryption and security in the future. With so many engaged, but lacking good information about what to do and how to prepare, now is the time for companies to invest in strategies and solutions that will help them get ahead of the game and not get caught with their data exposed when the threats emerge."

Preparing for PQC

Enterprises are beginning to prepare for quantum computing, with a third reporting they have a PQC budget and another 56 percent working on establishing a PQC budget. In terms of specific activities, not surprisingly, "monitoring" was the top tactic currently employed by IT. Understanding their organization's level of crypto-agility came next. This reflects the understanding that when the time comes to make a switch to PQC certificates, enterprises need to be ready to make the switch quickly and efficiently.

Rounding out the top five current IT tactics were understanding the organization's current level of risk, building knowledge about PQC and developing TLS best practices.
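As a concrete illustration of what a first crypto-agility step can look like, the sketch below uses the open-source pyca/cryptography library to report which public-key algorithm a given certificate relies on. The "server.pem" file name is hypothetical, and DigiCert's report does not prescribe this code; it is simply one way to start the inventory the survey describes:

```python
from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import ec, rsa

def describe_certificate(pem_bytes):
    """Report the public-key algorithm behind one PEM-encoded certificate.

    Knowing which deployed keys are quantum-vulnerable is a prerequisite
    for the crypto-agility the survey recommends.
    """
    cert = x509.load_pem_x509_certificate(pem_bytes)
    key = cert.public_key()
    if isinstance(key, rsa.RSAPublicKey):
        kind = f"RSA-{key.key_size}"      # breakable by Shor's algorithm at scale
    elif isinstance(key, ec.EllipticCurvePublicKey):
        kind = f"EC-{key.curve.name}"     # likewise quantum-vulnerable
    else:
        kind = type(key).__name__
    return cert.subject.rfc4514_string(), kind, cert.not_valid_after

# "server.pem" is a placeholder; run this over every certificate you deploy.
with open("server.pem", "rb") as f:
    print(describe_certificate(f.read()))
```

Running such a scan across an estate turns "are we crypto-agile?" from an abstract question into a list of keys, owners and expiry dates that can actually be migrated.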

Recommendations

The DigiCert 2019 Post Quantum Crypto Survey points to three best practices for companies ready to start planning their strategies for securing their organizations for the quantum future:

1. Know your risk and establish a quantum crypto maturity model.
2. Understand the importance of crypto-agility in your organization and establish it as a core practice.
3. Work with leading vendors to establish digital certificate best practices and ensure they are tracking PQC industry progress to help you stay ahead of the curve, including with their products and solutions.

Change rarely happens quickly, so it's better not to wait but to address your crypto-agility now. For more information and to get the full report: https://www.digicert.com/resources/industry-report/2019-Post-Quantum-Crypto-Survey.pdf


