Microsoft CEO says Azure Quantum will address the big challenges in computing – GeekWire

Microsoft CEO Satya Nadella introduces the company's new initiatives in quantum computing at the Microsoft Ignite conference in Orlando, Fla. (Microsoft Video)

Microsoft CEO Satya Nadella today took the wraps off Azure Quantum, a full-stack, cloud-based approach to quantum computing that he said would play well with traditional computational architectures.

"With all the capacity we have around computing, we still have many unsolved problems, whether it's around food safety, or climate change, or the energy transition," Nadella said at the Microsoft Ignite conference in Orlando, Fla. "These are big challenges that need more computing. We need general-purpose quantum."

While classical computers deal in binary bits of ones and zeroes, quantum computers can take advantage of spooky physics to process quantum bits, or qubits, which can represent multiple values simultaneously.

For years, Microsoft and its rivals have been laying the groundwork for general-purpose quantum computing hardware and software. Microsoft has previously announced some elements of its strategy, including its Q# programming language and Quantum Development Kit, but today Nadella put all the pieces together.

A private preview of Azure Quantum is due to be launched in the coming months, with the Microsoft Quantum Network serving as the primary point of contact for developers and startups. "We'll have a variety of hardware solutions that are all going to be open in Azure," Nadella said.

Microsoft's hardware partners include IonQ and Honeywell, which are working on quantum computing systems based on trapped ions, as well as Quantum Circuits Inc., which uses Lego-like assemblies of superconducting circuits.

Nadella said Azure Quantum will offer a complete toolkit of open-source software, including Microsoft's Q# and QDK, as well as 1QBit's software platform and services.

End-to-end quantum computing based on Microsoft's topological qubit architecture may not yet be ready for prime time, but Nadella highlighted a "quantum on classical" approach, in which quantum tools are used alongside classical computation to optimize the algorithms for simulating complex phenomena.

"We've seen, in fact, many use cases already, across health care, across finance and the electrical grid as well," Nadella said.

He threw a video spotlight on a medical diagnostic technique called magnetic resonance fingerprinting, which is being pioneered by Case Western Reserve University and the Cleveland Clinic.

The technique uses quantum-inspired algorithms to optimize MRI scans based on the patient's precise position inside the scanner. Once the scan is done, the 3-D visualization can be viewed using Microsoft's HoloLens augmented-reality headset.

"Working with Azure has given us improvement in speed and about a 30% improvement in precision," Mark Griswold, a professor of radiology at CWRU, said in the video. "The results we're getting are allowing us to see diseases earlier than before, and to quantify the treatments that we're giving."

Microsoft isn't alone in efforts to explore the frontiers of quantum computing. D-Wave Systems, which is headquartered in Burnaby, B.C., has been developing a cloud-based service that takes advantage of a special-purpose optimization technology known as quantum annealing.

Meanwhile, IBM, Google and other heavyweights of the computer industry are neck-and-neck with Microsoft in the race to create general-purpose quantum devices. Just a couple of weeks ago, Google researchers and their partners published a research paper claiming that they had achieved quantum supremacy over classical computation for a specific algorithm that generates random numbers.

Quantum computers: why Google, NASA and others are putting their chips on these dream machines – World Economic Forum

In 1936, Alan Turing proposed the Turing machine, which became the foundational reference point for theories about computing and computers. Around the same time, Konrad Zuse invented the Z1 computer, considered to be the first mechanical binary computer.

What happened next is history, and in our world today, computers are everywhere. Our lives are dramatically different from how they were even at the end of the 20th century, and our mobile phones have far more powerful CPUs than desktop computers did only a few years ago. The advent of the Internet of Things brings computing power into every minute detail of our lives. The World Wide Web has had such a transformative effect on society that many people can't even remember a life before they were online.

The major catalyst behind this transformation was silicon, and its use in the production of good transistors. This occurred over a period of more than 100 years, dating from when Michael Faraday first recorded the semiconductor effect in 1833, via Morris Tanenbaum, who built the first silicon transistor at Bell Labs in 1954, to the first integrated circuit in 1960.

We are about to embark on a similar journey in our quest for building the next-generation computer. Quantum physics, which emerged in the early 20th century, is so powerful and yet so unlike anything known before that even the inventors had a hard time understanding it in detail.

In the early 1980s, Richard Feynman, Paul Benioff and Yuri Manin provided the groundwork for a completely new paradigm of quantum computing, introducing the idea that quantum computing had the potential to solve problems that classical computing could not. And so quantum computing came into its own.

In 1994, Peter Shor published an algorithm capable of efficiently solving cryptographic problems that are hard for classical computers (that is, the vast majority of computers used today) to solve. In fact, Shor's algorithm continues to threaten the foundations of most encryption deployed across the globe.
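To make that concrete, here is a toy sketch in Python of the classical skeleton of Shor's algorithm: factoring hinges on finding the period r of f(x) = a^x mod N, after which the factors fall out with a little number theory. The period-finding step is brute-forced below, which only works for tiny numbers; the whole point of a quantum computer is that it finds the period efficiently even for 2048-bit moduli. (A sketch for illustration, not a statement of how any particular quantum system implements it.)

```python
from math import gcd

def find_period(a, N):
    """Brute-force the period r of f(x) = a^x mod N.
    This is the step a quantum computer can do exponentially faster."""
    r, value = 1, a % N
    while value != 1:
        r += 1
        value = (value * a) % N
    return r

def shor_classical(N, a):
    """Classical post-processing from Shor's algorithm: given the
    period r of a^x mod N, recover nontrivial factors of N."""
    g = gcd(a, N)
    if g != 1:
        return g, N // g          # lucky guess: a shares a factor with N
    r = find_period(a, N)
    if r % 2 != 0:
        return None               # odd period: retry with another a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None               # trivial square root: retry
    return gcd(y - 1, N), gcd(y + 1, N)

print(shor_classical(15, 7))      # (3, 5): the period of 7^x mod 15 is 4
```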

The problem was that, in 1994, there was no quantum computer in sight. In 1997, the first tiny quantum computer was built, but the field really took off only when the Canadian startup D-Wave revealed its 28-qubit quantum computer in 2007.

Similar to the trajectory of non-quantum communication, which took more than 100 years from discovery to mass use, quantum computers are now maturing very quickly. Today, many players are engaged in a battle over who can build the first powerful quantum computer. These include commercial entities such as IonQ, Rigetti, IBM, Google, Alibaba, Microsoft and Intel, while virtually all major nation states are spending billions of dollars on quantum computing development and research.

Quantum computers are powerful yet so difficult to build that whoever can crack the code will have a lasting, powerful advantage. This cannot be overstated. Here's a striking example of the power of quantum computing.

Quantum leaps: growth over the years (Image: Statista)

To break a widely used RSA 2048-bit encryption, a classical computer with one trillion operations per second would need around 300 trillion years. This is such a long time that we all feel very safe.

A quantum computer using Shor's algorithm could achieve the same feat in just 10 seconds, with a modest 1 million operations per second. That's the power of quantum computers: 300 trillion years versus 10 seconds.
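Those two figures are easy to sanity-check against each other. A small sketch using only the numbers quoted above (the operation rates and times are the author's, not independently derived):

```python
SECONDS_PER_YEAR = 365 * 24 * 3600          # ~3.15e7

# Classical scenario: 1e12 ops/sec sustained for 300 trillion years
classical_ops = 1e12 * 300e12 * SECONDS_PER_YEAR

# Quantum scenario: 1e6 ops/sec for 10 seconds
quantum_ops = 1e6 * 10

print(f"classical operations implied: {classical_ops:.1e}")  # ~9.5e33
print(f"quantum operations implied:   {quantum_ops:.1e}")    # 1.0e7
```

The asymmetry is in the operation counts, not the clock speed: Shor's algorithm simply needs astronomically fewer steps than the best known classical attack.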

Another reason why nation states pour so much money into the field is that, precisely because it is so difficult, any achievement will directly yield a lasting advantage.

So where are quantum computers today, and where are they headed?

Considering the immense challenges of building quantum computers, I'd say we are roughly where we were in around 1970 with classical computers. We have some quantum computers, but they are still pretty unreliable by today's standards. We call them NISQ devices: Noisy Intermediate-Scale Quantum devices. Noisy because they are pretty bad, and intermediate-scale because of their small qubit number. But they work. There are a few public quantum computers available for anyone to programme on. IBM, Rigetti, Google and IonQ all provide public access to real quantum computing hardware with open-source tools. IBM even sells a quantum computer that you can put in your own data centre (the IBM Q System One).
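As a taste of that public access, here is a minimal sketch using IBM's open-source Qiskit library, assuming its circa-2019 API (the execute helper and the Aer qasm_simulator backend; later releases reorganized these imports). It prepares a two-qubit entangled state and samples it on a local simulator; the same circuit can be submitted to real hardware through the cloud:

```python
from qiskit import QuantumCircuit, execute, Aer

# Build a two-qubit circuit that prepares a Bell (entangled) state
qc = QuantumCircuit(2, 2)
qc.h(0)                    # put qubit 0 into superposition
qc.cx(0, 1)                # entangle qubit 1 with qubit 0
qc.measure([0, 1], [0, 1])

backend = Aer.get_backend('qasm_simulator')
counts = execute(qc, backend, shots=1000).result().get_counts()
print(counts)              # roughly half '00', half '11'; never '01' or '10'
```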

But these are not yet powerful enough to break RSA 2048-bit keys, and probably won't be for another 10 to 20 years.

The comparison date of 1970 works from another angle, too. In October 1969, researchers sent the first message over the internet (it was called ARPANET then). When they tried to send the one word "login", the system crashed after sending "l" and "o". It later recovered and the message was successfully sent.

Today, we are also building a quantum communication system that communicates not bits and bytes, but quantum states that quantum computers can understand. This is important so that we can build a quantum version of the internet.

D-Wave, NASA, Google and the Universities Space Research Association created this 1,097-qubit D-Wave quantum computer. (Image: Reuters/Stephen Lam)

It is also important as a way of encrypting communication, since the quantum channel provides some inherent physical guarantees about a transmission. Without going into too much detail, there is a fundamental property whereby the simple act of wiretapping or listening in on a communication is detectable by the parties communicating. Not because they have a fancy system set up, but because of fundamental properties of the quantum channel.
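A hedged sketch of why eavesdropping is detectable, using the BB84 key-distribution protocol as the textbook example (heavily simplified: no noise, no authentication, bases modeled as coin flips). An eavesdropper who measures qubits in transit guesses the wrong basis half the time, scrambling those qubits and leaving a roughly 25% error rate when the two parties compare a sample of their sifted key:

```python
import random

def random_bits(n):
    return [random.randint(0, 1) for _ in range(n)]

def measure(bit, prep_basis, meas_basis):
    """Matching bases return the encoded bit; mismatched bases
    return a random result (the quantum part, classically simulated)."""
    return bit if prep_basis == meas_basis else random.randint(0, 1)

def bb84_error_rate(n=2000, eavesdrop=False):
    alice_bits, alice_bases = random_bits(n), random_bits(n)
    bob_bases = random_bits(n)

    channel, channel_bases = alice_bits, alice_bases
    if eavesdrop:
        # Eve measures each qubit in a random basis and resends it
        eve_bases = random_bits(n)
        channel = [measure(b, ab, eb)
                   for b, ab, eb in zip(alice_bits, alice_bases, eve_bases)]
        channel_bases = eve_bases   # the resent qubits carry Eve's bases

    bob_bits = [measure(b, cb, bb)
                for b, cb, bb in zip(channel, channel_bases, bob_bases)]

    # Sift: keep positions where Alice's and Bob's bases happened to match
    sifted = [(a, b) for a, b, ab, bb in
              zip(alice_bits, bob_bits, alice_bases, bob_bases) if ab == bb]
    return sum(a != b for a, b in sifted) / len(sifted)

print(f"no eavesdropper: {bb84_error_rate():.1%}")               # 0.0%
print(f"eavesdropper:    {bb84_error_rate(eavesdrop=True):.1%}") # ~25%
```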

But quantum computers are not just useful for cryptography and communication. One of the most immediate applications is in machine learning, where we are already on the cusp of a quantum advantage, meaning that a quantum algorithm will outperform any classical algorithm. It is believed that quantum advantage for machine learning could be achieved within the next 6-12 months. The near-term applications for quantum computing are endless: cryptography, machine learning, chemistry, optimization, communication and many more. And this is just the start, with research increasingly extending to other areas.

Google and NASA have just announced that they have achieved 'quantum supremacy': the ability of a quantum computer to perform a task that a classical computer simply cannot do in a reasonable timeframe. Their quantum computer solved a problem in 200 seconds that would take the world's fastest supercomputer 10,000 years.

The problem that was solved is without any practical merits or implications, yet it demonstrates the huge potential quantum computers have and the ability to unlock that potential in the coming years.

This opens up a completely new era in which we can focus on building quantum computers with practical benefits. While that is still many years away, it will be the new frontier in computation.

Written by

Andreas Baumhof, Vice President Quantum Technologies, QuintessenceLabs

Other voices: Welcome to the age of Quantum computing – St. Paul Pioneer Press

Has the era of quantum computing finally dawned? In a field long plagued by hype and hubris, there's reason for some cautious optimism.

A team of scientists at Google's research lab announced last week in the journal Nature that they had built a quantum computer that could perform calculations in about 200 seconds that would take a classical supercomputer some 10,000 years to do. An age of "quantum supremacy" was duly declared.

Rather uncharitably, IBM researchers were quick to point out that the feat was less than advertised. They estimated that by using all of the hard disk space at the world's most powerful classical computer, the Summit OLCF-4 at Oak Ridge National Laboratory, they could do the same calculation in 2.5 days, not 10,000 years. Google's claim to have achieved quantum supremacy (that is, to have accomplished a task that traditional computers can't) was premature.

This was to miss the bigger picture: a rudimentary quantum machine has improved on the fastest supercomputer ever built by a factor of 1,080, an immense achievement by any measure. Although the specific problem that Google's computer solved won't have much practical significance, simply getting the technology to work was a triumph; comparisons to the Wright brothers' early flights aren't far off the mark.

So is the world prepared for what comes next?

Quantum computers, to put it mildly, defy human intuition. They take advantage of the strange ways that matter behaves at the subatomic level to make calculations at extraordinary speed. In theory, they could one day lead to substantial advances in materials science, artificial intelligence, medicine, finance, communications, logistics and more. In all likelihood, no one has thought up the best uses for them yet.

They also pose some risks worth paying attention to. One is that the global race to master quantum computing is heating up, with unpredictable consequences. Last year, President Donald Trump's administration signed a $1.1 billion bill to prioritize the technology, which is a decent start. But the U.S. will need to do more to retain its global leadership. Congress should fund basic research at labs and universities, ensure the U.S. welcomes immigrants with relevant skills, invest in cutting-edge infrastructure, and use the government's vast leverage as a consumer to support promising quantum technologies.

A more distant worry is that advanced quantum computers could one day threaten the public-key cryptography that protects information across the digital world. Those systems are based on hard math problems that quantum computers might theoretically be able to crack with ease. Security researchers are well aware of the problem and are at work on creating post-quantum systems and standards. But vigilance and serious investment are nonetheless called for.

No doubt, the quantum-computing era will have its share of false starts, dashed hopes and fiendishly difficult problems to overcome. As Google is showing, though, that's how technology advances: bit by bit, into a very strange future.

Bloomberg Opinion

Opinion | Quantum supremacy and the cat that's neither alive nor dead – Livemint

There is this joke about a cat that belonged to a gentleman called Schrödinger: "Schrödinger's cat walks into a bar. And doesn't."

If you chuckled, you must have been a student of quantum physics. Austrian physicist Erwin Schrödinger's cat paradox illustrates the seeming contradiction between what we see with our naked eye and what quantum theory says is actually the case at the microscopic scale. He used it to challenge the "Copenhagen Interpretation" of quantum mechanics, which states that a particle "exists in all states at once until observed". Schrödinger's cat is in a box and could be alive or dead. But, till the box is opened, you won't know its state. This would mean that the cat could be both alive and dead at the same time.

Now, hold that thought while we leap from cats to computers. The ones we use now follow the principles of a Turing machine. Here, information is encoded into bits (either 1s or 0s), and one can apply a series of operations (and, or, not) to those bits to perform any computation. A quantum computer is different: it uses qubits, the quantum analogue of bits. Now, jump back to the cat. Much like the feline in Schrödinger's box, a qubit is not always 0 or 1, but can be both at the same time. Only at the end of the computation, when the box is opened, would you know which; during the computation, its exact state is indeterminate.
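That indeterminacy can be written down with nothing but a little linear algebra; here is a sketch (plain numpy, no quantum hardware, amplitudes chosen for the simplest possible case). A qubit is a two-component vector of complex amplitudes; the squared magnitudes give the odds of each outcome, and sampling plays the role of opening the box:

```python
import numpy as np

zero = np.array([1, 0], dtype=complex)   # the definite state |0>

# The Hadamard gate turns a definite state into an equal superposition
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ zero                 # the "cat" state: both 0 and 1 at once
probs = np.abs(state) ** 2       # Born rule: squared amplitudes
print(probs)                     # [0.5 0.5]

# "Opening the box": each sample collapses to one definite outcome
rng = np.random.default_rng()
print(rng.choice([0, 1], size=10, p=probs))   # e.g. [0 1 1 0 0 1 ...]
```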

If this leaves you scratching your head, do not fret. In a 2017 Wall Street Journal interview, here is what Bill Gates said: "I know a lot of physics and a lot of math. But the one place where they put up slides and it is hieroglyphics, it's quantum." Even Einstein had some difficulty grasping the concept and famously dismissed it with, "God does not play dice with the universe."

What makes a quantum computer exciting is its ability to exploit these properties of quantum physics to perform certain calculations far more efficiently and faster than any supercomputer. Thus, megacorps such as Microsoft, IBM, and Google have been working on quantum computers. Last week, Google claimed to have achieved quantum supremacy, the point at which such a computer can perform a calculation that a traditional one cannot complete within its lifetime. Google's quantum computer took 200 seconds for a calculation that would take a supercomputer 10,000 years.

While all this is impressive, what does it mean for us? It's hard to answer fully, as we are venturing into an entirely new area, and the future will reveal applications we have not even imagined yet. It's a bit like classical computing: we did not know then how totally it would revolutionize our world. In the same manner, quantum computing could be a game-changer for many industries.

Take big data and analytics. We produce 3 exabits of data every day, equivalent to 300,000 Libraries of Congress. Classical computers are reaching the limits of their processing power. With exponentially more powerful quantum computers, however, we could spot unseen patterns in large data sets, integrate data from different data sets, and tackle the whole problem at once. This would be rocket fuel for artificial intelligence (AI), with quantum computing offering quick feedback and collapsing the learning curve of machines. This would make AI more intuitive, help it expand to various industries and help build artificial general intelligence.

Online security will be affected, with our current data-encryption strategies wilting under the assault of quantum power. On the other hand, there will be formidable new cryptographic methods like quantum key distribution, where even if a message gets intercepted, no one can read it (the cat, again). On a side note, the security of every public blockchain will be under threat from quantum hacks. It was no coincidence that Bitcoin's price slumped the day Google announced its breakthrough. Quantum computing could also speed up drug development by reviewing multiple molecules simultaneously and quickly sequencing individual DNA for personalized drugs. Another application lies in weather forecasting and, more importantly, climate-change prediction. It will take the tremendous power of quantum computing to create the complex, ever-changing weather models needed to properly predict and respond to the climate cataclysm that awaits us.

It's a brave new world of quantum computing we're entering, and we will discover its possibilities as we go along. If you feel you've got it but are still confused, that's okay: it is the nature of this beast. Just step out of the box.

Jaspreet Bindra is a digital transformation and technology expert, and the author of the book The Tech Whisperer

Volkswagen: optimizing traffic flow with quantum computers – Quantaneo, the Quantum Computing Source

Volkswagen is launching in Lisbon the world's first pilot project for traffic optimization using a quantum computer. For this purpose, the Group is equipping MAN buses of the city of Lisbon with a traffic management system developed in-house. This system uses a D-Wave quantum computer and calculates the fastest route for each of the nine participating buses individually and almost in real-time. This way, passengers' travel times will be significantly reduced, even during peak traffic periods, and traffic flow will be improved. Volkswagen is testing its traffic optimization system during the WebSummit technology conference in Lisbon from November 4 to 8 - during the conference, buses will carry thousands of passengers through the city traffic in Lisbon.

Martin Hofmann, Volkswagen Group CIO, says: 'At Volkswagen, we want to further expand our expert knowledge in the field of quantum computing and to develop an in-depth understanding of the way this technology can be put to meaningful use within the company. Traffic optimization is one of the potential applications. Smart traffic management based on the performance capabilities of a quantum computer can provide effective support for cities and commuters.'

Vern Brownell, CEO of D-Wave, says: 'Volkswagen's use of quantum computing to tackle pervasive global problems like smart traffic management is an example of the real-world impact quantum applications will soon have on our cities, communities, and everyday lives. Since we built the first commercial quantum computer, D-Wave has been focused on designing systems that enable quantum application development and deliver business value. Volkswagen's pilot project is among the first that we know of to make production use of a quantum computer, and their ongoing innovation brings us closer than ever to realizing true, practical quantum computing.'

System includes two components: passenger number prediction and route optimization

The Volkswagen traffic management system includes two components - passenger number prediction and route optimization by quantum computing. For predictions, the development team from Volkswagen is using data analytics tools to identify stops with especially high passenger numbers at certain times. For this purpose, anonymized geo-coordinates and passenger flow data are used. The objective is to offer as many people as possible tailor-made transport possibilities and to ensure optimum utilization of the bus fleet.

For the pilot project in Lisbon, 26 stops were selected and connected to form four bus links. For example, one of these runs from the WebSummit conference facility to the Marquês de Pombal traffic node in the city center.

The Volkswagen team intends to continue the development of this prediction component. The idea is that bus operators should add temporary links to their scheduled services to serve stops with the largest passenger numbers. This would be a meaningful approach for major events in the city area, for example.

The Volkswagen experts have developed a quantum algorithm for route optimization between the stops. This algorithm calculates the fastest route for each individual bus in the fleet and optimizes it almost on a real-time basis. In contrast to conventional navigation services, the quantum algorithm assigns each bus an individual route. This way, each bus can drive around traffic bottlenecks along the route at an early stage and avoid traffic jams before they even arise.
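The article does not publish Volkswagen's algorithm, but problems of this shape are typically fed to a D-Wave machine as a QUBO (quadratic unconstrained binary optimization). Here is a hedged miniature using D-Wave's open-source dimod package, with made-up buses, routes and weights (illustrative only, not Volkswagen's model): each bus must pick exactly one route, and two buses sharing a congested segment pay a penalty. A classical exact solver stands in for the annealer:

```python
import dimod

P = 10  # penalty weight enforcing "each bus takes exactly one route"

# Binary variables: x[bus, route] = 1 if that bus takes that route.
# Linear terms start as the base travel time of each option.
linear = {
    ('b1', 'r1'): 5, ('b1', 'r2'): 7,
    ('b2', 'r1'): 5, ('b2', 'r2'): 6,
}

# Congestion: both buses on route r1 share a bottleneck segment
quadratic = {(('b1', 'r1'), ('b2', 'r1')): 8}

# The constraint (x_r1 + x_r2 - 1)^2 per bus expands (for binary x) to
# -x_r1 - x_r2 + 2*x_r1*x_r2 + 1, scaled by P
for bus in ('b1', 'b2'):
    linear[(bus, 'r1')] -= P
    linear[(bus, 'r2')] -= P
    quadratic[((bus, 'r1'), (bus, 'r2'))] = 2 * P

bqm = dimod.BinaryQuadraticModel(linear, quadratic, 2 * P, dimod.BINARY)

# ExactSolver enumerates every assignment classically; on D-Wave
# hardware the same BQM would be sampled by the quantum annealer
best = dimod.ExactSolver().sample(bqm).first
print(best.sample, best.energy)   # b1 takes r1, b2 takes r2: cost 11
```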

The experts from Volkswagen expect this development to have a further positive effect. As the buses travel along individually optimized routes which are calculated to ensure that they can never cause congestion themselves, there will be a general improvement in traffic flow within the city.

Volkswagen intends to develop the system to market maturity

In the future, Volkswagen plans to develop its traffic optimization system to market maturity. For this reason, the Volkswagen developers have designed the system so that it can generally be applied to any city and to vehicle fleets of any size. Further pilot projects for cities in Germany and other European countries are already being considered. Volkswagen believes that such a traffic optimization system could be offered to public transport companies, taxi companies or fleet operators.

Volkswagen and quantum computing

Volkswagen is cooperating with its technology partners D-Wave and Google, who provide the experts with access to their computer systems. In 2016, the Volkswagen team successfully demonstrated congestion-free route optimization for taxis in the Chinese capital, Beijing. Since then, development of the algorithm has continued steadily, and it has been protected by patents in the USA.

IBM picked a fight with Google over its claims of ‘quantum supremacy.’ Here’s why experts say the feud could shake up the tech industry’s balance of…

Most people probably couldn't tell you what quantum computing is. And, as we learned last week from an unusual public spat between tech companies, it turns out that the top quantum-computing engineers aren't so sure either.

It all started when Google researchers published a paper in the journal Nature declaring that they had achieved "quantum supremacy," a breakthrough in computing speed so radical that, to use a fictional analogy, it might be akin to attaining hyperspace travel speed.

But before the champagne had even been poured, IBM was disputing Google's claims with a blog post, insisting that, technically, "quantum supremacy" hadn't really been reached.

Quantum computers have special properties that allow them to solve problems exponentially faster than even the most powerful computers today. Google researchers said their quantum computer solved a problem in 200 seconds that would take a powerful supercomputer 10,000 years to solve, a potential game changer for fighting climate change, discovering drugs, predicting the stock market, and cracking the toughest encryption.

Quantum computing is still in its infant stages, and you won't find it in your office anytime soon, but investors and researchers see huge potential in it. Already, companies like Google, IBM, Microsoft, and Intel are racing to build quantum computers, while venture capitalists are pouring money into startups like IonQ, Rigetti Computing, Aliro, and D-Wave.

The feud between IBM and Google is in many ways academic. But it also highlights the prominence and importance within the industry of a technology considered science fiction just a decade ago. As computing technology gets pushed to its limits, new technology like quantum computing has the potential to open entirely new markets and shake up the balance of powers in the tech industry.

And while Google and IBM are taking different approaches to quantum, the rival claims underscore the seriousness with which each company views the technology.

"Google is doing things as a research project," Brian Hopkins, the vice president and principal analyst at Forrester, told Business Insider. "IBM has a commercial strategy, pouring money in to get money out. They want to get to a point where quantum computers are powerful enough so people are willing to pay money to solve problems."

At the same time, rivals like Microsoft, Intel, and quantum-computing startups are lauding Google's experiment and see it as a good sign for quantum computing.

Jim Clarke, Intel's director of quantum hardware, with one of the company's quantum processors. (Image: Intel)

"We're beginning to have a discussion that a quantum computer can do something that a supercomputer does not," Jim Clarke, the director of quantum hardware at Intel, told Business Insider. "It motivates us that we're on the right path. There's still a long way to go to get to a useful quantum computer. I think this is a positive step along the way."

Computer experts told Business Insider it would take time to prove whether Google did, in fact, reach this benchmark and whether IBM's disputes were correct.

IBM, which built Summit, the world's most powerful supercomputer, said the experiment could be run by a supercomputer in 2 1/2 days, as opposed to the 10,000 years Google said would be required with traditional computing technology.

In other words, even though Google's quantum computer is faster, if the supercomputer really could run the same problem in 2 1/2 days, the gap is far less dramatic. A problem that takes 10,000 years to solve is effectively out of reach; one that takes 2 1/2 days is merely slow.
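The arithmetic of the dispute is easy to check (a sketch using only the figures quoted in these reports):

```python
SECONDS_PER_DAY = 86_400
SECONDS_PER_YEAR = 365 * SECONDS_PER_DAY

quantum = 200                                 # seconds, Google's result
classical_google = 10_000 * SECONDS_PER_YEAR  # Google's supercomputer estimate
classical_ibm = 2.5 * SECONDS_PER_DAY         # IBM's disputed estimate

print(f"speedup per Google: {classical_google / quantum:,.0f}x")  # ~1.6 billion
print(f"speedup per IBM:    {classical_ibm / quantum:,.0f}x")     # 1,080x
```

Even on IBM's numbers the quantum machine was roughly a thousand times faster; the argument is over whether the gap is a thousandfold or a billionfold.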

"The conflict between Google and IBM highlights that there's some ambiguity in the definition of quantum supremacy," Bill Fefferman, an assistant professor of computer science at the University of Chicago, told Business Insider.

Still, Google's work shows the progress of quantum computing, and people shouldn't lose sight of that, despite the arguments about it, Martin Reynolds, the distinguished vice president at Gartner, said.

That being said, since quantum computing is still in its early days, Google's milestone is "a bit like being the record holder in the 3-yard sprint," Reynolds said.

Fefferman added that the "jury is still out" on whether Google has actually reached quantum supremacy, but not because of anything IBM said.

"While it's not completely clear to me that there's currently enough evidence to conclude that we've reached quantum supremacy, Google is certainly breaking new ground and going places people have not gone before," Fefferman said.

And though Google's experiment is a "major scientific breakthrough," it has little influence on commercial users today, Matthew Brisse, the research vice president at Gartner, said.

"It demonstrates progress in the quantum community, but from an end-user perspective, it doesn't change anyone's plans or anyone's project initiatives because we're still many years away," Brisse told Business Insider. "We're literally five to 10 years away from using this in a commercial production environment."

In general, IBM and Google's competitors told Business Insider they saw the experiment as a step forward.

"This is an exciting scientific achievement for the quantum industry and another step on a long journey towards a scalable, viable quantum future," a Microsoft spokesperson said in a statement.

Rigetti Computing CEO Chad Rigetti. (Image: YouTube/Y Combinator)

Chad Rigetti, the founder and CEO of the startup Rigetti Quantum Computing, called Google's experiment a "remarkable achievement" that should give researchers, policymakers, investors, and other users more confidence in quantum computing.

He added that IBM's claims haven't been tested on actual hardware yet, and even if it were proved, it would still be slower and more expensive to run than on Google's quantum computer.

"The Google experiment is a landmark scientific achievement and the most important milestone to date in quantum computing," Rigetti told Business Insider. "It shows that real commercial applications are now within sight for superconducting qubit systems."

Clarke, of Intel, agreed that it was a positive for the quantum community overall, though he said that calling it "quantum supremacy" might be debatable. Clarke also said that it could show that quantum computers could be more efficient, as he suspects that Google's quantum computer uses much less power than running a Summit supercomputer for over two days.

"What's been interesting to me is seeing some of the negative reactions to this announcement," Clarke told Business Insider. "If you're in the quantum community, any good experiment that suggests there's a long future in quantum computing should be appreciated. I haven't quite understood some of the negative response at this point."

What happens next is that other scientists will review the paper, work to prove or disprove it, and debate whether quantum supremacy has been reached. Ines Montano, an associate professor of applied physics at Northern Arizona University, said IBM would likely work to prove that its supercomputer could run that experiment in a shorter time frame.

"IBM will have to figure out something to put some data to their claim," Montano told Business Insider. "That will be a very public discussion for a while. In the meantime, there's the quest is to find problems that may be more applicable to current things ... We're not as far away as we were thinking 10 years ago."

This will likely take some time, as quantum supremacy is difficult to prove. Quantum computing remains in its early stages, experts say, and they expect more advancements in the coming years. Experts predict the industry is still at least 10 years away from useful quantum computers.

"Google's managed to find a complex problem that they can solve on this system," Reynolds told Business Insider. "It isn't a useful solution, but it is a big step forwards. IBM offers a way to solve the problem with classical hardware in a couple of days. That's also impressive and shows the caliber of thinking that we find in these early quantum programs."

What Are the Biggest Challenges Technology Must Overcome in the Next 10 Years? – Gizmodo

Technology's fine (I definitely like texting, and some of the shows on Netflix are tolerable), but the field's got some serious kinks to work out. Some of these are hardware-related: when, for instance, will quantum computing become practical? Others are of more immediate concern. Is there some way to stop latently homicidal weirdos from getting radicalized online? Can social networks be tweaked in such a way as to not nearly guarantee the outbreak of the second Civil War? As AI advances and proliferates, how can we stop it from perpetuating, or worsening, injustice and discrimination?

For this week's Giz Asks, we've assembled a wide-ranging panel of futurists, engineers, anthropologists, and experts in privacy and AI to address these and many other hurdles.

Professor of Electrical Engineering and Computer Science and Director of the Computer Science and Artificial Intelligence Laboratory (CSAIL) at MIT

Here are some broad societal impact challenges for AI. There are so many important and exciting challenges in front of us; I include a few I have been thinking about:

1) virtual 1-1 student-teacher ratios for all children: this will enable personalized education and growth for all children

2) individualized healthcare: this will deliver medical attention to patients that is customized to their own bodies

3) reversing climate change: this will take us beyond mapping climate change into identifying ways to repair the damage; one example is to reverse engineer photosynthesis and incorporate such processes into smart cities to ameliorate pollution

4) interspecies communication: this will enable us to understand and communicate with members of other species, for example to understand what whales are communicating through their song, etc.

5) intelligent clothing that will monitor our bodies (1) to ensure we live well and (2) to detect the emergence of a disease before it takes hold

And here are some technical challenges:

1) interpretability and explainability of machine learning systems

2) robustness of machine learning systems

3) learning from small data

4) symbolic decision making with provable guarantees

5) generalizability

6) machine learning with provable guarantees

7) unsupervised machine learning

8) new models of machine learning that are closer to nature

Anthropologist and Research Director at the Centre National de la Recherche Scientifique, Institut Jean Nicod, Paris; Co-Founder of the Centre for the Resolution of Intractable Conflict, University of Oxford, and author of Talking to the Enemy: Faith, Brotherhood and the (Un)Making of Terrorists

How to tell the difference between real vs fake, and between good vs harmful so that we can prevent harmful fake (malign) activity and promote what is real and good?

Malign social media ecologies (hate speech, disinformation, polarizing and radicalizing campaigns, etc.) have both bottom-up and top-down aspects, each of which is difficult to deal with but which together stump most counter-efforts. These problems are severely compounded by the exploitation of cognitive biases (e.g., people's tendency to believe messages that conform to their prior beliefs and to disbelieve messages that don't), and also by the exploitation of cultural belief systems (e.g., gaining trust, as in the West, based on accuracy, objectivity, validation and competence vs. gaining trust, as in most of the rest of the world, based on respect, recognition, honor, and dignity) and preferences (e.g., values associated with family, communitarian, nationalist, traditional mores vs. universal, multicultural, consensual, progressive values).

Malign campaigns exploit psychological biases and political vulnerabilities in the socio-cultural landscape of nations, and among transnational and substate actors, which has already led to new ways of resisting, reinforcing and remaking political authority and alliances. Such campaigns can also be powerful force multipliers for kinetic warfare and can affect economies. Although pioneered by state actors, disinformation tools are now readily available to anyone or any group with internet access to deploy at low cost. This democratization of influence operations, coupled with democracies' vulnerabilities owing to political tolerance and free speech, requires our societies to create new forms of resilience as well as deterrence. It means that a significant portion of malign campaigns involve self-organizing, bottom-up phenomena that self-repair. Policing and banning on any single platform (Twitter, Facebook, Instagram, VKontakte, etc.) can be downright counterproductive, with users going to back doors even after being banned, jumping between countries, continents and languages, and eventually producing global dark pools in which illicit and malign online behaviors will flourish.

Because large clusters that carry hate speech or disinformation arise from small, organic clusters, large clusters can be reduced by first banning small ones. In addition, randomly banning a small fraction of the entire user population (say, 10 percent) would serve the dual role of lowering the risk of banning many users from the same cluster and of inciting a large crowd. But if, indeed, states and criminal organizations with deep offline presence can create small clusters almost at will, then the problem becomes not one of simply banning small clusters or a small fraction of randomly chosen individuals. Rather, the key is to identify the small clusters that initiate a viral cascade propagating hate or malign influence. Information cascades follow a heavy-tailed distribution: large-scale cascades are relatively rare (only 2 percent exceed 100 re-shares), and 50 percent of the shares in a cascade occur within an hour. So the problem is to find an appropriate strategy to identify an incipient malign viral cascade and apply countermeasures well within that first hour.

There is also a layering strategy evident in state-sponsored and criminally organized illicit online networks. Layering is a technique in which links to disinformation sources are embedded in popular blogs, forums and websites of activists (e.g., environment, guns, healthcare, immigration, etc.) and enthusiasts (e.g., automobiles, music, sports, food and drink, etc.). These layering-networks, masquerading as alternative news and media sources, regularly seek bitcoin donations. Their blockchains show contributions made by anonymous donors in orders of tens of thousands of dollars at a time, and hundreds of thousands of dollars over time. We find that these layering-networks often form clusters linking to the same Google Ad accounts, earning advertising dollars for their owners and operators. Social media and advertising companies often have difficulty identifying account owners linked with illicit and malign activity, in part because the accounts often appear to be organic and regularly pass messages containing a kernel of truth. How, then, to detect layering-networks (Breitbart, One America News Network, etc.), symbols (logos, flags), faces (politicians, leaders), suspicious objects (weapons), and hate speech and anti-democracy framing as suspicious?

Finally, knowledge of psychology and cultural belief systems is needed to train the data that technology uses to mine, monitor, and manipulate information. Overcoming malign social media campaigns ultimately relies on human appraisal of strategic aspects, such as the importance of core values and the stakes at play (political, social, economic), and the relative strengths of players in those stakes. The critical role of social science goes beyond the expertise of the engineers, analysts, and data scientists that platforms like Twitter, Instagram, and Facebook use to moderate propaganda, disinformation, and hateful content.

Yet an acute problem concerns overwhelming evidence from cognitive and social psychology and anthropology that truth and evidence, no matter how logically consistent or factually correct, do not sway public opinion or popular allegiance as much as appeals to basic cognitive biases that confirm deep beliefs and core cultural values. Indeed, many so-called biases used in argument do not reflect sub-optimal or deficient reasoning, but rather suggest its efficient (even optimal) use for persuasion: an evolutionarily privileged form of reasoning that socially recruits others to one's circle of beliefs for cooperation and mutual defense. Thus, to combat false or faulty reasoning, as in noxious messaging, it is not enough to target an argument's empirical and logical deficiencies with a counterargument's logical and empirical coherence. Moreover, recent evidence suggests that warning about misinformation has little effect (e.g., despite advance warning, "yes" voters are more likely than "no" voters to remember a fabricated scandal about a "vote no" campaign, and "no" voters are more likely to remember a fabricated scandal about a "vote yes" campaign). Evidence is also mounting that value-driven, morally focused information in general, and social media in particular, drives not only readiness to believe but also concerted action on those beliefs.

One counter-strategy involves compromising one's own truth and honesty, and ultimately moral legitimacy, in a disinformation arms race. Another is to remain true to the democratic values upon which our society is based (in principle if not in practice), never denying or contradicting them, nor threatening to impose them on others.

But how to consistently expose misleading, false, and malicious information while advancing truthful, evidence-based information that never contradicts our core values or threatens the core values of others (to the extent tolerable)? And how to encourage people to exit echo chambers of the like-minded and engage in free and open public deliberation on ideas that challenge preconceived or fed attitudes, so that a broader awareness of what is on offer, and a susceptibility to alternatives, may be gained, however strong a person's initial preconceptions or fed history?

Professor, Mechanical Engineering, MIT, whose research focuses on quantum information and control theory

The two greatest technological challenges of our current time are

(a) good cellphone service, and

(b) a battery with the energy density of extra virgin olive oil

I need say no more about (a). For (b) I could have used diesel fuel instead of olive oil (they have similar energy densities), but I like the thought of giving my computer a squirt of extra virgin olive oil every time it runs out of juice.

Since you are also interested in quantum computing, I'll comment there too.

Quantum computing is at a particularly exciting, and maybe scary, moment. If we can build large-scale quantum computers, they would be highly useful for a variety of problems, from code-breaking (Shor's algorithm), to drug discovery (quantum simulation), to machine learning (quantum computers could find patterns in data that can't be found by classical computers).

Over the past two decades, quantum computers have progressed from relatively feeble devices capable of performing a few hundred quantum logic operations on a few quantum bits, to devices with hundreds or thousands of qubits capable of performing thousands to tens of thousands of quantum ops.

That is, we are just at the stage where quantum computers may actually be able to do something useful. Will they do it? Or will the whole project fail?

The primary technological challenge over the next few years is to get complex superconducting quantum circuits or extended quantum systems such as ion traps or quantum optical devices to the point where they can be sufficiently precisely controlled to perform computations that classical computers cant. Although there are technological challenges of fabrication and control involved, there are well-defined paths and strategies for overcoming those challenges. In the longer run, to build scalable quantum computers will require devices with hundreds of thousands of physical qubits, capable of implementing quantum error correcting codes.

Here the technological challenges are daunting, and in my opinion, we do not yet possess a clear path to overcoming them.
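For a sense of what an error correcting code buys, here is a toy sketch of the classical idea underneath the simplest quantum code, the three-qubit bit-flip code: spread one logical bit across three carriers, let noise act, and recover by majority vote. (A deliberate simplification: real quantum codes must diagnose errors via syndrome measurements without reading the data, and must handle phase flips too.)

```python
import random

def encode(bit):
    """Spread one logical bit across three physical carriers."""
    return [bit, bit, bit]

def noisy_channel(codeword, p):
    """Independently flip each carrier with probability p."""
    return [b ^ (random.random() < p) for b in codeword]

def decode(codeword):
    """Majority vote: correct as long as at most one carrier flipped."""
    return int(sum(codeword) >= 2)

trials, p = 100_000, 0.05
raw_errors = sum(random.random() < p for _ in range(trials))
coded_errors = sum(decode(noisy_channel(encode(0), p)) != 0
                   for _ in range(trials))
print(f"raw error rate:   {raw_errors / trials:.4f}")    # ~0.050
print(f"coded error rate: {coded_errors / trials:.4f}")  # ~0.007 (about 3p^2)
```

The encoded error rate drops from p to roughly 3p², which is why redundancy, at the cost of many extra physical qubits per logical qubit, is the accepted route to scalability.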

Quantitative futurist, Founder of the Future Today Institute, Professor of Strategic Foresight at New York University Stern School of Business, and the author, most recently, of The Big Nine: How the Tech Titans and Their Thinking Could Warp Humanity

The short answer is this: we continue to create new technologies without actively planning for their downstream implications. Again and again, we prioritize short-term solutions that simply never address long-term risk. We are "nowists." We're not engaged in strategic thinking about the future.

The best example of our collective nowist culture can be seen in the development of artificial intelligence. We've prioritized speed over safety, and short-term commercial gains over longer-term strategy. But we're not asking important questions, like what happens to society when we transfer power to a system built by a small group of people that is designed to make decisions for everyone? The answer isn't as simple as it may seem, because we now rely on just a few companies to investigate, develop, produce, sell, and maintain the technology we use each and every day. There is tremendous pressure on these companies to build practical and commercial applications for AI as quickly as possible. Paradoxically, systems intended to augment our work and optimize our personal lives are learning to make decisions that we, ourselves, wouldn't. In other cases, like warehouses and logistics, AI systems are doing much of the cognitive work on their own and relegating the physical labor to human workers.

There are new regulatory frameworks for AI being developed by the governments of the US, Canada, the EU, Japan, China, and elsewhere. Agencies like the U.S.-based National Institute of Standards and Technology are working on technical standards for AI, but that isn't being done in concert with similar agencies in other countries. Meanwhile, China is forging ahead with various AI initiatives and partnerships that are linking emerging markets around the world into a formidable global network. Universities aren't making fast, meaningful changes to their curricula to address ethics, values and bias throughout all of the courses in their AI programs. Everyday people aren't developing the digital street smarts needed to confront this new era of technology. So they are tempted to download fun-looking but ultimately suspicious apps. They're unwittingly training machine-learning systems. Too often, they are outright tricked into allowing others to access untold amounts of their social, location, financial, and biometric data.

This is a systemic problem, one that involves our governments, financiers, universities, tech companies and even you, dear Gizmodo readers. We must actively work to create better futures. That will only happen through meaningful collaboration and global coordination to shape AI in a way that benefits companies and shareholders but also prioritizes transparency, accountability and our personal data and privacy. The best way to engineer systematic change is to treat AI as a public good.

University Distinguished Professor, Chicago-Kent College of Law, Illinois Institute of Technology, whose work focuses on the impact of technologies on individuals, relationships, communities, and social institutions

Technologies from medicine to transportation to workplace tools are overwhelmingly designed by men and tested on men. Rather than being neutral, technologies developed with male-oriented specs can cause physical harm and financial risks to women. Pacemakers are unsuited to many women, since women's hearts beat faster than men's and that was not figured into the design. Because only male crash-test dummies were used in safety ratings until 2011, seat-belted women are 47% more likely to be seriously harmed in car accidents. When men and women visit help-wanted websites, the algorithms direct men to higher-paying jobs. Machine-learning algorithms designed to screen resumes, so that companies can hire people like their current top workers, erroneously discriminate against women when those current workers are men.

Women's hormones are different from men's, causing some drugs to have enhanced effects in women and some to have diminished effects. Even though 80% of medications are prescribed to women, drug research is still predominantly conducted on men. Between 1997 and 2000, the FDA pulled ten prescription drugs from the market, eight of which were recalled because of the health risks they posed to women.

On the other hand, some treatments may be beneficial to women but never brought to market if the testing is done primarily on men. Let's say that a drug study enrolls 1,000 people, 100 of whom are women. What if it offers no benefit to the 900 men, but all 100 women are cured? The researchers will abandon the drug, judging it to be only 10% effective. If a follow-up study focused on women, it could lead to a new drug, to the benefit of women and the economy.

Workplace technologies also follow a male model. Female surgeons in even elite hospitals have to stack stools on top of one another to stand high enough to undertake laparoscopic surgeries. Their lesser hand strength causes them to have to use both hands to operate tools that male surgeons operate with one, leading female surgeons to have more back, neck and hand problems than men. Nonetheless, the patients of female surgeons do better than those of men. Imagine the health gain to the patients (and their female surgeons) if technologies were designed to accommodate women as well as men.

Female fighter pilots wear g-suits designed in the 1960s to fit men. These too-large suits do not provide adequate protection against g-forces, which can cause a sudden loss of color vision or a full blackout as blood rushes from the brain. The zippers generally don't unzip far enough to comfortably fit the female bladder device, which causes some female pilots not to drink before missions, potentially leading to blackouts from dehydration. Other military equipment poses safety and efficacy risks to women. Designing with women in mind, such as the current work on exoskeletons, can benefit both female and male soldiers by providing protection and increasing strength and endurance.

I'd like to see the equivalent of a moon shot (a focused technology research program) that tackles the issue of women and technology. Innovation for and by women can grow the economy and create better products for everyone.

Do you have a question for Giz Asks? Email us at tipbox@gizmodo.com.

Quantum Computing: The Why and How – insideHPC

In this video from the Argonne Training Program on Extreme-Scale Computing 2019, Jonathan Baker from the University of Chicago presents: Quantum Computing: The Why and How.

The Argonne Training Program on Extreme-Scale Computing (ATPESC) provides two intensive weeks of training on the key skills, approaches, and tools needed to design, implement, and execute computational science and engineering applications on current high-end computing systems and on the leadership-class computing systems of the future. As a bridge to that future, the program fills a gap in the training computational scientists typically receive through formal education or shorter courses. With around 70 participants accepted each year, admission to the ATPESC program is highly competitive. ATPESC is part of the Exascale Computing Project, a collaborative effort of the DOE Office of Science and the National Nuclear Security Administration.

Jonathan Baker is a second-year Ph.D. student at the University of Chicago advised by Fred Chong. He is studying quantum architectures, specifically how to map quantum algorithms more efficiently to near-term devices. Additionally, he is interested in multivalued logic: taking advantage of quantum computing's natural access to higher-order states and using those states to make computation more efficient. Prior to beginning his Ph.D., he studied at the University of Notre Dame, where he obtained a B.S. in computer science and a B.S. in chemistry and mathematics.

Editorial: Quantum computing is a competition we can’t afford to lose – The Winchester Star

We Americans have a habit of bragging about our feats of technology. Our chief economic and military rivals, namely Russia and China, seldom do. They prefer to keep their secrets.

No one in this country is certain, then, how far the state-controlled economies of those nations have gone in developing quantum computing.

What is certain is that our national security, both militarily and economically, demands that the United States be first to perfect the technology. The reason for that was demonstrated in an announcement Wednesday by technology giant Google.

Google officials claim to have achieved a breakthrough in quantum computing. They say they have developed an experimental quantum computing processor capable of completing a complex mathematical calculation in less than four minutes.

Google says it would take the most advanced conventional supercomputer in existence about 10,000 years to do that.

Wrap your mind around that, if you can.

Other companies working with quantum computing, including IBM, Intel and Microsoft, say Google is exaggerating. IBM researchers told The Associated Press the test calculation used by Google actually could be handled by certain supercomputers in two and one-half days.

Still, you get the idea: quantum computing will give the nation that gets there first, including its armed forces and industries, an enormous advantage over everyone else. The possibilities, ranging from near-perfect missile defense systems to vastly accelerated research on curing diseases, are virtually endless.

U.S. officials are cognizant of the ramifications of quantum computing, to the point that Washington has allocated $1.2 billion to support research during the next five years.

If that is not enough to ensure the United States stays in the lead in the quantum computing race, more should be provided. This is a competition we cannot afford to lose.

Read the original:

Editorial: Quantum computing is a competition we can't afford to lose - The Winchester Star

Quantum investment soars in the UK to more than £1bn – Management Today

What's very small but set to be very big? Quantum technology, according to the UK government, which took the decision in June to reinvest in a scheme designed to move the science beyond academia and research laboratories and into commercial and practical use.

Some £1bn has already been invested in the UK's National Quantum Technologies Programme, which was set up in 2013. The government recently announced a further £153m of funding through the Industrial Strategy Challenge Fund (which aims to ensure that 2.4 per cent of GDP is invested in R&D by 2027), plus £200m of investment from the private sector.

This means spending by industry is outstripping government investment for the first time, a good indication that the technology has stepped beyond an initial, broadly speculative stage. "Quantum is no longer an experimental science for the UK," says former science minister Chris Skidmore. "Investment by government and businesses is paying off as we become one of the world's leading nations for quantum science and technologies."

Whereas "classical" computers are based on a structure of binary choices (yes or no; on or off), quantum computing is a lot more complicated. Classical chips rely on whether or not an electron is conducted from one atom to another around a circuit, but super-cooled quantum chips allow us to interface with the world at a much deeper level, taking into account properties such as superposition, entanglement or interference.

Confused? Think of a simple coin toss. Rather than being able to simply call heads or tails, superposition allows us to take into account when a coin spins, while entanglement is whether its properties are intrinsically linked with those of another coin.

To help harness this new potential in different areas, the government's programme works across four hubs: sensing and timing; imaging; computing and simulation; and communications.

One of the key advances that quantum computing is expected to bring is not just substantially greater processing speed but the ability to mimic and, therefore, understand and predict the ways that nature works.

For example, this could allow us to look directly inside the human body, see through smoke or mist, develop new drugs much more quickly and reliably by reviewing the effect on many molecules at the same time, or even make our traffic run smoothly. Meanwhile, the Met Office has already invested in this technology to improve weather forecasting.

Image: IBM Q System One quantum computer, photo by Misha Friedman/Getty Images

Read more from the original source:

Quantum investment soars in the UK to more than £1bn - Management Today

What one member of Trump’s new science advisory council wants it to tackle – Science Magazine

Dario Gil with IBM's System One quantum computer

By Jeffrey Mervis, Nov. 1, 2019, 2:10 PM

The President's Council of Advisors on Science and Technology (PCAST) has yet to hold its first meeting, and the White House hasn't even announced its full 16-person roster. But one newly appointed member, Dario Gil, director of IBM Research in Yorktown Heights, New York, already has a wish list of issues he'd like it to tackle.

His list includes promoting scientific inquiry and its value to policymakers, ensuring that researchers have the computational tools they need in an era of big data, retraining the U.S. workforce to be more technically literate, and updating a partnership between the federal government, academia, and industry spelled out by Vannevar Bush at the end of World War II. Gil also thinks the government must strike the right balance between protecting national security and fostering international scientific collaboration, using a scalpel instead of a blanket policy to monitor and prevent undue foreign influences on U.S. research.

"I am passionate about the need for continued investment in science," says Gil, 43, who joined IBM immediately after earning his Ph.D. in 2003 and has been rising quickly through its management ranks. "I want to be an advocate of its critical importance."

That full-throated endorsement of the U.S. research enterprise may hearten some scientists who feel President Donald Trump and his administration have been scornful of their contributions and the role of science in policymaking. Gil declined to characterize his political leanings and deflected a question about whether he agrees with that criticism.

Henry Smith, a professor emeritus of electrical engineering at the Massachusetts Institute of Technology in Cambridge and Gil's graduate adviser, thinks Trump has been "a disaster for science" and that Gil shares those views. However, Smith says Gil also understands that tact is important in trying to influence government policy.

"I think that Dario will express his views as clearly and diplomatically as possible," says Smith, who has remained in touch with Gil. "He knows that if he goes in and says something too radical, it would be ignored. But that doesn't mean he's not going to stand up for what he believes."

PCAST will be chaired by Kelvin Droegemeier, the president's science adviser, who filled a 2-year vacancy when he became director of the White House Office of Science and Technology Policy in January. Gil spoke with ScienceInsider shortly after the White House announced PCAST's first cohort of seven scientists and industry leaders on 22 October. The interview has been edited for clarity and brevity.

Q: What has kept you at IBM?

A: I've been having too much fun. When I graduated from [Smith's] nanostructures laboratory, I joined a group that was a natural extension of what I had been doing, pushing the limits of nanofabrication. But then I got involved in modeling, and in high-performance computing, and in applying models more broadly across different industries. And then I became responsible for all the core scientific research at IBM in the physical sciences. Then I led the AI [artificial intelligence] organization, and then I got really involved in quantum computing.

Q: What are some key issues in applying quantum science to real-world problems?

A: People are fascinated by the word quantum; there's something about the word itself and the underlying physics that fascinates people. Beyond that, they ask about the implications, and whether it will replace personal computing.

And I say, no, it's not. It's bits, neurons, and qubits coming together. It's not about qubits replacing bits or taking over the world. So, let's not think about one replacing the other.

As for what it is good for, it will have a profound effect on how we discover new materials and how material science is practiced. In agriculture, it could lead to a new generation of fertilizers with a totally different energy consumption to create them, or better batteries. And it has profound implications for encryption and cybersecurity, and for U.S. economic competitiveness. There are also implications for the workforce, and what we need to do to train a new generation of scientists working in this environment.

Q: Are you worried that government policies to protect research in the name of national security could go too far and hamper progress and international collaboration?

A: I think that balance in that equation is indispensable. Science itself is a very open endeavor, and it's very important to maintain an open attitude toward how we do basic science.

Now, as we start adding the words technology and products, it is reasonable to discuss, for each technology, the right balance among capability, national security, openness, and so on. That is not new. Weve been doing that as a country for many decades. And AI and quantum will be no different. Were going to have to find that balance. And we need multiple voices to get that right.

Casting a broad brush to include everything does not make for sophisticated policy. So, I am in favor of being more nuanced, and more precise, in each area, and approaching it with a scalpel, not with a blanket policy.

Q: Does the government need to do more to strengthen science, technology, engineering, and math education and improve diversity? And what have you seen that works?

A: There are two categories that we need to pay more attention to. One is formal education, and the other is once people join the workforce.

If you look at formal education, there have been some trends that have been really disturbing. For example, if you look at the percentage of women studying computer science, we are worse off than we were 35 years ago. So that is really sad, and we have to be able to reverse that trend.

And then once you're done with formal education and you enter the workforce, there's very little continuing education to cope with technological shifts. We need to pay more attention to the mechanisms for investing in those new technological skills. How are we going to do it, and what are the incentives, and what is the role of the private sector and academia?

Go here to read the rest:

What one member of Trump's new science advisory council wants it to tackle - Science Magazine

Riding the Third Wave of AI without Quantum Computing – UC San Diego Health

Rapid changes are occurring in the field of artificial intelligence (AI) as many computer scientists explore new ways to make systems faster and more efficient. One anticipated capability is quantum computing: technology that follows the laws of quantum physics, enabling processing power to exist in multiple states and perform multiple tasks at the same time. If realized in hardware, it would speed up some computational problem-solving exponentially. UC San Diego theoretical physicist Max Di Ventra is catching this next wave of cutting-edge AI with an alternative and fundamentally different platform he calls memcomputing, which doesn't require quantum capabilities.

Sketch of a memcomputing architecture. Apart from the input/output and a control unit, which directs the machine on what problem to solve, all computation is done by a memory unit, a computational memory. From F.L. Traversa and M. Di Ventra, IEEE Trans. Neural Networks Learn. Sys. 26, 2702 (2015). © 2015 IEEE.

"Using a physics-based approach, this novel computing paradigm employs memory to both process and store information on the same physical location, a property that somewhat mimics the computational principles of the human brain," said the UC San Diego physics professor and author of The Scientific Method: Reflections from a Practitioner (Oxford University Press, 2018).

After years of trial and error, Di Ventra and his group developed all of the mathematics required for this new, simple architecture, which combines memory and compute and is driven by a specialized computational memory unit, with performance that resembles quantum computing without the overwhelming computational overhead. Now, with half a million dollars over 18 months from the Defense Advanced Research Projects Agency (DARPA), Di Ventra and his students are working to apply this new physics-based approach to AI.

"Our project, if successful, would have a large impact in the field of machine learning and artificial intelligence by showing that physics approaches can be of great help in fields of research that are traditionally dominated by computer scientists," said Di Ventra.

With the DARPA funds, the team will apply memcomputing to the unsupervised learning, or pre-training, of Deep Belief Networks. These are systems of multi-layer neural networks (NNs) used to recognize, generate and group data. Di Ventra will also propose a hardware architecture, using current technologies, to perform this task. Pre-training of NNs is a notoriously difficult problem, and researchers have all but abandoned it in favor of supervised learning. However, in order to have machines that adapt to external stimuli in real time and make decisions according to the context in which they operate (the goal of the third wave of AI), powerful new methods to train NNs in an unsupervised manner are required.

Demonstration that a memcomputing solver (named Falcon in the figure) outperforms, by orders of magnitude, state-of-the-art algorithms in solving difficult computational problems. From F. Sheldon, P. Cicotti, F.L. Traversa and M. Di Ventra, IEEE Trans. Neural Networks Learn. Sys. (2019). © 2019 IEEE.

Di Ventra explained that memcomputing shortens the time needed to find feasible solutions to the most complex optimization problems across industries.

"We have applied these emulations to a wide variety of difficult computational problems that are of interest to both academia and industry, and solved them orders of magnitude faster than traditional algorithms," noted Di Ventra.

Unlike quantum computing, memcomputing employs non-quantum units, so it can be realized in hardware with available technology and emulated in software on traditional computers. Current computing capabilities began with the work of Alan Turing, who helped decrypt German codes during WWII with his Bombe machine. He also developed the Turing Machine, which became the basis for modern computers. John von Neumann devised the architecture for the Turing Machine, whereby the central processing unit (CPU) was separate from the memory unit. The so-called von Neumann bottleneck in today's computing is created precisely by this physical separation of the CPU and the memory unit: the CPU has to constantly insert and extract information from the memory, significantly slowing processing time.

"Memcomputing represents a radical departure from both our traditional computers, and the algorithms that run on them, and quantum computers," said Di Ventra. "It provides the necessary tools for the realization of an adaptable computational platform deployable in the field of artificial intelligence and offers strategic advantages to the Department of Defense in numerous applications."

In view of the preliminary successes of memcomputing, Di Ventra has co-founded the company MemComputing, Inc., which is developing software as a service, based on this technology, to solve the most challenging problems in academia and industry.

Read more from the original source:

Riding the Third Wave of AI without Quantum Computing - UC San Diego Health

Airbus announces the names of the jury members for its Quantum Computing Challenge – Quantaneo, the Quantum Computing Source

A jury of world-leading quantum computing experts is teaming up with Airbus to evaluate submitted proposals to the Airbus Quantum Computing Challenge. Challenge winners will be announced at the beginning of 2020.

Four decades ago, quantum computing was a little-known, obscure theory relating to classical representations of computational memory. Today, it is a red-hot topic in the tech world, as major digital players such as Intel, Google, IBM and Microsoft invest massive sums to push the technology forward. At the same time, academic centres of excellence have been popping up worldwide, demonstrating how ideas, talent and investment have been flowing from multiple directions.

At Airbus, quantum computing has been identified as a potential game-changing future technology for aerospace. Launched in 2019, the Airbus Quantum Computing Challenge aims to challenge experts and enthusiasts in the field to tackle complex aerospace computational problems.

To help evaluate the submitted proposals, Airbus is bringing together top-notch international quantum computing experts to serve as jury members. The experts reflect a diverse array of academics and industry professionals, from computer scientists to founders of start-ups. Each expert has significant and deep experience in quantum computing, and a level of expertise that is recognised on an international scale. The jury is tasked with assessing the submitted proposals to identify the winners of the first edition of the challenge.

The deadline for submissions is October 31, 2019.

Meet the jury of quantum computing experts

Harry Buhrman, QuSoft / University of Amsterdam (Netherlands)

Harry Buhrman is a computer scientist, and professor of algorithms, complexity theory and quantum computing at the University of Amsterdam. He is group leader of the Quantum Computing Group at the Centre for Mathematics and Informatics (CWI), as well as co-founder and executive director of QuSoft, a research centre dedicated to quantum software.

Wim van Dam, QC Ware & University of California (Palo Alto, USA)

Wim van Dam is a quantum computer scientist with expertise in developing and analysing quantum algorithms that significantly outperform classical algorithms. He is also a computer science and physics professor at the University of California, Santa Barbara. At QC Ware, he oversees the design and development of quantum algorithms for customer applications in optimisation and finance.

Joe Fitzsimons, Horizon Quantum Computing (Singapore)

Joe Fitzsimons is the founder and CEO of Horizon Quantum Computing, a venture-backed start-up focused on automatic synthesis of quantum algorithms. Prior to this role, he was a principal investigator at the Centre for Quantum Technologies at the National University of Singapore.

Elham Kashefi, Sorbonne University & University of Edinburgh (Paris, France / Edinburgh, Scotland)

Elham Kashefi is a research director at Sorbonne University's CNRS LIP6, and a quantum computing professor at the University of Edinburgh's School of Informatics. She is also the associate director of the EPSRC Networked Quantum Information Technologies Hub and co-founder of the quantum tech start-up VeriQLoud.

Iordanis Kerenidis, QC Ware / CNRS / Paris Centre for Quantum Computing (Palo Alto, USA / Paris, France)

Iordanis Kerenidis is a research director at CNRS and the director of the Paris Centre for Quantum Computing. He focuses on designing quantum algorithms for machine learning and optimisation with provable speed-ups. At QC Ware, he oversees prototype development and algorithmic design for customers.

Michele Mosca, University of Waterloo (Canada)

Michele Mosca is the co-founder of the University of Waterloo's Institute for Quantum Computing and a founding member of the Perimeter Institute for Theoretical Physics. He co-founded evolutionQ Inc., a start-up that supports organisations as they evolve their quantum-vulnerable systems to quantum-safe ones.

Troy Lee, University of Technology Sydney (Australia)

Troy Lee is an associate professor at the University of Technology Sydney's Centre for Quantum Software and Information. His research focuses on quantum algorithms, the limitations of quantum computers and complexity theory.

Jingbo Wang, University of Western Australia

Jingbo Wang leads an active research group at the University of Western Australia (UWA) in the area of quantum simulation, quantum walks and quantum algorithm development. At UWA, she is also the head of the Physics Department and chair of a cross-disciplinary research cluster named "Quantum information, simulation and algorithms."

See the article here:

Airbus announces the names of the jury members for its Quantum Computing Challenge - Quantaneo, the Quantum Computing Source

IT sees the Emergence of Quantum Computing as a Looming Threat to Keeping Valuable Information Confidential – Quantaneo, the Quantum Computing Source

A new study from DigiCert, Inc., the world's leading provider of TLS/SSL, IoT and PKI solutions, reveals that 71 percent of global organizations see the emergence of quantum computers as a large threat to security. Most anticipate tangible quantum computer threats will begin arriving within three years. The survey was conducted by ReRez Research in August 2019, within 400 enterprise organizations in the U.S., Germany and Japan from across critical infrastructure industries.

Quantum Computing Threat is Real and Quickly Approaching

Quantum computing is on the minds of many and is impacting their current and future thinking. Slightly more than half (55 percent) of respondents say quantum computing is a "somewhat" to "extremely" large security threat today, with 71 percent saying it will be a "somewhat" to "extremely" large threat in the future. The median prediction for when PQC would be required to combat the security threat posed by quantum computers was 2022, which means the time needed to prepare for quantum threats is nearer than some analysts have predicted.

Top Challenges

With the threat so clearly felt, 83 percent of respondents say it is important for IT to learn about quantum-safe security practices. Following are the top three worries reported for implementing PQC:

- High costs to battle and mitigate quantum threats
- Data stolen today is safe if encrypted, but quantum attacks will make this data vulnerable in the future
- Encryption on devices and applications embedded in products will be susceptible

95 percent of respondents reported they are discussing at least one tactic to prepare for quantum computing, but two in five see this as a difficult challenge. The top challenges reported include:

- Cost
- Lack of staff knowledge
- Worries that TLS vendors won't have upgraded certificates in time

"It is encouraging to see that so many companies understand the risk and challenges that quantum computing poses to enterprise encryption," said Tim Hollebeek, Industry and Standards Technical Strategist at DigiCert. "With the excitement and potential of quantum technologies to impact our world, it's clear that security professionals are at least somewhat aware of the threats that quantum computers pose to encryption and security in the future. With so many engaged, but lacking good information about what to do and how to prepare, now is the time for companies to invest in strategies and solutions that will help them get ahead of the game and not get caught with their data exposed when the threats emerge."

Preparing for PQC

Enterprises are beginning to prepare for quantum computing, with a third reporting they have a PQC budget and another 56 percent working on establishing a PQC budget. In terms of specific activities, not surprisingly, "monitoring" was the top tactic currently employed by IT. Understanding their organization's level of crypto-agility came next. This reflects the understanding that when the time comes to make a switch to PQC certificates, enterprises need to be ready to make the switch quickly and efficiently.

Rounding out the top five current IT tactics were understanding the organization's current level of risk, building knowledge about PQC and developing TLS best practices.
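Crypto-agility starts with knowing what you have deployed. As a purely illustrative sketch (not from the DigiCert report), the following Python script pulls a server's TLS certificate and records its public-key algorithm, key size and expiry, the kind of inventory that makes a later swap to PQC certificates faster. It assumes the third-party cryptography package is installed, and the hostname is just an example.

```python
# Minimal crypto-agility inventory sketch: fetch a host's TLS certificate
# and record which (quantum-vulnerable) public-key algorithm it uses.
import socket
import ssl

from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import ec, rsa

def cert_inventory(host: str, port: int = 443) -> dict:
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            der = tls.getpeercert(binary_form=True)  # raw DER-encoded cert
    cert = x509.load_der_x509_certificate(der)
    key = cert.public_key()
    if isinstance(key, rsa.RSAPublicKey):
        algo = f"RSA-{key.key_size}"      # breakable by a large quantum computer
    elif isinstance(key, ec.EllipticCurvePublicKey):
        algo = f"ECDSA-{key.curve.name}"  # likewise quantum-vulnerable
    else:
        algo = type(key).__name__
    return {"host": host, "key_algorithm": algo,
            "expires": cert.not_valid_after.isoformat()}

print(cert_inventory("www.digicert.com"))  # hostname is illustrative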

Recommendations

The DigiCert 2019 Post Quantum Crypto Survey points to three best practices for companies ready to start planning their strategies for securing their organizations for the quantum future:

- Know your risk and establish a quantum crypto maturity model.
- Understand the importance of crypto-agility in your organization and establish it as a core practice.
- Work with leading vendors to establish digital certificate best practices and ensure they are tracking PQC industry progress to help you stay ahead of the curve, including with their products and solutions.

Change rarely happens quickly, so it's better not to wait, but to address your crypto-agility now. For more information and to get the full report: https://www.digicert.com/resources/industry-report/2019-Post-Quantum-Crypto-Survey.pdf

The rest is here:

IT sees the Emergence of Quantum Computing as a Looming Threat to Keeping Valuable Information Confidential - Quantaneo, the Quantum Computing Source

More wrong answers get quantum computers to find the right one – Futurity: Research News

In quantum computers, generating more errors in a given operation may help reveal the right answer, according to new research.

Unlike conventional computers, the processing in quantum-based machines is noisy, producing error rates dramatically higher than those of silicon-based computers. So quantum operations are repeated thousands of times to make the correct answer stand out statistically from all the wrong ones.

But running the same operation over and over again on the same qubit set may just generate the same incorrect answers that can appear statistically to be the correct answer. The solution, researchers report, is to repeat the operation on different qubit sets that have different error signatures, and therefore won't produce the same correlated errors.

"The idea here is to generate a diversity of errors so you are not seeing the same error again and again," says Moinuddin Qureshi, a professor in the School of Electrical and Computer Engineering at Georgia Institute of Technology, who worked out the technique with his senior Ph.D. student, Swamit Tannu.

"Different qubits tend to have different error signatures. When you combine the results from diverse sets, the right answer appears even though each of them individually did not get the right answer," says Tannu.

Tannu compares the technique, known as Ensemble of Diverse Mappings (EDM), to the game show Who Wants to Be a Millionaire. Contestants who aren't sure of the answer to a multiple-choice question can ask the studio audience for help.

"It's not necessary that the majority of the people in the audience know the right answer," Qureshi says. "If even 20% know it, you can identify it. If the answers from the people who don't know go equally into the four buckets, the right answer will get 40%, and you can select it even if only a relatively small number of people get it right." (The 20% who know all pick the right answer, and it also collects a quarter of the remaining 80% of votes: 20% + 20% = 40%, versus 20% for each wrong answer.)

Experiments with an existing Noisy Intermediate Scale Quantum (NISQ) computer showed that EDM improves the inference quality by 2.3 times compared to state-of-the-art mapping algorithms. By combining the output probability distributions of the diverse ensemble, EDM amplifies the correct answer by suppressing the incorrect ones.
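That combining step lends itself to a short illustration. Here is a minimal sketch of the idea in Python (illustrative only, not the researchers' code, and the equal weighting of mappings is an assumption): each qubit mapping contributes a probability distribution over measured bitstrings, the distributions are averaged, and the answer is the bitstring with the highest merged probability.

```python
# Sketch of the core EDM idea: run the same circuit on several qubit
# mappings with different error signatures, average their output
# distributions, and let the shared correct answer stand out.
from collections import Counter

def merged_distribution(runs: list[list[str]]) -> dict[str, float]:
    """Each run is a list of measured bitstrings from one qubit mapping."""
    merged: Counter = Counter()
    for shots in runs:
        counts = Counter(shots)
        for bitstring, count in counts.items():
            # Per-mapping probability, weighted equally across mappings.
            merged[bitstring] += count / len(shots) / len(runs)
    return dict(merged)

# Three mappings, six shots each: every mapping is wrong most of the time,
# but their errors differ, so '101' wins once the distributions are merged.
runs = [
    ["101", "101", "111", "111", "111", "001"],  # mapping A: biased toward '111'
    ["101", "101", "000", "000", "000", "011"],  # mapping B: biased toward '000'
    ["101", "101", "110", "110", "110", "100"],  # mapping C: biased toward '110'
]
dist = merged_distribution(runs)
print(max(dist, key=dist.get))  # -> '101'
```

No single mapping votes for '101' more than a third of the time, yet after merging it is the clear winner, because each mapping's dominant error is suppressed by the other two.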

The EDM technique, Tannu admits, is counterintuitive. Qubits can be ranked according to their error rate on specific types of problems, and the most logical course of action might be to use the set that's most accurate. But even the best qubits produce errors, and those errors are likely to be the same when the operation is done thousands of times.

Choosing qubits with different error rates, and therefore different types of error, guards against that by ensuring that the one correct answer will rise above the diversity of errors.

"The goal of the research is to create several different versions of the program, each of which can make a mistake, but they will not make identical mistakes," Tannu explains. "As long as they make diverse mistakes, when you average things out, the mistakes get canceled out and the right answer emerges."

Qureshi compares the EDM technique to team-building techniques promoted by human resource consultants.

"If you form a team of experts with identical backgrounds, all of them may have the same blind spot," he says, adding a human dimension. "If you want to make a team resilient to blind spots, collect a group of people who have different blind spots. As a whole, the team will be guarded against specific blind spots."

Error rates in conventional silicon-based computers are practically negligible, about one error in a thousand trillion operations, but today's NISQ quantum computers produce roughly one error every 100 operations.

"These are really early-stage machines in which the devices have a lot of error," Qureshi says. "That will likely improve over time, but because we are dependent on matter that has extremely low energy and lacks stability, we will never get the reliability we have come to expect with silicon. Quantum states are inherently about a single particle, but with silicon you are packing a lot of molecules together and averaging their activity."

"If the hardware is inherently unreliable, we have to write software to make the most of it," he says. "We have to take the hardware characteristics into account to make these unique machines useful."

The notion of running a quantum operation thousands of times to get what's likely to be the right answer at first seems counterproductive. But quantum computing is so much faster than conventional computing that nobody would object to doing a few thousand duplicate runs.

"The objective with quantum computers is not to take a current program and run it faster," Qureshi says. "Using quantum, we can solve problems that are virtually impossible to solve with even the fastest supercomputers. With several hundred qubits, which is beyond the current state of the art, we could solve problems that would take a thousand years with the fastest supercomputer."

"You don't mind doing the computation a few thousand times to get an answer like that," Qureshi adds.

The researchers will present their work at the 52nd Annual IEEE/ACM International Symposium on Microarchitecture. Microsoft supported the research.

Source: Georgia Tech

See the original post:

More wrong answers get quantum computers to find the right one - Futurity: Research News

What is quantum computing? The next era of computational evolution, explained – Digital Trends

When you first stumble across the term quantum computer, you might pass it off as some far-flung science fiction concept rather than a serious current news item.

But with the phrase being thrown around with increasing frequency, it's understandable to wonder exactly what quantum computers are, and just as understandable to be at a loss as to where to dive in. Here's the rundown on what quantum computers are, why there's so much buzz around them, and what they might mean for you.

All computing relies on bits, the smallest unit of information that is encoded as an on state or an off state, more commonly referred to as a 1 or a 0, in some physical medium or another.

Most of the time, a bit takes the physical form of an electrical signal traveling over the circuits in the computer's motherboard. By stringing multiple bits together, we can represent more complex and useful things like text, music, and more.
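For instance, a couple of lines of Python show the eight bits that encode the letter A:

```python
# The eight bits 01000001 are how text encodes the letter 'A' (ASCII).
bits = format(ord("A"), "08b")  # -> '01000001'
letter = chr(int(bits, 2))      # and back to 'A'
print(bits, letter)
```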

The two key differences between quantum bits and classical bits (from the computers we use today) are the physical form the bits take and, correspondingly, the nature of data encoded in them. The electrical bits of a classical computer can only exist in one state at a time, either 1 or 0.

Quantum bits (or qubits) are made of subatomic particles, namely individual photons or electrons. Because these subatomic particles conform more to the rules of quantum mechanics than classical mechanics, they exhibit the bizarre properties of quantum particles. The most salient of these properties for computer scientists is superposition. This is the idea that a particle can exist in multiple states simultaneously, at least until that state is measured and collapses into a single state. By harnessing this superposition property, computer scientists can make qubits encode a 1 and a 0 at the same time.

The other quantum mechanical quirk that makes quantum computers tick is entanglement, a linking of two quantum particles or, in this case, two qubits. When the two particles are entangled, the change in state of one particle will alter the state of its partner in a predictable way, which comes in handy when it comes time to get a quantum computer to calculate the answer to the problem you feed it.
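Both properties can be simulated on an ordinary computer, which is a handy way to make them concrete. The numpy sketch below illustrates the math, not real hardware: amplitudes are complex numbers, measurement probabilities are their squared magnitudes, a Hadamard gate puts one qubit into an equal superposition, and a CNOT gate entangles it with a second qubit into a Bell state, where measuring one qubit fixes the other.

```python
# Classical simulation of superposition and entanglement (a sketch of
# the underlying linear algebra, not of how real qubits are built).
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard: creates superposition
CNOT = np.array([[1, 0, 0, 0],                  # flips qubit 2 when qubit 1 is 1
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

zero = np.array([1, 0])            # a qubit firmly in state |0>
plus = H @ zero                    # superposition: (|0> + |1>) / sqrt(2)
print(np.abs(plus) ** 2)           # [0.5 0.5] -- equal odds of 0 or 1

# Entangle two qubits into the Bell state (|00> + |11>) / sqrt(2):
# the only possible outcomes are 00 and 11, so the qubits always agree.
pair = CNOT @ np.kron(plus, zero)
print(np.abs(pair) ** 2)           # [0.5 0.  0.  0.5]
```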

A quantum computer's qubits start in their 1-and-0 hybrid state as the computer initially starts crunching through a problem. When the solution is found, the qubits in superposition collapse to the correct orientation of stable 1s and 0s for returning the solution.

Aside from the fact that they are far beyond the reach of all but the most elite research teams (and will likely stay that way for a while), most of us don't have much use for quantum computers. They don't offer any real advantage over classical computers for the kinds of tasks we do most of the time.

However, even the most formidable classical supercomputers have a hard time cracking certain problems due to their inherent computational complexity. This is because some calculations can only be achieved by brute force, guessing until the answer is found. They end up with so many possible solutions that it would take thousands of years for all the world's supercomputers combined to find the correct one.

The superposition property exhibited by qubits can allow quantum computers to cut this guessing time down precipitously. Classical computing's laborious trial-and-error computations can only ever make one guess at a time, while the dual 1-and-0 state of a quantum computer's qubits lets it make multiple guesses at the same time.

So, what kind of problems require all this time-consuming guesswork calculation? One example is simulating atomic structures, especially when they interact chemically with those of other atoms. With a quantum computer powering the atomic modeling, researchers in material science could create new compounds for use in engineering and manufacturing. Quantum computers are well suited to simulating similarly intricate systems like economic market forces, astrophysical dynamics, or genetic mutation patterns in organisms, to name only a few.

Amidst all these generally inoffensive applications of this emerging technology, though, there are also some uses of quantum computers that raise serious concerns. By far the most frequently cited harm is the potential for quantum computers to break some of the strongest encryption algorithms currently in use.

In the hands of an aggressive foreign government adversary, quantum computers could compromise a broad swath of otherwise secure internet traffic, leaving sensitive communications susceptible to widespread surveillance. Work is currently being undertaken to mature encryption ciphers based on calculations that are still hard for even quantum computers to do, but they are not all ready for prime-time, or widely adopted at present.

A little over a decade ago, actual fabrication of quantum computers was barely in its incipient stages. Starting in the 2010s, though, development of functioning prototype quantum computers took off. A number of companies have assembled working quantum computers as of a few years ago, with IBM going so far as to allow researchers and hobbyists to run their own programs on its machines via the cloud.

Despite the strides that companies like IBM have undoubtedly made to build functioning prototypes, quantum computers are still in their infancy. Currently, the quantum computers that research teams have constructed require a lot of overhead for executing error correction. For every qubit that actually performs a calculation, there are several dozen whose job is to compensate for its mistakes. The aggregate of all these qubits makes what is called a logical qubit.

Long story short, industry and academic titans have gotten quantum computers to work, but they do so very inefficiently.

Fierce competition between quantum computer researchers is still raging, between big and small players alike. Among those who have working quantum computers are the traditionally dominant tech companies one would expect: IBM, Intel, Microsoft, and Google.

As exacting and costly a venture as creating a quantum computer is, there are a surprising number of smaller companies and even startups that are rising to the challenge.

The comparatively lean D-Wave Systems has spurred many advances in the field, and proved it was not out of contention by answering Google's momentous announcement with news of a huge deal with Los Alamos National Labs. Still, smaller competitors like Rigetti Computing are also in the running to establish themselves as quantum computing innovators.

Depending on who you ask, you'll get a different frontrunner for the most powerful quantum computer. Google certainly made its case recently with its achievement of quantum supremacy, a metric that Google itself more or less devised. Quantum supremacy is the point at which a quantum computer is first able to outperform a classical computer at some computation. Google's Sycamore prototype, equipped with 54 qubits, was able to break that barrier by zipping through a problem in just under three-and-a-half minutes that would take the mightiest classical supercomputer 10,000 years to churn through.

Not to be outdone, D-Wave boasts that the devices it will soon be supplying to Los Alamos weigh in at 5,000 qubits apiece, although it should be noted that the quality of D-Wave's qubits has been called into question before. IBM hasn't made the same kind of splash as Google and D-Wave in the last couple of years, but it shouldn't be counted out yet, either, especially considering its track record of slow and steady accomplishments.

Put simply, the race for the world's most powerful quantum computer is as wide open as it ever was.

The short answer to this is not really, at least for the near-term future. Quantum computers require an immense volume of equipment and finely tuned environments to operate. The leading architecture requires cooling to mere degrees above absolute zero, meaning they are nowhere near practical for ordinary consumers to ever own.

But as the explosion of cloud computing has proven, you don't need to own a specialized computer to harness its capabilities. As mentioned above, IBM is already offering daring technophiles the chance to run programs on a small subset of its Q System One's qubits. In time, IBM and its competitors will likely sell compute time on more robust quantum computers for those interested in applying them to otherwise inscrutable problems.

But if you aren't researching the kinds of exceptionally tricky problems that quantum computers aim to solve, you probably won't interact with them much. In fact, quantum computers are in some cases worse at the sort of tasks we use computers for every day, purely because quantum computers are so hyper-specialized. Unless you are an academic running the kind of modeling where quantum computing thrives, you'll likely never get your hands on one, and never need to.

See the article here:

What is quantum computing? The next era of computational evolution, explained - Digital Trends

Quantum Computing beginning talks with clients on its quantum asset allocation application – Proactive Investors USA & Canada

Quantum Computing Inc Vice President of Product Development Steve Reinhardt tells Proactive the Virginia-based company is beginning conversations with early users of its quantum asset allocation application, and is further refining the tool.

Reinhardt recently attended the Qubits North America Users Conference in Newport, Rhode Island to discuss innovations in quantum computing.

See the original post:

Quantum Computing beginning talks with clients on its quantum asset allocation application - Proactive Investors USA & Canada

Detecting Environmental ‘Noise’ That Can Damage The Quantum State of Qubits – In Compliance

Image: Sampson Wilcox

Scientists from MIT and Dartmouth College have created a new type of tool that can detect specific characteristics of environmental noise known for its ability to destroy qubits. This invention could help provide researchers with greater insight into microscopic noise mechanisms and how they impact the quantum state of qubits, which are a fundamental building block of quantum computers. By learning new ways to protect qubits, scientists hope to make further advances into the realm of quantum computing.

Qubits generally represent the two states that correspond to classic binary bits, a 0 or a 1. However, qubits can also maintain both states at the same time, which is called a quantum superposition. This state is required for quantum computers to operate at such fast speeds, performing complicated tasks quickly and easily. As such, it is important that qubits be protected so this state can be achieved and maintained.

Unfortunately, keeping qubits in this state is easier said than done. Their ability to maintain this specific state, referred to as quantum coherence, can easily be disrupted by specific noises. These noises can be the byproduct of heat, control electronics, and even impurities found in the very material the qubits come from. Wherever the noise comes from, it creates a very high risk of causing serious computing errors for researchers.

While scientists have constructed devices that measure these unwanted noises, they've generally only been able to capture very basic noise from a big number of sources (think of it as white noise). Because they are unable to identify specific disruptive patterns, researchers have been unable to determine how the qubit is impacted by any specific noise source or protect it from the damage these noise sources can cause.

Now, researchers have created a device capable of separating specific noises from the general background noise. After that, they relied on signal-processing techniques to reconstruct incredibly detailed facts and figures pertaining to those specific noise signals. These new reconstructions will provide researchers with the ability to craft far more realistic noise models, which would hopefully allow them to develop new ways to protect qubits.

As qubits are developed with fewer and fewer defects, the presence of specific noises instead of the background noises scientists have learned to contend with could increase. That makes this technology even more important to the ongoing development of quantum computing.

More here:

Detecting Environmental 'Noise' That Can Damage The Quantum State of Qubits - In Compliance

Princeton announces initiative to propel innovations in quantum science and technology – Princeton University

Princeton University has announced the creation of the Princeton Quantum Initiative to foster research and training across the spectrum from fundamental quantum science to its application in areas such as computing, sensing and communications.

The new initiative builds on Princeton's world-renowned expertise in quantum science, the area of physics that describes behaviors at the scale of atoms and electrons. Quantum technologies have the potential to revolutionize areas ranging from secure data transmission to biomedical research, to the discovery of new materials.

Princeton has announced the creation of the Princeton Quantum Initiative, designed to foster research and train scientists and engineers in quantum science and its application in areas such as computing, sensing and communications. Clockwise from left: Research images from Princeton faculty members Julia Mikhailova, assistant professor of mechanical and aerospace engineering; Nathalie de Leon, assistant professor of electrical engineering; Andrew Houck, professor of engineering; Jason Petta, the Eugene Higgins Professor of Physics; Ali Yazdani, the Class of 1909 Professor of Physics; M. Zahid Hasan, the Eugene Higgins Professor of Physics; Jeffrey Thompson, assistant professor of electrical engineering; and Robert Cava, the Russell Wellman Moore Professor of Chemistry.

Images courtesy of the researchers

The inaugural director will be Andrew Houck, professor of electrical engineering and a pioneer in quantum computing technologies. The initiative will bring together over 30 faculty members from departments across campus in the sciences and engineering.

"This initiative enables the work of our extraordinary quantum faculty and their teams to grow research capabilities and attract talented minds at all levels to Princeton, so that they can discover new materials, design new algorithms, and explore the depths of the underlying science in an exciting environment of discovery and innovation," said Dean for Research Pablo Debenedetti, the Class of 1950 Professor in Engineering and Applied Science and professor of chemical and biological engineering.

"The potential benefits to society from quantum information science make this an essential endeavor for Princeton. The initiative will provide tremendous opportunities for Princeton students and postdoctoral researchers to make profound contributions to future technologies," said Deborah Prentice, University provost and the Alexander Stewart 1886 Professor of Psychology and Public Affairs.

The initiative comes at a time of national momentum for quantum sciences at the University, government and industry level. In 2018, the federal government established the National Quantum Initiative to energize research and training in quantum information science and technology. New technologies over the past decade have enabled companies including Google, IBM and others to build research-stage quantum computers.

The Princeton Quantum Initiative will enable new collaborations both across campus and with other universities and industry. Within the University, the initiative will include faculty in the departments of electrical engineering, physics, chemistry, computer science and mechanical and aerospace engineering.

"Princeton has world leaders at all layers of this technology, including foundational science, materials synthesis and characterization, quantum device platforms, computer architecture, algorithm design and computational complexity," said Houck. "We have an incredible collection of experts in their respective disciplines, and the Princeton Quantum Initiative gives us an entity which brings everyone together to accelerate the pace of discovery."

The Princeton Quantum Initiative will support research across a range of areas, including quantum computing using silicon spin qubits in the laboratory of Jason Petta, the Eugene Higgins Professor of Physics.

Image by Emily Edwards, University of Maryland

To support the future of quantum research, the initiative will train a new generation of quantum scientists and engineers through financial support for graduate students and postdoctoral researchers. Annually, Princeton will award two prestigious graduate student fellowships, each providing support for three years, as well as two postdoctoral fellowships for three-year terms, with fellows able to choose projects and faculty mentors.

For undergraduates, the initiative will build on Princeton's leadership in the development of courses whose target audience includes those with no prior quantum physics background. The initiative will help coordinate teaching efforts across departments, offer more cohesive and wide-ranging instruction in quantum science and engineering, and provide undergraduates with opportunities to work on faculty-led projects.

The research supported through the initiative will span areas from new materials science for quantum devices to quantum computer architecture, algorithm design and computational complexity.

Quantum science promises to deliver dramatic enhancements in information processing and communications. Computers built on quantum principles can solve problems that are impossible with today's machines, potentially leading to discoveries in fields such as chemistry, materials science, optimization and information security.

Sensors based on quantum approaches can probe materials and biological systems at the nanoscale with unprecedented precision and resolution. Such sensors could detect medical conditions or be used for quality control in manufacturing of sensitive electronic equipment.

Quantum communication systems can provide provably secure communication that cannot be hacked without detection. Quantum encryption could someday replace today's internet security algorithms to ensure privacy of data transmissions.

Princeton has a long history of contributing foundational discoveries in quantum science. Over the decades, Princeton researchers have made major contributions to quantum theory and trained graduate students that have become leading quantum scientists and technologists. More on the research expertise of Princeton's quantum scientists and engineers is available online.

See more here:

Princeton announces initiative to propel innovations in quantum science and technology - Princeton University

Moore’s Law Is Dying. This Brain-Inspired Analogue Chip Is a Glimpse of What’s Next – Singularity Hub

Dark silicon sounds like a magical artifact out of a fantasy novel. In reality, it's one branch of a three-headed beast that foretells the end of advances in computation.

OK, that might be too dramatic. But the looming problems in silicon-based computer chips are very real. Although computational power has exploded exponentially in the past five decades, we've begun hitting some intractable limits to further growth, both in terms of physics and economics.

Moore's Law is dying. And chipmakers around the globe are asking: now what?

One idea is to bet on quantum computers, which tap into the ultra-weird world of quantum mechanics. Rather than operating on binaries of 0s and 1s, qubits can simultaneously represent both states, with each having a different probability and thus much higher information density.

Another idea is to look inside our heads: the quantum realm isn't the only way to get past binary computation. Our brains also operate on probabilities, making them a tangible source of inspiration to overhaul the entire computational world.

This week, a team from Pennsylvania State University designed a 2D device that operates like a neuron. Rather than processing a hard yes or no, the Gaussian synapse thrives on probabilities. Similar to the brain, the analogue chip is far more energy-efficient and produces less heat than current silicon chips, making it an ideal candidate for scaling up systems.

In a proof-of-concept test, the team used a simulated chip to analyze EEG (electroencephalography) signals taken from either wakeful or sleeping people. Without extensive training, the chip was able to determine if the subject was sleeping.

"Combined, these new developments can facilitate exascale computing and ultimately benefit scientific discovery, national security, energy security, economic security, infrastructure development, and advanced healthcare programs," the team concluded.

With new iPhones every year and increasingly sophisticated processors, it certainly doesn't feel like we're pushing the limits of silicon-based computing. But according to lead study author Dr. Saptarshi Das, the ability to further scale traditional computation is dying in three different aspects: energy, size, and complexity.

"Energy scaling helps ensure a practically constant computational power budget," explained Das. But it came to an end around 2005 because of hard limits in silicon chips' thermodynamic properties, something scientists dub the Boltzmann tyranny (gotta love these names!). Size scaling, which packs more transistors onto the same chip area, soon followed suit, ending in 2017 because quantum mechanics imposes limitations at the materials level of traditional chips.

The third, complexity scaling, is still hanging on, but it's on the decline. Fundamentally, explained the team, this is because of the traditional von Neumann architecture most modern computers use, which relies on digital, binary computation. In addition, current computers store logic and memory units separately and have to operate sequentially, which increases delay and energy consumption. As more transistors are jam-packed onto the same chip and multiple cores are linked together into processors, eventually the energy needs and cooling requirements will hit a wall.

This is the Dark Silicon era. Because too much heat is given out, a large number of transistors on a single chip can't be powered up at once without causing heat damage. This limitation requires a portion of computing components on a chip to be kept powered off (kept dark) at any instant, which severely limits computational power. Tinkering with variables such as how to link up transistors may optimize efficacy, but ultimately it's a band-aid, not a cure.

"In contrast, the brain deploys billions of information processing units, neurons, which are connected via trillions of synapses in order to accomplish massively parallel, synchronous, coherent, and concurrent computation," the team said. "That's our roadmap ahead."

Although there are plenty of neuromorphic chips (devices that mimic the structure or functionality of neurons and synapses), the team took a slightly different approach. They focused on recreating a type of artificial neural network called a probabilistic neural network (PNN) in hardware form.

PNNs have been around since the 60s as software, and they're often used for classification problems. The mathematical heart of PNNs differs from most of the deep learning models used today, but the structure is relatively similar. A PNN generally has four layers, and raw data travels from the first layer to the last. The two middle layers, pattern and summation, process the data in a way that allows the last layer to make a vote: it selects the best answer from a group of potential probable ones.
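A minimal software PNN makes that voting concrete. The sketch below is illustrative (it is not the Penn State team's code, and the toy data and sigma value are invented): the pattern layer scores a new input against every stored training example with a Gaussian kernel, the summation layer pools those scores per class, and the output layer votes for the highest pooled score.

```python
# Minimal probabilistic neural network (PNN), sketched in numpy.
import numpy as np

def pnn_classify(x, train_X, train_y, sigma=0.5):
    # Pattern layer: Gaussian activation of each stored training example.
    dists = np.sum((train_X - x) ** 2, axis=1)
    activations = np.exp(-dists / (2 * sigma ** 2))
    # Summation layer: pool (average) the activations for each class.
    classes = np.unique(train_y)
    scores = [activations[train_y == c].mean() for c in classes]
    # Output layer: vote for the class with the highest pooled probability.
    return classes[int(np.argmax(scores))]

# Toy data: two clusters standing in for "awake" vs. "asleep" EEG features.
train_X = np.array([[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]])
train_y = np.array(["awake", "awake", "asleep", "asleep"])
print(pnn_classify(np.array([0.85, 0.95]), train_X, train_y))  # -> 'asleep'
```

Because the pattern layer simply memorizes the training examples, a PNN needs no lengthy gradient-descent training, which is part of what makes it attractive for a hardware implementation.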

To implement PNNs directly in hardware form, the team engineered a Gaussian synapse made of two different materials: MoS2 and black phosphorus. Each represents a transistor, and the two are linked in series in a single synapse. The way the two transistors talk to each other isn't linear. When the MoS2 component switches on, the electrical current rises exponentially until it reaches a maximum level, then it drops. The connection strength follows a bell-shaped curve, or in mathematical lingo, a Gaussian distribution, widely used in probability (and where the device gets its name).
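Written as a formula, that bell-shaped response is just a Gaussian of the input voltage. The snippet below spells it out; the parameter values are illustrative, not measured device figures.

```python
# Bell-shaped transfer curve of a Gaussian synapse: drain current peaks
# when the input voltage equals mu and falls off on either side.
import numpy as np

def gaussian_synapse_current(v_in, i_max=1e-6, mu=0.0, sigma=1.0):
    """Drain current (amps) as a Gaussian function of input voltage."""
    return i_max * np.exp(-((v_in - mu) ** 2) / (2 * sigma ** 2))

v = np.linspace(-3, 3, 7)
print(gaussian_synapse_current(v))  # rises toward i_max at v = mu, then drops
```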

How each component turns on or off can be tweaked, which in turn controls communication between the transistors. "This, in turn, mimics the inner workings of PNNs," said study author Amritanand Sebastian.

As a proof of concept, the team decided to give back to neuroscience. The brain generates electrical waves that can be picked up by electrodes on the scalp. Brain waves are terribly complicated data to process, the team said, and artificial neural networks running on traditional computers generally have a hard time sorting through them.

The team fed their Gaussian synapse EEG recordings of 10 whole nights from 10 subjects, with 32 channels for each individual. The PNN rapidly recognized different brainwave components, and was especially good at picking out the frequencies commonly seen in sleep.

"We don't need as extensive a training period or base of information for a probabilistic neural network as we need for an artificial neural network," said Das.

Thanks to quirks in the transistors' materials, the chip had some enviable properties. For one, it was exceedingly low-power. To analyze 8 hours of EEG data, it consumed only about 350 microwatts; to put this into perspective, the human brain generally runs on about 20 watts. "This means that the Gaussian synapse facilitates energy scaling," explained Sebastian.

For another, the materials allow size scaling without losing their inherent electrical properties. Finally, the use of PNNs also solves the complexity scaling problem, because it can process non-linear decisions using fewer components than traditional artificial neural networks.

It doesn't mean that we've slayed the three-headed beast, at least not yet. But looking ahead, the team believes their results can inspire more ultra-low-power devices to tackle the future of computation.

"Our experimental demonstration of Gaussian synapses uses only two transistors, which significantly improves the area and energy efficiency at the device level and provides cascading benefits at the circuit, architecture, and system levels. This will stimulate the much-needed interest in the hardware implementation of PNNs for a wide range of pattern classification problems," the authors concluded.

Image Credit: Photo by Umberto on Unsplash

The rest is here:

Moore's Law Is Dying. This Brain-Inspired Analogue Chip Is a Glimpse of What's Next - Singularity Hub