Google’s top quantum computing brain may or may not have quit – Fudzilla

We will know when someone opens his office door

John Martinis, who had established Google's quantum hardware group in 2014, has cleaned out his office, put the cats out and left the building.

Martinis says that a few months after he got Google's now legendary quantum computing experiment to go, he was reassigned from a leadership position to an advisory one.

Martinis told Wired that the change led to disagreements with Hartmut Neven, the longtime leader of Google's quantum project.

Martinis said he had to go because his professional goal is for someone to build a quantum computer.

Google has not disputed this account, and says that the company is grateful for Martinis' contributions and that Neven continues to head the company's quantum project.

Martinis retains his position as a professor at UC Santa Barbara, which he held throughout his tenure at Google, and says he will continue to work on quantum computing.

To be fair, Google's quantum computing project was founded by Neven, who pioneered Google's image search technology, and got enough cats together.

The project took on greater scale and ambition when Martinis joined in 2014 to establish Google's quantum hardware lab in Santa Barbara, bringing along several members of his university research group. His nearby lab at UC Santa Barbara had produced some of the most prominent work in the field over the past 20 years, helping to demonstrate the potential of using superconducting circuits to build qubits, the building blocks of quantum computers.

Google's ground-breaking supremacy experiment used 53 qubits working together. They took minutes to crunch through a carefully chosen math problem the company calculated would take a supercomputer 10,000 years to work out. It still does not have a practical use, and the cats were said to be bored with the whole thing.

Read more from the original source:

Google's top quantum computing brain may or may not have quit - Fudzilla

Quantum computing and blockchain, is this our bold future? – Irish Tech News

By Theodora Lau and Bradley Leimer, with some interesting musings on Quantum computing and blockchain

Everything that happens is connected to everything else.

There are, then, moments in time that act as trigger points for a series of interconnected events that result in significant human progress, whether through a new technology or a period of transformative societal change. This rejects both the conventional linear and teleological views of history, those focusing on the procession toward the result rather than the threaded causation of historical progression, and looks instead for sparks of connected ingenuity that further develop the thrust of human advancement.

And so begins the heralded documentary series Connections created by science historian James Burke. Throughout the series, Burke demonstrates why we cannot view the development of any portion of our contemporary world in isolation. He asserts that advances in the modern world are the result of a series of interconnected events and moments of progress, whether that be an invention of necessity or a curious progression of culture from the seemingly disjointed motivations of humans, all of whom had no concept or perhaps little intention of the final result of their activities.

Human progress flies blind until everything becomes very transparent. The interaction of these isolated events drives our history, our innovation, our progress.

Evolution feels slow until a sudden series of tremors makes it all feel far too real.

This is how we often feel in our very modern world.

We are lost in the world of the dim light of glass, until we are awoken from our slumber of scrolling by something personally transformative to our lives.

The promise of technology is that it will improve our society, or at least make our lives more efficient, freeing up our time to pursue some of life's pleasures, whether that be leisure like reading and writing and expressing ourselves through art, or more time working to solve life's more pressing problems through the output of our work.

Certain technologies, especially recent improvements in computing, from faster processors and cloud storage to advanced quantum computing, combine with others to create opportunities to alleviate significant challenges like climate change, water scarcity, and global poverty. Others, like blockchain (distributed ledger technology), hold the promise of reining in the issue of defining the source of truth within certain forms of data, some of which are life defining.

The creation of trust through technology is an interesting thread to pull. From the source of goods and services traveling through our supply chain to the authenticity of our elections, new technologies hold the potential to rapidly improve the future and the advancement of humanity. Closer to our focus on financial services, quantum computing addresses market risk, credit risk, digital annealing, dynamic portfolio selection, ATM replenishment and more. Blockchain technology has focused on AML/KYC, trade finance, remittance, central bank backed digital currency, security tokens, and has the capacity for continued innovation in the financial space.

What if these two elemental forces were viewed together? What if we channeled our inner James Burke, and looked for connections between these two transformative technologies? This is exactly what our partner Arunkumar Krishnakumar did in his new book Quantum Computing and Blockchain in Business: Exploring the applications, challenges and collision of quantum computing and blockchain. Though it has a seemingly impenetrable title, we can more than assure you it's worth a read to understand where the future is headed.

Arun's book dissects the genesis of these twin technologies and how they intersect. Much as James Burke rejects a strictly linear reading of historical events, the first-time author writes about the impacts of these technologies on healthcare and pharmaceutical industries, governance, elections, smart cities, the environment, chemistry, logistics, and much more. We are left wondering whether there is anything that a blockchain powered by quantum computing cannot do. Fortunately, the book answers that as well.

As the book discusses in its last few chapters, viewed through Arun's critical lens, there are also darker sides to these technologies: they could threaten nation states or launch a new cyber arms race. He details the dangers of these technologies and how they might impact every life. He also concludes with some blue-sky ideas, both dreams and realized aspirations derived from the power of these complementary tools of knowledge, and explains how writing this book provided him with a sense of hope for the future of humanity in the age of rapidly developing and highly interdependent technologies.

Perhaps it is fitting then, that Arun uses a quote from the opening of the Charles Dickens novel, A Tale of Two Cities, to tell his story. The conflict between good and evil, between light and darkness, can be won. Technology is just another means to this end.

There is a lot of hype, but somewhere amid all the hype, there is still hope.

How we write the next chapter and the future of the human race is entirely up to us.

The sky is indeed blue.

We must never lose hope.

Listen in via iTunes and Spotify as Theo and Bradley of Unconventional Ventures have a conversation with our partner and co-host Arunkumar Krishnakumar, as he talks about his new book Quantum Computing and Blockchain in Business: Exploring the applications, challenges and collision of quantum computing and blockchain, and how he is finding solace in this summer of COVID-19. Listen to this, and every episode of One Vision, on your favorite player.

More about Irish Tech News and Business Showcase here

FYI the ROI for you is => Irish Tech News now gets over 1.5 million monthly views, and up to 900k monthly unique visitors, from over 160 countries. We have over 860,000 relevant followers on Twitter on our various accounts & we were recently described as Ireland's leading online tech news site and Ireland's answer to TechCrunch, so we can offer you a good audience!

Since introducing desktop notifications a short time ago, which notify readers directly in their browser of new articles being published, over 30,000 people have now signed up to receive them, ensuring they are instantly kept up to date on all our latest content. Desktop notifications offer a unique method of serving content directly to verified readers and bypass the issue of content getting lost in people's crowded news feeds.

Drop us a line if you want to be featured, guest post, suggest a possible interview, or just let us know what you would like to see more of in our future articles. We're always open to new and interesting suggestions for informative and different articles. Contact us by email, Twitter or whatever social media works for you and hopefully we can share your story too and reach our global audience.

Irish Tech News

If you would like to have your company featured in the Irish Tech News Business Showcase, get in contact with us at [emailprotected] or on Twitter: @SimonCocking

Link:

Quantum computing and blockchain, is this our bold future? - Irish Tech News

Eleven Princeton faculty elected to American Academy of Arts and Sciences – Princeton University

Princeton faculty members Rubén Gallo, M. Zahid Hasan, Amaney Jamal, Ruby Lee, Margaret Martonosi, Tom Muir, Eve Ostriker, Alexander Smits, Leeat Yariv and Muhammad Qasim Zaman have been named members of the American Academy of Arts and Sciences. Visiting faculty member Alondra Nelson also was elected to the academy.

They are among 276 scholars, scientists, artists and leaders in the public, nonprofit and private sectors elected this year in recognition of their contributions to their respective fields.

Gallo is the Walter S. Carpenter, Jr., Professor in Language, Literature, and Civilization of Spain and a professor of Spanish and Portuguese. He joined the Princeton faculty in 2002. His most recent book is Conversación en Princeton (2017) with Mario Vargas Llosa, who was teaching at Princeton when he received the Nobel Prize in Literature in 2010.

Gallo's other books include Proust's Latin Americans (2014); Freud's Mexico: Into the Wilds of Psychoanalysis (2010); Mexican Modernity: The Avant-Garde and the Technological Revolution (2005); New Tendencies in Mexican Art (2004); and The Mexico City Reader (2004). He is currently working on Cuba: A New Era, a book about the changes in Cuban culture after the diplomatic thaw with the United States.

Gallo received the Gradiva award for the best book on a psychoanalytic theme and the Modern Language Association's Katherine Singer Kovacs Prize for the best book on a Latin American topic. He is a member of the board of the Sigmund Freud Museum in Vienna, where he also serves as research director.

Photo by Nick Barberio, Office of Communications

Hasan is the Eugene Higgins Professor of Physics. He studies fundamental quantum effects in exotic superconductors, topological insulators and quantum magnets to make new discoveries about the nature of matter, work that may have future applications in areas such as quantum computing. He joined the faculty in 2002 and has since led his research team to publish many influential findings.

Last year, Hasan's lab led research that discovered that certain classes of crystals with an asymmetry like biological handedness, known as chiral crystals, may harbor electrons that behave in unexpected ways. In 2015, he led a research team that first observed Weyl fermions, which, if applied to next-generation electronics, could allow for a nearly free and efficient flow of electricity, and thus greater power, especially for computers.

In 2013, Hasan was named a fellow of the American Physical Society for the experimental discovery of three-dimensional topological insulators, a new kind of quantum matter. In 2009, he received a Sloan Research Fellowship for groundbreaking research.

Photo by Tori Repp/Fotobuddy

Jamal is the Edwards S. Sanford Professor of Politics and director of the Mamdouha S. Bobst Center for Peace and Justice. She has taught at Princeton since 2003. Her current research focuses on the drivers of political behavior in the Arab world, Muslim immigration to the U.S. and Europe, and the effect of inequality and poverty on political outcomes.

Jamal also directs the Workshop on Arab Political Development and the Bobst-AUB Collaborative Initiative. She is also principal investigator for the Arab Barometer project, which measures public opinion in the Arab world. She is the former president of the Association of Middle East Women's Studies.

Her books include Barriers to Democracy (2007), which won the 2008 APSA Best Book Award in comparative democratization, and Of Empires and Citizens, which was published by Princeton University Press (2012). She is co-editor of Race and Arab Americans Before and After 9/11: From Invisible Citizens to Visible Subjects (2007) and Citizenship and Crisis: Arab Detroit after 9/11 (2009).

Photo by Tori Repp/Fotobuddy

Lee is the Forrest G. Hamrick Professor in Engineering and professor of electrical engineering. She is an associated faculty member in computer science. Lee joined the Princeton faculty in 1998. Her work at Princeton explores how the security and performance of computing systems can be significantly and simultaneously improved by hardware architecture. Her designs of secure processor architectures have strongly influenced industry security offerings and also inspired new generations of academic researchers in hardware security, side-channel attacks and defenses, secure processors and caches, and enhanced cloud computing and smartphone security.

Her research lies at the intersection of computer architecture, cybersecurity and, more recently, the branch of artificial intelligence known as deep learning.

Lee spent 17 years designing computers at Hewlett-Packard, and was a chief architect there before coming to Princeton. Among many achievements, Lee is known in the computer industry for her design of the HP Precision Architecture (HPPA or PA-RISC) that powered HP's commercial and technical computer product families for several decades, and was widely regarded as introducing key forward-looking features. In the '90s she spearheaded the development of microprocessor instructions for accelerating multimedia, which enabled video and audio streaming, leading to ubiquitous digital media. Lee is a fellow of the Association for Computing Machinery and the Institute of Electrical and Electronics Engineers.

Margaret Martonosi, the Hugh Trumbull Adams '35 Professor of Computer Science, specializes in computer architecture and mobile computing with an emphasis on power efficiency. She was one of the architects of the Wattch power modeling infrastructure, a tool that was among the first to allow computer scientists to incorporate power consumption into early-stage computer systems design. Her work helped demonstrate that power needs can help dictate the design of computing systems. More recently, Martonosi's work has also focused on architecture and compiler issues in quantum computing.

She currently serves as head of the National Science Foundation's Directorate for Computer and Information Science and Engineering, one of seven top-level divisions within the NSF. From 2017 until February 2020, she directed Princeton's Keller Center for Innovation in Engineering Education, a center focused on enabling students across the University to realize their aspirations for addressing societal problems. She is an inventor who holds seven U.S. patents and has co-authored two technical reference books on power-aware computer architecture. In 2018, she was one of 13 co-authors of a National Academies consensus study report on progress and challenges in quantum computing.

Martonosi is a fellow of the Association for Computing Machinery (ACM) and the Institute of Electrical and Electronics Engineers (IEEE). Among other honors, she has received a Jefferson Science Fellowship, the IEEE Technical Achievement Award, and the ACM SIGARCH Alan D. Berenbaum Distinguished Service Award. She joined the Princeton faculty in 1994.

Muir is the Van Zandt Williams, Jr. Class of '65 Professor of Chemistry and chair of the chemistry department. He joined Princeton in 2011 and is also an associated faculty member in molecular biology.

He leads research in investigating the physiochemical basis of protein function in complex systems of biomedical interest. By combining tools of organic chemistry, biochemistry, biophysics and cell biology, his lab has developed a suite of new technologies that provide fundamental insight into how proteins work. The chemistry-driven approaches pioneered by Muirs lab are now widely used by chemical biologists around the world.

Muir has published over 150 scientific articles and has won a number of honors for his research. He received a MERIT Award from the National Institutes of Health and is a fellow of the American Association for the Advancement of Science and the Royal Society of Edinburgh.

Nelson is the Harold F. Linder Chair in the School of Social Science at the Institute for Advanced Study and a visiting lecturer with the rank of professor in sociology at Princeton. She is president of the Social Science Research Council and is one of the country's foremost thinkers in the fields of science, technology, social inequality and race. Her groundbreaking books include "The Social Life of DNA: Race, Reparations, and Reconciliation after the Genome" (2016) and "Body and Soul: The Black Panther Party and the Fight Against Medical Discrimination" (2011). Her other books include "Genetics and the Unsettled Past: The Collision of DNA, Race, and History" (with Keith Wailoo of Princeton and Catherine Lee) and "Technicolor: Race, Technology, and Everyday Life" (with Thuy Linh Tu). In 2002 she edited "Afrofuturism," a special issue of Social Text.

Nelson's writings and commentary also have reached the broader public through a variety of outlets. She has contributed to national policy discussions on inequality and the implications of new technology on society.

She is an elected fellow of the American Academy of Political and Social Science, the Hastings Center and the Sociological Research Association. She serves on several advisory boards, including the Andrew W. Mellon Foundation and the American Association for the Advancement of Science.

Ostriker, professor of astrophysical sciences, studies the universe. Her research is in the area of theoretical and computational astrophysics, and the tools she uses are powerful supercomputers and algorithms capable of simulating the birth, life, death and reincarnation of stars in their galactic homes. Ostriker and her fellow researchers build computer models using fundamental physical laws, the ones that govern gravity, fluid dynamics and electromagnetic radiation, to follow the evolution of conditions found in deep space.

Ostriker, who came to Princeton in 2012, and her team have explored the formation of superbubbles, giant fronts of hot gas that billow out from a cluster of supernova explosions. More recently, she and her colleagues turned their focus toward interstellar clouds.

The research team uses computing resources through the Princeton Institute for Computational Science and Engineering and its TIGER and Perseus research computing clusters, as well as supercomputers administered through NASA. In 2017, Ostriker received a Simons Investigator Award.

Photo by Nick Donnoli, Office of Communications

Smits is the Eugene Higgins Professor of Mechanical and Aerospace Engineering, Emeritus. His research spans the field of fluid mechanics, including fundamental turbulence, supersonic and hypersonic flows, bio-inspired flows, sports aerodynamics, and novel energy-harvesting concepts.

He joined the Princeton faculty in 1981 and transferred to emeritus status in 2018. Smits served as chair of the Department of Mechanical and Aerospace Engineering for 13 years and was director of the Gas Dynamics Laboratory on the Forrestal Campus for 33 years. During that time, he received several teaching awards, including the President's Award for Distinguished Teaching.

Smits has written more than 240 articles and three books, and edited seven volumes. He was awarded seven patents and helped found three companies. He is a member of the National Academy of Engineering and a fellow of the American Physical Society, the American Institute of Aeronautics and Astronautics, the American Society of Mechanical Engineers, the American Association for the Advancement of Science, and the Australasian Fluid Mechanics Society.

Yariv is the Uwe Reinhardt Professor of Economics. An expert in applied theory and experimental economics, her research interests concentrate on game theory, political economy, psychology and economics. She joined the faculty in 2018. Yariv also is director of the Princeton Experimental Laboratory for the Social Sciences.

She is a member of several professional organizations and is lead editor of American Economic Journal: Microeconomics, a research associate with the Political Economy Program of the National Bureau of Economic Research, and a research fellow with the Industrial Organization Programme of the Centre for Economic Policy Research.

She is also a fellow of the Econometric Society and the Society for the Advancement of Economic Theory, and has received numerous grants for researchand awards for her many publications.

Zaman, who joined the Princeton faculty in 2006, is the Robert H. Niehaus '77 Professor of Near Eastern Studies and Religion and chair of the Department of Near Eastern Studies.

He has written on the relationship between religious and political institutions in medieval and modern Islam, on social and legal thought in the modern Muslim world, on institutions and traditions of learning in Islam, and on the flow of ideas between South Asia and the Arab Middle East. He is the author of Religion and Politics under the Early Abbasids (1997), The Ulama in Contemporary Islam: Custodians of Change (2002), Ashraf Ali Thanawi: Islam in Modern South Asia (2008), Modern Islamic Thought in a Radical Age: Religious Authority and Internal Criticism (2012), and Islam in Pakistan: A History (2018). With Robert W. Hefner, he is also the co-editor of Schooling Islam: The Culture and Politics of Modern Muslim Education (2007); with Roxanne L. Euben, of Princeton Readings in Islamist Thought (2009); and, as associate editor, with Gerhard Bowering et al., of the Princeton Encyclopedia of Islamic Political Thought (2013). Among his current projects is a book on South Asia and the wider Muslim world in the 18th and 19th centuries.

In 2017, Zaman received Princeton's Graduate Mentoring Award. In 2009, he received a Guggenheim Fellowship.

The mission of the academy: Founded in 1780, the American Academy of Arts and Sciences honors excellence and convenes leaders from every field of human endeavor to examine new ideas, address issues of importance to the nation and the world, and work together "to cultivate every art and science which may tend to advance the interest, honor, dignity, and happiness of a free, independent, and virtuous people."

View post:

Eleven Princeton faculty elected to American Academy of Arts and Sciences - Princeton University

The future of quantum computing in the cloud – TechTarget

AWS, Microsoft and other IaaS providers have jumped on the quantum computing bandwagon as they try to get ahead of the curve on this emerging technology.

Developers use quantum computing to encode problems as qubits, which compute multiple combinations of variables at once rather than exploring each possibility discretely. In theory, this could allow researchers to quickly solve problems involving different combinations of variables, such as breaking encryption keys, testing the properties of different chemical compounds or simulating different business models. Researchers have begun to demonstrate real-world examples of how these early quantum computers could be put to use.

However, this technology is still being developed, so experts caution that it could take more than a decade for quantum computing to deliver practical value. In the meantime, there are a few cloud services, such as Amazon Braket and Microsoft Quantum, that aim to get developers up to speed on writing quantum applications.

Quantum computing in the cloud has the potential to disrupt industries in a similar way as other emerging technologies, such as AI and machine learning. But quantum computing is still being established in university classrooms and career paths, said Bob Sutor, vice president of IBM Quantum Ecosystem Development. Similarly, major cloud providers are focusing primarily on education at this early stage.

"The cloud services today are aimed at preparing the industry for the soon-to-arrive day when quantum computers will begin being useful," said Itamar Sivan, co-founder and CEO of Quantum Machines, an orchestration platform for quantum computing.

There's still much to iron out regarding quantum computing and the cloud, but the two technologies appear to be a logical fit, for now.

Cloud-based quantum computing is more difficult to pull off than AI, so the ramp up will be slower and the learning curve steeper, said Martin Reynolds, distinguished vice president of research at Gartner. For starters, quantum computers require highly specialized room conditions that are dramatically different from how cloud providers build and operate their existing data centers.

Reynolds believes practical quantum computers are at least a decade away. The biggest drawback lies in aligning the quantum state of qubits in the computer with a given problem, especially since quantum computers still haven't been proven to solve problems better than traditional computers.

Coders also must learn new math and logic skills to utilize quantum computing. This makes it hard for them since they can't apply traditional digital programming techniques. IT teams need to develop specialized skills to understand how to apply quantum computing in the cloud so they can fine tune the algorithms, as well as the hardware, to make this technology work.

Current limitations aside, the cloud is an ideal way to consume quantum computing, because quantum computing has low I/O but deep computation, Reynolds said. Because cloud vendors have the technological resources and a large pool of users, they will inevitably be some of the first quantum-as-a-service providers and will look for ways to provide the best software development and deployment stacks.

Quantum computing could even supplement the general compute and AI services cloud providers currently offer, said Tony Uttley, president of Honeywell Quantum Solutions. In that scenario, quantum computing would integrate with classical computing cloud resources in a co-processing environment.

The cloud plays two key roles in quantum computing today, according to Hyoun Park, CEO and principal analyst at Amalgam Insights. The first is to provide an application development and test environment for developers to simulate the use of quantum computers through standard computing resources.

The second is to offer access to the few quantum computers that are currently available, in the way mainframe leasing was common a generation ago. This improves the financial viability of quantum computing, since multiple users can increase machine utilization.

It takes significant computing power to simulate quantum algorithm behavior from a development and testing perspective. For the most part, cloud vendors want to provide an environment to develop quantum algorithms before loading these quantum applications onto dedicated hardware from other providers, which can be quite expensive.

However, classical simulations of quantum algorithms that use large numbers of qubits are not practical. "The issue is that the size of the classical computer needed will grow exponentially with the number of qubits in the machine," said Doug Finke, publisher of the Quantum Computing Report. So, a classical simulation of a 50-qubit quantum computer would require a classical computer with roughly 1 petabyte of memory. This requirement will double with every additional qubit.
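
To make that scaling concrete, here is a back-of-the-envelope sketch added for illustration (the 16-bytes-per-amplitude figure is an assumption; the petabyte estimate quoted above corresponds to a smaller per-amplitude cost, but the doubling behavior is the same):

```python
# Rough memory arithmetic for classical state-vector simulation of n qubits.
# A full simulation stores one complex amplitude per basis state, i.e. 2**n
# amplitudes, so memory doubles with every added qubit.

def statevector_memory_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    """Memory needed to hold a full n-qubit state vector."""
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (30, 40, 50):
    gib = statevector_memory_bytes(n) / 2**30  # bytes -> GiB
    print(f"{n} qubits -> about {gib:,.0f} GiB")
```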

But classical simulations for problems using a smaller number of qubits are useful both as a tool to teach quantum algorithms to students and also for quantum software engineers to test and debug algorithms with "toy models" for their problem, Finke said. Once they debug their software, they should be able to scale it up to solve larger problems on a real quantum computer.

In terms of putting quantum computing to use, organizations can currently use it to support last-mile optimization, encryption and other computationally challenging issues, Park said. This technology could also aid teams across logistics, cybersecurity, predictive equipment maintenance, weather predictions and more. Researchers can explore multiple combinations of variables in these kinds of problems simultaneously, whereas a traditional computer needs to compute each combination separately.

However, there are some drawbacks to quantum computing in the cloud. Developers should proceed cautiously when experimenting with applications that involve sensitive data, said Finke. To address this, many organizations prefer to install quantum hardware in their own facilities despite the operational hassles, he added.

Also, a machine may not be immediately available when a quantum developer wants to submit a job through quantum services on the public cloud. "The machines will have job queues and sometimes there may be several jobs ahead of you when you want to run your own job," Finke said. Some of the vendors have implemented a reservation capability so a user can book a quantum computer for a set time period to eliminate this problem.

IBM was first to market with its Quantum Experience offering, which launched in 2016 and now has over 15 quantum computers connected to the cloud. Over 210,000 registered users have executed more than 70 billion circuits through the IBM Cloud and published over 200 papers based on the system, according to IBM.

IBM also started the Qiskit open source quantum software development platform and has been building an open community around it. According to GitHub statistics, it is currently the leading quantum development environment.
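
As a hedged illustration of what development with Qiskit looks like (a minimal sketch, not code from the article; it assumes the qiskit package with its bundled Aer simulator, as shipped around the time of writing):

```python
# Minimal Qiskit sketch: build a two-qubit Bell-state circuit and run it on a
# local simulator. Newer Qiskit releases move Aer into qiskit-aer and replace
# execute() with backend.run(), so treat this as era-appropriate example code.
from qiskit import QuantumCircuit, Aer, execute

circuit = QuantumCircuit(2, 2)
circuit.h(0)                      # put qubit 0 into superposition
circuit.cx(0, 1)                  # entangle qubit 0 with qubit 1
circuit.measure([0, 1], [0, 1])   # read both qubits out

backend = Aer.get_backend("qasm_simulator")
counts = execute(circuit, backend, shots=1000).result().get_counts()
print(counts)                     # expect roughly half '00' and half '11'
```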

In late 2019, AWS and Microsoft introduced quantum cloud services offered through partners.

Microsoft Quantum provides a quantum algorithm development environment, and from there users can transfer quantum algorithms to Honeywell, IonQ or Quantum Circuits Inc. hardware. Microsoft's Q# scripting offers a familiar Visual Studio experience for quantum problems, said Michael Morris, CEO of Topcoder, an on-demand digital talent platform.

Currently, this transfer involves the cloud providers installing a high-speed communication link from their data center to the quantum computer facilities, Finke said. This approach has many advantages from a logistics standpoint, because it makes things like maintenance, spare parts, calibration and physical infrastructure a lot easier.

Amazon Braket similarly provides a quantum development environment and, when generally available, will provide time-based pricing to access D-Wave, IonQ and Rigetti hardware. Amazon says it will add more hardware partners as well. Braket offers a variety of different hardware architecture options through a common high-level programming interface, so users can test out the machines from the various partners and determine which one would work best with their application, Finke said.
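
To illustrate that common high-level interface, here is a hedged sketch using the Braket Python SDK's local simulator (the calls shown are the SDK's documented ones; the example itself is invented and targets only the local backend):

```python
# Sketch of Amazon Braket's device-agnostic workflow: define a circuit once,
# then point it at a local simulator or, with an AwsDevice ARN, at partner
# gate-model hardware such as IonQ or Rigetti.
from braket.circuits import Circuit
from braket.devices import LocalSimulator

bell = Circuit().h(0).cnot(0, 1)      # same circuit definition for any backend

device = LocalSimulator()             # swap in AwsDevice("arn:...") for real QPUs
result = device.run(bell, shots=1000).result()
print(result.measurement_counts)      # roughly even split of '00' and '11'
```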

Google has done considerable core research on quantum computing in the cloud and is expected to launch a cloud computing service later this year. Google has been more focused on developing its in-house quantum computing capabilities and hardware rather than providing access to these tools to its cloud users, Park said. In the meantime, developers can test out quantum algorithms locally using Google's Cirq programming environment for writing apps in Python.
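
A minimal sketch of that local workflow with Cirq might look like the following (standard Cirq calls; the circuit is illustrative, not drawn from Google or the article):

```python
# Test a small quantum algorithm locally with Cirq's built-in simulator.
import cirq

q0, q1 = cirq.LineQubit.range(2)
circuit = cirq.Circuit([
    cirq.H(q0),                    # superposition on the first qubit
    cirq.CNOT(q0, q1),             # entangle the pair
    cirq.measure(q0, q1, key="m"), # read both qubits out
])

result = cirq.Simulator().run(circuit, repetitions=1000)
print(result.histogram(key="m"))   # outcomes 0 (|00>) and 3 (|11>) about equally
```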

In addition to the larger offerings from the major cloud providers, there are several alternative approaches to implementing quantum computers that are being provided through the cloud.

D-Wave is the furthest along, with a quantum annealer well-suited for many optimization problems. Other alternatives include QuTech, which is working on a cloud offering of its small quantum machine utilizing its spin qubits technology. Xanadu is another and is developing a quantum machine based on a photonic technology.

Researchers are pursuing a variety of approaches to quantum computing -- using electrons, ions or photons -- and it's not yet clear which approaches will pan out for practical applications first.

"Nobody knows which approach is best, or which materials are best. We're at the Edison light bulb filament stage, where Edison reportedly tested thousands of ways to make a carbon filament until he got to one that lasted 1,500 hours," Reynolds said. In the meantime, recent cloud offerings promise to enable developers to start experimenting with these different approaches to get a taste of what's to come.

See the article here:

The future of quantum computing in the cloud - TechTarget

Physicists Successfully Use ‘Hot’ Qubits to Overcome a Huge Quantum Computing Problem – ScienceAlert

As quantum computers continue to grow in size and complexity, engineers are hitting a major obstacle. All of that added machinery means higher temperatures - and if anything can ruin a perfectly good quantum bit, it's heat.

There are a few possible solutions, but any fix needs to be small and compatible with existing silicon technology. Two recently published papers confirm a new device developed by engineers at Australia's University of New South Wales (UNSW) could be the way to go.

Early last year, the researchers tentatively announced that tiny semiconducting materials called quantum dots could be isolated and still used to carry out the kinds of quantum operations needed for the next generation of computing, all at a relatively toasty 1.5 kelvin.

"This is still very cold, but is a temperature that can be achieved using just a few thousand dollars' worth of refrigeration, rather than the millions of dollars needed to cool chips to 0.1 Kelvin," says senior researcher Andrew Dzurak from UNSW.

That research has not only now been given the thumbs-up in a peer review, it's also been validated by a second, completely different study conducted by a team from Delft University of Technology in the Netherlands.

Having confirmation that this proof of concept device works as theorised should give us confidence that this technology, if not something like it, will be one way we'll scale up quantum computers to increasingly useful sizes.

Where conventional computing uses a binary system of 'bits' to perform logical operations, quantum computing uses the probabilistic nature of quantum states to manage particular calculations.

Those states are most easily represented in the features of tiny (preferably subatomic-sized) particles. While in an unmeasured form, these particles can be described mathematically as possessing a blend of characteristics in what's known as a superposition.

The mathematics of superposition particles, called qubits when used this way, can make short work of algorithms that would take conventional computers far too long to solve, at least in theory.
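
In standard textbook notation (an aside added for illustration, not taken from the article), a single qubit in superposition is written as

$$
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1,
$$

where measuring the qubit returns 0 with probability $|\alpha|^2$ and 1 with probability $|\beta|^2$; the blend of characteristics only collapses to a definite value at measurement.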

But to really get the most out of them, qubits should work collaboratively with other qubits, entangling their mathematics in ever more complex ways. Ideally, dozens of qubits should work together if we're to make a quantum computer that's more than just an expensive toy.

Some tech companies claim to be at that point already. For them, the next step is to connect hundreds, if not millions together. It's a lofty goal that presents engineers with a growing problem.

"Every qubit pair added to the system increases the total heat generated," says Dzurak.

Heat risks making a mess of the whole superposition thing, which is why current designs rely so much on cooling technology that freezes particles to a virtual stand-still.

Just adding more heat sinks runs into space and efficiency problems. So Dzurak and his team looked for ways to house a qubit that could handle rising temperatures.

The trick, they found, was to isolate electrons from their reservoir on a pair of nanometre-sized islands called quantum dots, made from silicon metal-oxide.

The electron states can then be set and measured using a process called tunnelling, where the quantum uncertainty of each electron's position allows them to teleport between dots.

This tunnelling within an isolated qubit nest gives the delicate states of the electrons a level of protection against the slightly higher temperatures, while still allowing the system to link in with conventional electronic computers.

"Our new results open a path from experimental devices to affordable quantum computers for real world business and government applications," says Dzurak.

As a proof of concept, it's exciting stuff. But plenty of questions need to be answered before we'll see it marry with existing quantum computing technology.

Cooking qubits at temperatures 15 times warmer than usual seems to work just fine so far, but we're yet to see how this translates to entangled groups, and whether methods for correcting errors still work for a 'hot' qubit.

No doubt researchers will be turning their attention to these concerns in future experiments, moving us ever closer to quantum computers capable of cracking some of the hardest problems the Universe can throw at us.

This research was published in Nature here and here.

Read more from the original source:

Physicists Successfully Use 'Hot' Qubits to Overcome a Huge Quantum Computing Problem - ScienceAlert

Alex Garland on ‘Devs,’ free will and quantum computing – Engadget

Garland views Amaya as a typical Silicon Valley success story. In the world of Devs, it's the first company that manages to mass produce quantum computers, allowing them to corner that market. (Think of what happened to search engines after Google debuted.) Quantum computing has been positioned as a potentially revolutionary technology for things like healthcare and encryption, since it can tackle complex scenarios and data sets more effectively than traditional binary computers. Instead of just processing inputs one at a time, a quantum machine would theoretically be able to tackle an input in multiple states, or superpositions, at once.

By mastering this technology, Amaya unlocks a completely new view of reality: The world is a system that can be decoded and predicted. It proves to them that the world is deterministic. Our choices don't matter; we're all just moving along predetermined paths until the end of time. Garland is quick to point out that you don't need anything high-tech to start asking questions about determinism. Indeed, it's something that's been explored since Plato's allegory of the cave.

"What I did think, though, was that if a quantum computer was as good at modeling quantum reality as it might be, then it would be able to prove in a definitive way whether we lived in a deterministic state," Garland said. "[Proving that] would completely change the way we look at ourselves, the way we look at society, the way society functions, the way relationships unfold and develop. And it would change the world in some ways, but then it would restructure itself quickly."

The sheer difficulty of coming up with something -- anything -- that's truly spontaneous and isn't causally related to something else in the universe is the strongest argument in favor of determinism. And it's something Garland aligns with personally -- though that doesn't change how he perceives the world.

"Whether or not you or I have free will, both of us could identify lots of things that we care about," he said. "There are lots of things that we enjoy or don't enjoy. Or things that we're scared of, or we anticipate. And all of that remains. It's not remotely affected by whether we've got free will or not. What might be affected is, I think, our capacity to be forgiving in some respects. And so, certain kinds of anti-social or criminal behavior, you would start to think about in terms of rehabilitation, rather than punishment. Because then, in a way, there's no point punishing someone for something they didn't decide to do."

View post:

Alex Garland on 'Devs,' free will and quantum computing - Engadget

CIP partners with ISARA to offer crypto-agile technology to complement innovative Whitethorn platform – Quantaneo, the Quantum Computing Source

The onset of large-scale quantum computing will challenge the security of current public-key cryptography and create widespread vulnerabilities. The rigidity of today's infrastructure makes cryptographic migrations complex and costly. Establishing crypto agility in existing systems is the first step towards seamless migrations.

The strategic partnership allows CIP to offer industry leading quantum-safe, crypto agile and hybrid certificate offerings from ISARA. This ground-breaking technology enables systems to be quantum safe without disruption of operations while maintaining the availability and integrity of existing security systems.

The new agile certificates will be recognisable by CIP's Whitethorn Platform, a digital certificate, key discovery and lifecycle management solution that provides unrivalled discovery, management and automation.

Andy Jenkinson, Group CEO of CIP, said: "Quantum computing is the next major development within the global technology area. The biggest challenge to cyber security is the lack of understanding of cryptography and PKI in today's classical computing, let alone in a post-quantum world. The partnership with ISARA will enable all our clients to realise full discovery, management and automation of their crypto-agile PKI."

Scott Totzke, CEO & Co-founder of ISARA, said: "We are excited to partner with CIP to ensure their clients' migration to quantum-safe cryptography starts with integrating crypto-agility, an essential first step towards cryptographic resilience and long-term security. This is some welcome good news in these turbulent times."

Read the original:

CIP partners with ISARA to offer crypto-agile technology to complement innovative Whitethorn platform - Quantaneo, the Quantum Computing Source

Science of Star Trek – The UCSB Current

In the most recent episode of his YouTube series Science vs. Cinema, UC Santa Barbara physicist Andy Howell takes on Star Trek: Picard, exploring how the CBS offering's presentation of supernovae and quantum computing stacks up against real-world science.

For Howell, the series, which reviews the scientific accuracy and the portrayal of scientists in Hollywood's top sci-fi films, is as much an excuse to dive into exciting scientific concepts and cutting-edge research as it is a review.

"Science fiction writers are fond of grappling with deep philosophical questions," he said. "I was really excited to see that UCSB researchers were thinking about some of the same things in a more grounded way."

For the Star Trek episode, Howell spoke with series creators Alex Kurtzman and Michael Chabon, as well as a number of cast members, including Patrick Stewart. Joining him to discuss quantum science and consciousness were John Martinis, a quantum expert at UC Santa Barbara and chief scientist of the Google quantum computing hardware group, and fellow UCSB physics professor Matthew Fisher. Fisher's group is studying whether quantum mechanics plays a role in the brain, a topic taken up in the new Star Trek series.

Howell also talked supernovae and viticulture with friend and colleague Brian Schmidt, vice-chancellor of the Australian National University. Schmidt won the 2011 Nobel Prize in Physics for helping to discover that the expansion of the universe is accelerating.

"We started Science vs. Cinema to use movies as a jumping-off point to talk science Howell said. Star Trek Picard seemed like the perfect fit. Star Trek has a huge cultural impact and was even one of the things that made me want to study astronomy.

Previous episodes of Science vs. Cinema have separated fact from fiction in films such as Star Wars, The Current War, Ad Astra, Arrival and The Martian. The success of prior episodes enabled Howell to get early access to the show and interview the cast and crew.

"What most people think about scientific subjects probably isn't what they learned in a university class, but what they saw in a movie, Howell remarked. That makes movies an ideal springboard for introducing scientific concepts. And while I can only reach dozens of students at a time in a classroom, I can reach millions on TV or the internet.

View original post here:

Science of Star Trek - The UCSB Current

Pentagon wants commercial, space-based quantum sensors within 2 years – The Sociable

The Pentagon's Defense Innovation Unit is looking to the private sector to develop space-based quantum sensing prototypes within two years, the kind of sensors that could contribute to a space-based quantum internet.

Quantum technologies will render all previously existing stealth, encryption, and communications technologies obsolete, so naturally the Pentagon wants to develop quantum technologies as a matter of national security.

The Defense Innovation Unit (DIU) has opened a solicitation to evaluate commercial solutions that utilize demonstrable quantum technology to achieve significant performance improvements for aerospace and other novel applications including, but not limited to, inertial sensing, timing and gravimetry.

The DIU wants a prototype within 24 months that consists of a compact, high-performance quantum sensor for precision inertial measurement in deep space and other GPS-denied environments.

There are a lot of technical concepts that go into this technology, but for simplicity's sake, the DIU is looking for quantum sensing technology that can perform accurate measurements by overcoming the effects of gravity on time and space.

While the DIU did not go into any specifics about what the quantum sensing technology would actually be used for, we may glean some ideas from what the military has already been researching, specifically improved communications, precision navigation, and precision timing.

For example, the Air Force Research Laboratory has been investigating a variety of quantum-based sensors to create secure, jam-resistant alternatives to GPS, according to National Defense Magazine.

And because quantum sensors can detect radar signatures and beyond, they may be used by the military to bypass just about any stealth technology.

Other potential applications could include Earth defense mechanisms that could detect, prevent, or respond to missile attacks, asteroids, and comets, as well as keeping track of satellites and space debris that whiz around Earth's orbit.

Additionally, a network of quantum technologies could offer the military security, sensing and timekeeping capabilities not possible with traditional networking approaches, according to the US Army Research Laboratory.

If we take the idea of quantum sensors a step further and into the realm of quantum sensing networks, then we are looking at one component of a quantum internet, when combined with quantum computing.

"A quantum internet will be the platform of a quantum ecosystem, where computers, networks, and sensors exchange information in a fundamentally new manner where sensing, communication, and computing literally work together as one entity," Argonne Laboratory senior scientist David Awschalom told How Stuff Works.

The notion of a space-based quantum internet using satellite constellations is becoming even more enticing, as evidenced in the joint research paper, Spooky Action at a Global Distance: Resource-Rate Analysis of a Space-Based Entanglement-Distribution Network for the Quantum Internet.

According to the scientists, "Recent experimental breakthroughs in satellite quantum communications have opened up the possibility of creating a global quantum internet using satellite links," and, "This approach appears to be particularly viable in the near term."

The paper seems to describe quantum technologies that are nearly identical to the ones the DIU is looking to build.

"A quantum internet would allow for the execution of other quantum-information-processing tasks, such as quantum teleportation, quantum clock synchronization, distributed quantum computation, and distributed quantum metrology and sensing," it reads.

SpaceX is already building a space-based internet through its Starlink program. Starlink looks to have 12,000 satellites orbiting the Earth in a constellation that will beam high-speed internet to even the most remote parts of the planet.

The company led by Elon Musk has already launched some 360 satellites as part of the Starlink constellation.

All the news reports say that Starlink will provide either high-speed or broadband internet, and there are no mentions of SpaceX building a quantum internet, but the idea is an intriguing one.

SpaceX is already working with the Pentagon, the Air Force, NASA, and other government and defense entities.

In 2018, SpaceX won a $28.7 million fixed-price contract from the Air Force Research Laboratory for experiments in data connectivity involving ground sites, aircraft and space assets, a project that could give a boost to the company's Starlink broadband satellite service, according to GeekWire.

Let's recap:

By the looks of it, the DIU's space-based quantum sensing prototypes could very well be components of a space-based quantum internet.

However, there has been no announcement from SpaceX saying that Starlink will be beaming down a quantum internet.

At any rate, we'll be looking at high-speed broadband internet from above in the near future, quantum or otherwise.

View post:

Pentagon wants commercial, space-based quantum sensors within 2 years - The Sociable

D-Wave makes its quantum computers free to anyone working on the coronavirus crisis – VentureBeat

D-Wave today made its quantum computers available for free to researchers and developers working on responses to the coronavirus (COVID-19) crisis. D-Wave partners and customers Cineca, Denso, Forschungszentrum Jülich, Kyocera, MDR, Menten AI, NEC, OTI Lumionics, QAR Lab at LMU Munich, Sigma-i, Tohoku University, and Volkswagen are also offering to help. They will provide access to their engineering teams with expertise on how to use quantum computers, formulate problems, and develop solutions.

Quantum computing leverages qubits to perform computations that would be much more difficult, or simply not feasible, for a classical computer. Based in Burnaby, Canada, D-Wave was the first company to sell commercial quantum computers, which are built to use quantum annealing. D-Wave says the move to make access free is a response to a cross-industry request from the Canadian government for solutions to the COVID-19 pandemic. Free and unlimited commercial contract-level access to D-Wave's quantum computers is available in 35 countries across North America, Europe, and Asia via Leap, the company's quantum cloud service. Just last month, D-Wave debuted Leap 2, which includes a hybrid solver service and solves problems of up to 10,000 variables.
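
For a sense of what submitting a problem through Leap looks like, here is a hedged sketch using D-Wave's Ocean SDK. The sampler class and configuration flow are the SDK's standard ones, but the two-variable problem is a toy invented purely for illustration, and a Leap API token is assumed to be configured locally:

```python
# Toy example of sending a small optimization problem to D-Wave's Leap hybrid
# solver via the Ocean SDK (dwave-ocean-sdk). Not a COVID-19 model; the QUBO
# below simply rewards turning on exactly one of two binary variables.
import dimod
from dwave.system import LeapHybridSampler

bqm = dimod.BinaryQuadraticModel(
    {"x0": -1.0, "x1": -1.0},      # linear terms: each variable wants to be 1
    {("x0", "x1"): 2.0},           # quadratic penalty for both being 1
    0.0,
    dimod.BINARY,
)

sampler = LeapHybridSampler()       # requires Leap credentials (dwave config create)
result = sampler.sample(bqm)
print(result.first.sample, result.first.energy)   # best assignment found
```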

D-Wave and its partners are hoping the free access to quantum processing resources and quantum expertise will help uncover solutions to the COVID-19 crisis. We asked the company if there were any specific use cases it expects to bear fruit. D-Wave listed analyzing new methods of diagnosis, modeling the spread of the virus, supply distribution, and pharmaceutical combinations. D-Wave CEO Alan Baratz added a few more to the list.

"The D-Wave system, by design, is particularly well-suited to solve a broad range of optimization problems, some of which could be relevant in the context of the COVID-19 pandemic," Baratz told VentureBeat. "Potential applications that could benefit from hybrid quantum/classical computing include drug discovery and interactions, epidemiological modeling, hospital logistics optimization, medical device and supply manufacturing optimization, and beyond."

Earlier this month, Murray Thom, D-Wave's VP of software and cloud services, told us quantum computing and machine learning are extremely well matched. In today's press release, Prof. Dr. Kristel Michielsen from the Jülich Supercomputing Centre seemed to suggest a similar notion: "To make efficient use of D-Wave's optimization and AI capabilities, we are integrating the system into our modular HPC environment."

Read the original:

D-Wave makes its quantum computers free to anyone working on the coronavirus crisis - VentureBeat

Can Quantum Computing Be the New Buzzword – Analytics Insight

Quantum mechanics wrote its own chapter in the history of the early 20th century. With its regular binary computing twin going out of style, quantum mechanics has led quantum computing to become the new belle of the ball! While the memory used in a classical computer encodes binary bits, one and zero, quantum computers use qubits (quantum bits). A qubit is not confined to a two-state solution; it can also exist in superposition, i.e., qubits can be employed as 0, 1, or both 0 and 1 at the same time.

Hence a quantum computer can perform many calculations in parallel, owing to its ability to pursue simultaneous probabilities through superposition and to manipulate them with magnetic fields. A qubit's coefficients, which indicate how much zero-ness and one-ness it has, are complex numbers with real and imaginary parts. This provides a huge technical edge over conventional computing. The beauty of this is that if you have n qubits, you can have a superposition of 2^n states, or bits of information, simultaneously.
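
Spelled out in standard notation (an illustrative aside rather than a quote from the article), an n-qubit register in superposition is

$$
|\psi\rangle = \sum_{k=0}^{2^n - 1} c_k\,|k\rangle, \qquad \sum_{k} |c_k|^2 = 1,
$$

so a classical description must track 2^n complex coefficients c_k, which is exactly why the amount of represented information doubles with every added qubit.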

Another trick up its sleeve is that qubits are capable of pairing, which is referred to as entanglement. Here, the state of one qubit cannot be described independently of the state of the others, which produces correlations between the qubits no matter how far apart they are.

To quote American theoretical physicist John Wheeler, "If you are not completely confused by quantum mechanics, you do not understand it." So, without a doubt, it is safe to say that even quantum computing has a few pitfalls. First, qubits tend to lose the information they contain and also lose their entanglement, in other words, decoherence. Second, imperfections in quantum rotations. These lead to a loss of information within a few microseconds.

Ultimately, quantum computing is the trump card, as it promises to be a disruptive technology with dramatic speed improvements. This will enable systems to solve complex higher-order mathematical problems that earlier took months to compute, investigate material properties, design new ones, study superconductivity, and aid in drug discovery via simulation and the understanding of new chemical reactions.

This quantum shift in the history of computer science can also pave the way for encrypted communication (as keys cannot be copied nor hacked), much better than blockchain technology; provide improved designs for solar panels; predict financial markets; power big data mining; develop artificial intelligence to new heights; enhance meteorological updates; and usher in the much-anticipated age of the quantum internet. According to scientists, future advancements could also help find a cure for Alzheimer's.

The ownership and effective employment of a quantum computer could change the political and technological dynamics of the world. Computing power, in the end, is power, whether it is personal, national or globally strategic. In short, a quantum computer could be an existential threat to a nation that hasn't got one. At the moment Google, IBM, Intel, and D-Wave are pursuing this technology. And while there are scientific minds who don't yet believe in the potential of quantum computing, unless you are a time-traveler like Marty McFly in the Back to the Future series or one of the Doctors from Doctor Who, one cannot say what the future holds.

Read the original post:

Can Quantum Computing Be the New Buzzword - Analytics Insight

Q-CTRL to Host Live Demos of ‘Quantum Control’ Tools – Quantaneo, the Quantum Computing Source

Q-CTRL, a startup that applies the principles of control engineering to accelerate the development of the first useful quantum computers, will host a series of online demonstrations of new quantum control tools designed to enhance the efficiency and stability of quantum computing hardware.

Dr. Michael Hush, Head of Quantum Science and Engineering at Q-CTRL, will provide an overview of the company's cloud-based quantum control engineering software called BOULDER OPAL. This software uses custom machine learning algorithms to create error-robust logical operations in quantum computers. The team will demonstrate - using real quantum computing hardware in real time - how they reduce susceptibility to error by 100X and improve hardware stability in time by 10X, while reducing time-to-solution by 10X against existing software.

Scheduled to accommodate the global quantum computing research base, the demonstrations will take place:

April 16 from 4-4:30 p.m. U.S. Eastern Time (ET)

April 21 from 10-10:30 a.m. Singapore Time (SGT)

April 23 from 10-10:30 a.m. Central European Summer Time (CEST)

To register, visit https://go.q-ctrl.com/l/791783/2020-03-19/dk83

Released in Beta by Q-CTRL in March, BOULDER OPAL is an advanced Python-based toolkit for developers and R&D teams using quantum control in their hardware or theoretical research. Technology agnostic and with major computational grunt delivered seamlessly via the cloud, BOULDER OPAL enables a range of essential tasks which improve the performance of quantum computing and quantum sensing hardware. This includes the efficient identification of sources of noise and error, calculating detailed error budgets in real lab environments, creating new error-robust logic operations for even the most complex quantum circuits, and integrating outputs directly into real hardware.

The result for users is greater performance from todays quantum computing hardware, without the need to become an expert in quantum control engineering.

Experimental validations and an overview of the software architecture, developed in collaboration with the University of Sydney, were recently released in an online technical manuscript titled "Software Tools for Quantum Control: Improving Quantum Computer Performance through Noise and Error Suppression."

Go here to read the rest:

Q-CTRL to Host Live Demos of 'Quantum Control' Tools - Quantaneo, the Quantum Computing Source

We’re Getting Closer to the Quantum Internet, But What Is It? – HowStuffWorks

Back in February 2020, scientists from the U.S. Department of Energy's Argonne National Laboratory and the University of Chicago revealed that they had achieved quantum entanglement, in which the behavior of a pair of tiny particles becomes linked so that their states are identical, over a 52-mile (83.7-kilometer) quantum-loop network in the Chicago suburbs.

You may be wondering what all the fuss is about, if you're not a scientist familiar with quantum mechanics, that is, the behavior of matter and energy at the smallest scale of reality, which is peculiarly different from the world we can see around us.

But the researchers' feat could be an important step in the development of a new, vastly more powerful version of the internet in the next few decades. Instead of the bits that today's network uses, which can only express a value of either 0 or 1, the future quantum internet would utilize qubits of quantum information, which can take on an infinite number of values. (A qubit is the unit of information for a quantum computer; it's like a bit in an ordinary computer.)

That would give the quantum internet way more bandwidth, which would make it possible to connect super-powerful quantum computers and other devices and run massive applications that simply aren't possible with the internet we have now.

"A quantum internet will be the platform of a quantum ecosystem, where computers, networks, and sensors exchange information in a fundamentally new manner where sensing, communication, and computing literally work together as one entity, " explains David Awschalom via email. He's a spintronics and quantum information professor in the Pritzker School of Molecular Engineering at the University of Chicago and a senior scientist at Argonne, who led the quantum-loop project.

So why do we need this and what does it do? For starters, the quantum internet is not a replacement of the regular internet we now have. Rather it would be a complement to it or a branch of it. It would be able to take care of some of the problems that plague the current internet. For instance, a quantum internet would offer much greater protection from hackers and cybercriminals. Right now, if Alice in New York sends a message to Bob in California over the internet, that message travels in more or less a straight line from one coast to the other. Along the way, the signals that transmit the message degrade; repeaters read the signals, amplify and correct the errors. But this process allows hackers to "break in" and intercept the message.

However, a quantum message wouldn't have that problem. Quantum networks use particles of light photons to send messages which are not vulnerable to cyberattacks. Instead of encrypting a message using mathematical complexity, says Ray Newell, a researcher at Los Alamos National Laboratory, we would rely upon the peculiar rules of quantum physics. With quantum information, "you can't copy it or cut it in half, and you can't even look at it without changing it." In fact, just trying to intercept a message destroys the message, as Wired magazine noted. That would enable encryption that would be vastly more secure than anything available today.
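To give a flavor of why "looking changes the message", here is a heavily simplified, purely classical Python simulation of a BB84-style key exchange (our own toy model, not how Argonne's loop or any production QKD system is implemented). When an eavesdropper measures photons in randomly chosen bases, roughly a quarter of the compared bits disagree, which exposes the intrusion:

```python
import random

def bb84_round(n_photons=2000, eavesdrop=False):
    # Alice encodes random bits in randomly chosen bases (0 = rectilinear, 1 = diagonal).
    alice_bits  = [random.randint(0, 1) for _ in range(n_photons)]
    alice_bases = [random.randint(0, 1) for _ in range(n_photons)]

    bits_in_flight = alice_bits[:]
    if eavesdrop:
        # Eve measures each photon in a random basis; a wrong guess disturbs
        # the state, modeled here by re-sending a random bit.
        for i in range(n_photons):
            if random.randint(0, 1) != alice_bases[i]:
                bits_in_flight[i] = random.randint(0, 1)

    # Bob measures in his own random bases; a mismatched basis gives a random result.
    bob_bases = [random.randint(0, 1) for _ in range(n_photons)]
    bob_bits = [bit if bb == ab else random.randint(0, 1)
                for bit, ab, bb in zip(bits_in_flight, alice_bases, bob_bases)]

    # Keep only positions where Alice's and Bob's bases matched, then compare:
    # a significant error rate in the sample reveals the eavesdropper.
    kept = [(a, b) for a, b, ab, bb in zip(alice_bits, bob_bits, alice_bases, bob_bases)
            if ab == bb]
    return sum(1 for a, b in kept if a != b) / len(kept)

print("error rate without eavesdropper:", bb84_round(eavesdrop=False))  # ~0.00
print("error rate with eavesdropper:   ", bb84_round(eavesdrop=True))   # ~0.25
```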

"The easiest way to understand the concept of the quantum internet is through the concept of quantum teleportation," Sumeet Khatri, a researcher at Louisiana State University in Baton Rouge, says in an email. He and colleagues have written a paper about the feasibility of a space-based quantum internet, in which satellites would continually broadcast entangled photons down to Earth's surface, as this Technology Review article describes.

"Quantum teleportation is unlike what a non-scientist's mind might conjure up in terms of what they see in sci-fi movies, " Khatri says. "In quantum teleportation, two people who want to communicate share a pair of quantum particles that are entangled. Then, through a sequence of operations, the sender can send any quantum information to the receiver (although it can't be done faster than light speed, a common misconception). This collection of shared entanglement between pairs of people all over the world essentially constitutes the quantum internet. The central research question is how best to distribute these entangled pairs to people distributed all over the world. "

Once it's possible to do that on a large scale, the quantum internet would be so astonishingly fast that far-flung clocks could be synchronized about a thousand times more precisely than the best atomic clocks available today, as Cosmos magazine details. That would make GPS navigation vastly more precise than it is today, and map Earth's gravitational field in such detail that scientists could spot the ripple of gravitational waves. It also could make it possible to teleport photons from distant visible-light telescopes all over Earth and link them into a giant virtual observatory.

"You could potentially see planets around other stars, " says Nicholas Peters, group leader of the Quantum Information Science Group at Oak Ridge National Laboratory.

It also would be possible for networks of super-powerful quantum computers across the globe to work together and create incredibly complex simulations. That might enable researchers to better understand the behavior of molecules and proteins, for example, and to develop and test new medications.

It also might help physicists to solve some of the longstanding mysteries of reality. "We don't have a complete picture of how the universe works," says Newell. "We have a very good understanding of how quantum mechanics works, but not a very clear picture of the implications. The picture is blurry where quantum mechanics intersects with our lived experience."

But before any of that can happen, researchers have to figure out how to build a quantum internet, and given the weirdness of quantum mechanics, that's not going to be easy. "In the classical world you can encode information and save it and it doesn't decay," Peters says. "In the quantum world, you encode information and it starts to decay almost immediately."

Another problem is that because the amount of energy that corresponds to quantum information is really low, it's difficult to keep it from interacting with the outside world. Today, "in many cases, quantum systems only work at very low temperatures," Newell says. "Another alternative is to work in a vacuum and pump all the air out."

In order to make a quantum internet function, Newell says, we'll need all sorts of hardware that hasn't been developed yet. So it's hard to say at this point exactly when a quantum internet would be up and running, though one Chinese scientist has envisioned that it could happen as soon as 2030.

Read more here:

We're Getting Closer to the Quantum Internet, But What Is It? - HowStuffWorks

Devs: Alex Garland on Tech Company Cults, Quantum Computing, and Determinism – Den of Geek UK

Yet that difference between the common things a company can sell and the uncommon things they quietly develop is profoundly important. In Devs, the friendly exterior of Amaya, with its enormous statue of a child (a literal monument to Forest's lost daughter), is a public face to the actual profound work his Devs team is doing in a separate, highly secretive facility. Seemingly based in part on the mysterious research and development wings of tech giants (think Google's moonshot organizations at X Development and DeepMind), Devs is using quantum computing to change the world, all while keeping Forest's Zen ambition as its shield.

"I think it helps, actually," Garland says about Forest not being a genius. "Because I think what happens is that these [CEO] guys present as a kind of front between what the company is doing and the rest of the world, including the kind of inspection that the rest of the world might want on the company if they knew what the company was doing. So our belief and enthusiasm in the leader stops us from looking too hard at what the people behind-the-scenes are doing. And from my point of view that's quite common."

A lifelong man of words, Garland describes himself as a writer with a layman's interest in science. Yet it's fair to say he studies almost obsessively whatever field of science he's writing about, which now pertains to quantum computing. A still largely unexplored frontier in the tech world, quantum computing is the use of technology to apply quantum-mechanical phenomena to data a traditional computer could never process. It's still so unknown that Google AI and NASA published a paper only six months ago in which they claimed to have achieved quantum supremacy (the creation of a quantum device that can actually solve problems a classical computer cannot).

"Whereas binary computers work with gates that are either a one or a zero, a quantum qubit [a basic unit of measurement] can deal with a one and a zero concurrently, and all points in between," says Garland. "So you get a staggering amount of exponential power as you start to run those qubits in tandem with each other." What the filmmaker is especially fascinated by is using a quantum system to model another quantum system. That is to say, using a quantum computer with true supremacy to solve other theoretical problems in quantum physics. "If we use a binary way of doing that, you're essentially using a filing system to model something that is emphatically not binary."

So in Devs, quantum computing is a gateway into a hell of a trippy concept: a quantum computer so powerful that it can analyze the theoretical data of everything that has or will occur. In essence, Forest and his team are creating a time machine that can project through a probabilistic system how events happened in the past, will happen in the future, and are happening right now. It thus acts as an omnipotent surveillance system far beyond any neocon's dreams.

Read the rest here:

Devs: Alex Garland on Tech Company Cults, Quantum Computing, and Determinism - Den of Geek UK

Who Will Mine Cryptocurrency in the Future – Quantum Computers or the Human Body? – Coin Idol

Apr 01, 2020 at 09:31 // News

Companies including Microsoft, IBM and Google are racing to come up with cheap and effective mining solutions to improve cost and energy efficiency. Lots of fuss has been made around quantum computing and its potential for mining. Now, the time has come for a new solution - mining with the help of human body activity.

While quantum computers are said to be able to hack bitcoin mining algorithms, using physical activity for the process is quite a new and extraordinary thing. The question is, which technology turns out to be more efficient?

Currently, with traditional cryptocurrency mining methods, the reward for mining a bitcoin block is around 12.5 bitcoins; at roughly $4k per BTC, the investment should quickly pay off after mining a few blocks.

Consequently, the best mining method for now is to keep trying random numbers and observe which one hashes to a value that isn't more than the target difficulty, as sketched below. This is one of the reasons mining pools have arisen, in which multiple PCs work in parallel to look for the proper solution to the problem; if one of the PCs finds the solution, the pool is given the reward, which is then shared among all the miners.
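To make that loop concrete, here is a minimal proof-of-work sketch in Python (illustrative only; real Bitcoin mining hashes 80-byte block headers with double SHA-256 against a far harder target):

```python
import hashlib

def mine(block_data: str, difficulty_bits: int = 18):
    """Try successive nonces until the SHA-256 hash falls below the target."""
    target = 2 ** (256 - difficulty_bits)   # smaller target = harder puzzle
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}:{nonce}".encode()).hexdigest()
        if int(digest, 16) < target:
            return nonce, digest
        nonce += 1

nonce, digest = mine("example block")
print(f"found nonce {nonce} with hash {digest}")
```

Mining pools simply split this nonce search space across many machines running a loop like the one above.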

Quantum computers possess more capacity and might potentially be able to speed up mining significantly while eliminating the need for numerous machines. Thus, they could improve both the energy efficiency and the speed of mining.

In late 2019, Google unveiled a quantum processor called Sycamore, which completed a benchmark task many times faster than an existing supercomputer. There was even a post on Medium claiming that this new processor would be able to mine all remaining bitcoins in something like two seconds. Sometime later the post was deleted due to an error in the calculations, according to the Bitcoinist news outlet.

Despite quantum computing having the potential to increase the efficiency of mining, its cost is close to stratospheric. It would probably take time before someone is able to afford it.

Meanwhile, another global tech giant, Microsoft, offers a completely new and extraordinary solution - to mine cryptos using a person's brain waves or body temperature. As coinidol.com, a world blockchain news outlet, has reported, the company has filed a patent for a groundbreaking system which can mine digital currencies using data collected from human beings as they view ads or do exercises.

The IT giant disclosed that sensors could detect activity connected with particular pieces of work, such as the time taken to read advertisements, and convert it into digital information readable by a computing device to do computational work, in the same manner as a conventional proof-of-work (PoW) system. Some tasks would either decrease or increase the computational energy required, depending on the amount of information produced by the user's activity.

So far, there is no indication of when Microsoft will start developing the system, and it is still uncertain whether or not the system will be built on its own blockchain network. Quantum computing also needs time to be fully developed and deployed.

However, both solutions carry significant potential for transforming the entire mining industry. While quantum computing could boost the existing mining mechanism and eliminate energy-hungry mining farms, Microsoft's new initiative could disrupt the industry and make it look entirely different.

Which of these two solutions turns out to be more viable? We will see over time. What do you think about these mining solutions? Let us know in the comments below!

Read more:

Who Will Mine Cryptocurrency in the Future - Quantum Computers or the Human Body? - Coin Idol

The Schizophrenic World Of Quantum Interpretations – Forbes

Quantum Interpretations

To the average person, most quantum theories sound strange, while others seem downright bizarre. There are many diverse theories that try to explain the intricacies of quantum systems and how our interactions affect them. And, not surprisingly, each approach is supported by its group of well-qualified and well-respected scientists. Here, we'll take a look at the two most popular quantum interpretations.

Does it seem reasonable that you can alter a quantum system just by looking at it? What about creating multiple universes by merely making a decision? Or what if your mind split because you measured a quantum system?

You might be surprised that all or some of these things might routinely happen millions of times every day without you even realizing it.

But before your brain gets twisted into a knot, let's cover a little history and a few quantum basics.

The birth of quantum mechanics

Classical physics describes how large objects behave and how they interact with the physical world. On the other hand, quantum theory is all about the extraordinary and inexplicable interaction of small particles on the invisible scale of such things as atoms, electrons, and photons.

Max Planck, a German theoretical physicist, first introduced the quantum theory in 1900. It was an innovation that won him the Nobel Prize in physics in 1918. Between 1925 and 1930, several scientists worked to clarify and understand quantum theory. Among the scientists were Werner Heisenberg and Erwin Schrödinger, both of whom mathematically expanded quantum mechanics to accommodate experimental findings that couldn't be explained by standard physics.

Heisenberg, along with Max Born and Pascual Jordan, created a formulation of quantum mechanics called matrix mechanics. This concept interpreted the physical properties of particles as matrices that evolved in time. A few months later, Erwin Schrödinger created his famous wave mechanics.

Although Heisenberg and Schrödinger worked independently from each other, and although their theories were very different in presentation, both theories were essentially mathematically the same. Of the two formulations, Schrödinger's was more popular than Heisenberg's because it boiled down to familiar differential equations.

While today's physicists still use these formulations, they continue to debate their actual meaning.

First weirdness

A good place to start is Schrödinger's equation.

Erwin Schrödinger's equation provides a mathematical description of all possible locations and characteristics of a quantum system as it changes over time. This description is called the system's wave function. According to the most common quantum theory, everything has a wave function. The quantum system could be a particle, such as an electron or a photon, or even something larger.

Schrödinger's equation won't tell you the exact location of a particle. It only reveals the probability of finding the particle at a given location. A particle being in many places or in many states at the same time is called superposition. Superposition is one of the elements of quantum computing that makes it so powerful.
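For readers who want the formulas behind those statements, the time-dependent Schrödinger equation, the Born rule that turns a wave function into probabilities, and a single-qubit superposition can be written compactly (a standard textbook summary added here for reference, not notation used by the article itself):

```latex
i\hbar\,\frac{\partial}{\partial t}\,\Psi(x,t) = \hat{H}\,\Psi(x,t),
\qquad
P(x,t) = \lvert\Psi(x,t)\rvert^{2},
\qquad
\lvert\psi\rangle = \alpha\lvert 0\rangle + \beta\lvert 1\rangle,\;\;
\lvert\alpha\rvert^{2} + \lvert\beta\rvert^{2} = 1 .
```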

Almost everyone has heard about Schrödinger's cat in a box. Simplistically, ignoring the radiation gadgets, while the cat is in the closed box, it is in a superposition of being both dead and alive at the same time. Opening the box causes the cat's wave function to collapse into one of two states, and you'll find the cat either alive or dead.

There is little dispute among the quantum community that Schrödinger's equation accurately reflects how a quantum wave function evolves. However, the wave function itself, as well as the cause and consequences of its collapse, are all subjects of debate.

David Deutsch is a brilliant British quantum physicist at the University of Oxford. In his book, The Fabric of Reality, he said: "Being able to predict things or to describe them, however accurately, is not at all the same thing as understanding them. Facts cannot be understood just by being summarized in a formula, any more than being listed on paper or committed to memory."

The Copenhagen interpretation

Quantum theories use the term "interpretation" for two reasons. One, it is not always obvious what a particular theory means without some form of translation. And, two, we are not sure we understand what goes on between a wave function's starting point and where it ends up.

There are many quantum interpretations. The most popular is the Copenhagen interpretation, named for the city where Werner Heisenberg and Niels Bohr developed their quantum theory.

Werner Heisenberg (left) with Niels Bohr at a Conference in Copenhagen in 1934.

Bohr believed that the wave function of a quantum system contained all possible quantum states. However, when the system was observed or measured, its wave function collapsed into a single state.

What's unique about the Copenhagen interpretation is that it makes the outside observer responsible for the wave function's ultimate fate. Almost magically, a quantum system, with all its possible states and probabilities, has no connection to the physical world until an observer interacts with or measures the system. The measurement causes the wave function to collapse into one of its many states.

You might wonder what happens to all the other quantum states present in the wave function before it collapsed. The Copenhagen interpretation offers no explanation of that mystery. However, there is a quantum interpretation that provides an answer to that question. It's called the Many-Worlds Interpretation, or MWI.

Billions of you?

Because the many-worlds interpretation is one of the strangest quantum theories, it has become central to the plot of many science fiction novels and movies. At one time, MWI was an outlier within the quantum community, but many leading physicists now believe it is the only theory that is consistent with quantum behavior.

The MWI originated in a Princeton doctoral thesis written by a young physicist named Hugh Everett in the late 1950s. Even though Everett derived his theory using sound quantum fundamentals, it was severely criticized and ridiculed by most of the quantum community. Even Everett's academic adviser at Princeton, John Wheeler, tried to distance himself from his student. Everett became despondent over the harsh criticism. He eventually left quantum research to work for the government as a mathematician.

The theory proposes that the universe has a single, large wave function that follows Schrödinger's equation. Unlike the Copenhagen interpretation, the MWI universal wave function doesn't collapse.

Everything in the universe is quantum, including ourselves. As we interact with parts of the universe, we become entangled with it. As the universal wave function evolves, some of our superposition states decohere. When that happens, our reality becomes separated from the other possible outcomes associated with that event. Just to be clear, the universe doesn't split and create a new universe. The probability of all realities, or universes, already exists in the universal wave function, all occupying the same space-time.

Schrödinger's cat, many-worlds interpretation, with universe branching: a visualization of the separation of the universe due to two superposed and entangled quantum mechanical states.

In the Copenhagen interpretation, by opening the box containing Schrödinger's cat, you cause the wave function to collapse into one of its possible states, either alive or dead.

In the Many-Worlds interpretation, the wave function doesn't collapse. Instead, all probabilities are realized. In one universe, you see the cat alive, and in another universe the cat will be dead.

Right or wrong decisions become right and wrong decisions

Decisions are also events that trigger the separation of multiple universes. We make thousands of big and little choices every day. Have you ever wondered what your life would be like had you made different decisions over the years?

According to the Many-Worlds interpretation, you and all those unrealized decisions exist in different universes, because all possible outcomes exist in the universal wave function. For every decision you make, at least two of "you" evolve on the other side of that decision. One universe exists for the choice you make, and one universe for the choice you didn't make.

If the Many-Worlds interpretation is correct, then right now a near-infinite number of versions of you are living different and independent lives in their own universes. Moreover, those universes overlay one another and occupy the same space and time.

It is also likely that you are currently living in a branch universe spun off from a decision made by a previous version of yourself, perhaps millions or billions of iterations ago. You have all the old memories of your pre-decision self, but as you move forward in your own universe, you live independently and create your own unique and new memories.

A Reality Check

Which interpretation is correct? Copenhagen or Many-Worlds? Maybe neither. But because quantum mechanics is so strange, perhaps both are correct. It is also possible that a valid interpretation is yet to be expressed. In the end, correct or not, quantum interpretations are just plain fun to think about.

Note: Moor Insights & Strategy writers and editors may have contributed to this article.

Originally posted here:

The Schizophrenic World Of Quantum Interpretations - Forbes

Disrupt The Datacenter With Orchestration – The Next Platform

Since 1965, the computer industry has relied on Moore's Law to accelerate innovation, pushing more transistors into integrated circuits to improve computation performance. Making transistors smaller helped lift all boats for the entire industry and enable new applications. At some point, we will reach a physical limit, that is, a limit stemming from physics itself. Even as that limit approaches, improvements have kept on pace thanks to increased parallelism of computation and the consolidation of specialized functions into single chip packages, such as systems on chip (SoCs).

In recent years, we have been nearing another peak. This article proposes to improve computation performance not only by building better hardware, but by changing how we use existing hardware, more specifically, by focusing on how we use existing processor types. I call this approach Compute Orchestration: automatic optimization of machine code to best use the modern datacenter hardware (again, with special emphasis on different processor types).

So what is compute orchestration? It is the embracing of hardware diversity to support software.

There are many types of processors: Microprocessors in small devices, general purpose CPUs in computers and servers, GPUs for graphics and compute, and programmable hardware like FPGAs. In recent years, specialized processors like TPUs and neuromorphic processors for machine learning are rapidly entering the datacenter.

There is potential in this variety: Instead of statically utilizing each processor for pre-defined functions, we can use existing processors as a swarm, each processor working on the most suitable workloads. Doing that, we can potentially deliver more computation bandwidth with less power, lower latency and lower total cost of ownership.

Non-standard utilization of existing processors is already happening: GPUs, for example, were already adapted from processors dedicated to graphics into a core enterprise component. Today, GPUs are used for machine learning and cryptocurrency mining, for example.

I call the technology to utilize the processors as a swarm Compute Orchestration. Its tenets can be described in four simple bullets:

Compute orchestration is, in short, automatic adaptation of binary code and automatic allocation to the most suitable processor types available. I split the evolution of compute orchestration into four generations:

Compute Orchestration Gen 1: Static Allocation To Specialized Co-Processors

This type of compute orchestration is everywhere. Most devices today include co-processors to offload some specialized work from the CPU. Usually, the toolchain or runtime environment takes care of assigning workloads to the co-processor. This is seamless to the developer, but also limited in functionality.

Best known example is the use of cryptographic co-processors for relevant functions. Being liberal in our definitions of co-processor, Memory Management Units (MMUs) to manage virtual memory address translation can also be considered an example.

Compute Orchestration Gen 2: Static Allocation, Heterogeneous Hardware

This is where we are now. In the second generation, the software relies on libraries, dedicated runtime environments and VMs to best use the available hardware. Let's call the collection of components that help better use the hardware "frameworks". Current frameworks implement specific code to better use specific processors. Most prevalent are frameworks that know how to utilize GPUs in the cloud. Usually, better allocation to bare metal hosts remains the responsibility of the developer. For example, the developer/DevOps engineer needs to make sure a machine with a GPU is available for the relevant microservice. This phenomenon is what brought me to think of Compute Orchestration in the first place, as it proves there is more slack in our current hardware.

Common frameworks like OpenCL allow programming compute kernels to run on different processors. TensorFlow allows assigning nodes in a computation graph to different processors (devices).
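As a concrete, hedged illustration of that kind of explicit device assignment, the TensorFlow 2.x snippet below lists the devices the runtime can see, pins one matrix multiplication to the CPU and, if a GPU is present (an assumption about the host), runs the same operation there:

```python
import tensorflow as tf

# Discover what the runtime can see on this host.
print("CPUs:", tf.config.list_physical_devices("CPU"))
print("GPUs:", tf.config.list_physical_devices("GPU"))

a = tf.random.uniform((1024, 1024))
b = tf.random.uniform((1024, 1024))

# Explicitly place an op on the CPU...
with tf.device("/CPU:0"):
    c_cpu = tf.matmul(a, b)

# ...and, when a GPU exists, place the same op on the GPU instead.
if tf.config.list_physical_devices("GPU"):
    with tf.device("/GPU:0"):
        c_gpu = tf.matmul(a, b)
```

Note that the placement is still chosen by the developer, which is exactly the limitation the later generations aim to remove.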

This better use of hardware by using existing frameworks is great. However, I believe there is a bigger edge. Existing frameworks still require effort from the developer to be optimal; they rely on the developer. Also, no legacy code from 2016 (for example) is ever going to utilize a modern datacenter GPU cluster. My view is that by developing automated and dynamic frameworks that adapt to the hardware and workload, we can achieve another leap.

Compute Orchestration Gen 3: Dynamic Allocation To Heterogeneous Hardware

Computation can take an example from the storage industry: Products for better utilization and reliability of storage hardware have innovated for years. Storage startups develop abstraction layers and special filesystems that improve efficiency and reliability of existing storage hardware. Computation, on the other hand, remains a stupid allocation of hardware resources. Smart allocation of computation workloads to hardware could result in better performance and efficiency for big data centers (for example hyperscalers like cloud providers). The infrastructure for such allocation is here, with current data center designs pushing to more resource disaggregation, introduction of diverse accelerators, and increased work on automatic acceleration (for example: Workload-aware Automatic Parallelization for Multi-GPU DNN Training).

For high-level resource management, we already have automatic allocation. For example, project Mesos focuses on fine-grained resource sharing, Slurm handles cluster management, and there are several extensions using Kubernetes operators.

To further advance from here would require two steps: automatic mapping of available processors (which we call the compute environment) and workload adaptation. Imagine a situation where the developer doesn't have to optimize her code to the hardware. Rather, the runtime environment identifies the available processing hardware and automatically optimizes the code. Cloud environments are heterogeneous and changing, and the code should change accordingly (in fact it's not the code, but the execution model in the run time environment of the machine code).
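A toy sketch of what that runtime decision could look like, in Python (entirely our own illustration under assumed names, not an existing framework): map the processors that happen to be available, then dispatch each workload to the most suitable one, falling back to the CPU when nothing better is present.

```python
from dataclasses import dataclass
from typing import Callable, Dict, Iterable

@dataclass
class Processor:
    name: str
    kind: str          # e.g. "cpu", "gpu", "fpga"
    available: bool

def discover_compute_environment() -> Dict[str, Processor]:
    # In a real system this map would be built by probing the host or cluster;
    # the hard-coded entries here are assumptions for illustration.
    return {
        "cpu0": Processor("cpu0", "cpu", available=True),
        "gpu0": Processor("gpu0", "gpu", available=False),
    }

def dispatch(workload: Callable, preferred_kinds: Iterable[str],
             env: Dict[str, Processor]):
    # Walk the workload's preference list and run it on the first
    # available processor of a matching kind.
    for kind in preferred_kinds:
        for proc in env.values():
            if proc.kind == kind and proc.available:
                print(f"running {workload.__name__} on {proc.name}")
                return workload()
    raise RuntimeError("no suitable processor available")

def matrix_heavy_job():
    return sum(i * i for i in range(1_000))

env = discover_compute_environment()
dispatch(matrix_heavy_job, preferred_kinds=["gpu", "cpu"], env=env)
```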

Compute Orchestration Gen 4: Automatic Allocation To Dynamic Hardware

"A thought, even a possibility, can shatter and transform us." (Friedrich Wilhelm Nietzsche)

The quote above is to say that we are far from a practical implementation of the concept described here (as far as I know). We can, however, imagine a technology that dynamically re-designs a data center to serve the needs of running applications. This change in the way whole data centers meet computation needs has already started. FPGAs are used more often and appear in new places (FPGAs in hosts, FPGA machines in AWS, SmartNICs), providing the framework for constant reconfiguration of hardware.

To illustrate the idea, I will use an example: Microsoft initiated Project Catapult, augmenting CPUs with an interconnected and configurable compute layer composed of programmable silicon. The timeline on the project's website is fascinating. The project started off in 2010, aiming to improve search queries by using FPGAs. Quickly, it proposed the use of FPGAs as "bumps in the wire", adding computation in new areas of the data path. Project Catapult also designed an architecture for using FPGAs as a distributed resource pool serving the whole data center. Then, the project spun off Project BrainWave, utilizing FPGAs for accelerating AI/ML workloads.

This was just an example of innovation in how we compute. Quick online search will bring up several academic works on the topic. All we need to reach the 4th generation is some idea synthesis, combining a few concepts together:

Low effort HDL generation (for example Merlin compiler, BORPH)

In essence, what I am proposing is to optimize computation by adding an abstraction layer that:

Automatic allocation on agile hardware is the recipe for best utilizing existing resources: faster, greener, cheaper.

The trends and ideas mentioned in this article can lead to many places. It is very likely that we are already working with existing hardware in the optimal way. It is my belief that we are in the midst of the improvement curve. In recent years, we had increased innovation in basic hardware building blocks, new processors for example, but we still have room to improve in overall allocation and utilization. The more we deploy new processors in the field, the more slack we have in our hardware stack. New concepts, like edge computing and resource disaggregation, bring new opportunities for optimizing legacy code by smarter execution. To achieve that, legacy code can't be expected to be refactored. Developers and DevOps engineers can't be expected to optimize for the cloud configuration. We just need to execute code in a smarter way, and that is the essence of compute orchestration.

The conceptual framework described in this article should be further explored. We first need to find the killer app (what type of software we optimize to which type of hardware). From there, we can generalize. I was recently asked in a round table what is the next generation of computation? Quantum computing? Tensor Processor Units? I responded that all of the above, but what we really need is better usage of the existing generation.

Guy Harpak is the head of technology at Mercedes-Benz Research & Development in its Tel Aviv, Israel facility. Please feel free to contact him with any thoughts on the topics above at harpakguy@gmail.com. Harpak notes that this contributed article reflects his personal opinion and is in no way related to people or companies that he works with or for.

Related Reading: If you find this article interesting, I would recommend researching the following topics:

Some interesting articles on similar topics:

Return Of The Runtimes: Rethinking The Language Runtime System For The Cloud 3.0 Era

The Deep Learning Revolution And Its Implications For Computer Architecture And Chip Design (by Jeffrey Dean from Google Research)

Beyond SmartNICs: Towards A Fully Programmable Cloud

Hyperscale Cloud: Reimagining Datacenters From Hardware To Applications

Read more:

Disrupt The Datacenter With Orchestration - The Next Platform

1000 Words or So About The New QuantumAI Scam – TechTheLead

To most of us, Elon Musk is the real-life embodiment of Tony Stark. He started from nothing, more or less, and is now a leader in some corners of the tech world. He started small, with an archive of newspapers and magazines, and in no time he was in space. SpaceX, that is.

So what is quantum computing? In a nutshell, if you networked all the PCs on the planet right now and put them to work, the resulting power would still not be sufficient to run the complex calculations that a quantum computer can handle.

Now, Elon has supposedly decided to withdraw from operating Tesla and SpaceX and move on to the next big chapter in his life: quantum computing, a venture that has seen investments of over 2 billion dollars in just the past few years.

On the other hand, Elon never announced anything on Twitter, and that made us wonder. He left SpaceX and Tesla for this? It must be a scam! And it was. One that wants data and personal info.

On top of that, if some untrusted sources are to be believed, the project is LIVE right now, beating companies like Microsoft and IBM to the punch and delivering QuantumAI. Or, as "Elon" puts it: a new way of redistributing the world's wealth.

The scammers claim that the 1 percent who control 90 percent of the world's financial capital can share their gains and help normal people grow their wealth by using quantum computing. This theory has been thrown around even in the times of Moore, and the scam now presents it as a reality for everybody on Earth.

This time the greedy scammers have raised the bar. The scam group used real footage of Elon Musk talking about his companies, but they superimposed different audio, making sure to steer people toward the fake QuantumAI investment platform and automated trading app.

The group responsible for masterminding this charade is part of a bigger affiliate network, and they specialize in advertising on social media platforms like Facebook and Twitter. These networks operate in cooperation with rogue offshore brokers who pay referral money for investing clients. You are the investing client in this case!

And the rabbit hole goes deeper. When you sign in, you are signed up with a broker, in this case Crypto Kartal, owned by Elmond Enterprise Ltd, a company located in St. Vincent and the Grenadines that also has an office in Estonia, where it is named Fukazawa Partnership OU. The QuantumAI scam is particularly shoddy because it combines two highly effective baiting systems: social media and video manipulation. And Facebook and Twitter are disseminating the message right now in some regions.

According to the scam, this iteration of QuantumAI hopes to make people 2 to 3 times wealthier, and no one, except the super-wealthy, will take a hit.

How do they do that? Well, the process is simpler than you can imagine. The wealthy keep their investments in bonds and stocks that they trade for a profit on the open market.

Here is the part where QuantumAI makes a power move that can affect the super-rich. The scam promises to beat Wall Street traders to the market, making winning trades before the brokers can react or intercept transactions. And with a quantum computer, you can do that! Well, as long as you have a working quantum computer that is!

Sounds super interesting, right? But it's a scam! All you need to do is run a reverse image search on the pictures on the website and, maybe, try to find out whether the brokers involved in this enterprise have had any scam reports or alerts in the past. Doubt everything, search everything before you input any of your data, and on top of it all NEVER use your main email and password. It's safer to use a throwaway address or create a new one.

Be careful in these times. The QuantumAI software, app, and fraudulent crypto trading platform falsely attributed to Elon Musk is completely blacklisted. But Facebook runs the ads with no remorse, with the scammers switching between fake Guardian and CNN articles. So be aware!

See the article here:

1000 Words or So About The New QuantumAI Scam - TechTheLead

Quantum Computing strikes technology partnership with Splunk – Proactive Investors USA & Canada

Initial efforts with San Francisco's Splunk will focus on three key challenges: network security, dynamic logistics and scheduling

Quantum Computing Inc (OTC:QUBT), an advanced technology company developing quantum-ready applications and tools, said Tuesday that it has struck a technology alliance partnership with Splunk.

San Francisco, California-based Splunk creates software for searching, monitoring, and analyzing machine-generated big data via a web-style interface.

Meanwhile, staffed by experts in mathematics, quantum physics, supercomputing, financing and cryptography, Leesburg, Virginia-based Quantum Computing is developing an array of applications to allow companies to exploit the power of quantum computing to their advantage. It is a leader in the development of quantum ready software with deep experience developing applications and tools for early quantum computers.

Splunk brings a leading big-data-analytics platform to the partnership, notably existing capabilities in its Machine/Deep Learning Toolkit in current use by Splunk customers, said the company.

Implementation of quantum computing applications will be significantly accelerated by tools that allow the development and execution of applications independent of any particular quantum computing architecture.

"We are excited about this partnership opportunity," said Quantum Computing CEO Robert Liscouski. "Splunk is a proven technology leader with over 17,500 customers worldwide that has the potential to provide great opportunities for QCI's quantum-ready software technologies."

Both the companies will partner to do fundamental and applied research and develop analytics that exploit conventional large-data cybersecurity stores and data-analytics workflows, combined with quantum-ready graph and constrained-optimization algorithms.

The company explained that these algorithms will initially be developed using Quantum's Mukai software platform, which enables quantum-ready algorithms to execute on classical hardware and also to run without modification on quantum computing hardware when ready.

Once proofs of concept are completed, QCI and Splunk will develop new analytics with these algorithms in the Splunk data-analytics platform, to evaluate quantum analytics readiness on real-world data, noted the company.

The Splunk platform/toolkits help customers address challenging analytical problems via neural nets or custom algorithms, extensible to Deep Learning frameworks through an open source approach that incorporates existing and custom libraries.

"The initial efforts of our partnership with Splunk will focus on three key challenges: network security, dynamic logistics and scheduling," said Quantum Computing.

Contact the author Uttara Choudhury at [emailprotected]

Follow her on Twitter: @UttaraProactive

Read more:

Quantum Computing strikes technology partnership with Splunk - Proactive Investors USA & Canada

Faster, better, stronger: The next stage of global communications networks – Siliconrepublic.com

Prof Bogdan Staszewski from UCD's IoE2 Lab looks at the future of global communications, from faster networks and more powerful computing to the challenges of energy and cybersecurity.

I am an engineer, and an engineer's job is to design new solutions for building and making things. We engineers concern ourselves with what goes on below the surface, with the building blocks that make up the world in which we live and work, which is constantly evolving.

As electrical and electronics engineers, my colleagues and I work in a microscopic world of integrated circuits the hardware at the deepest level of the networks with which we interact every day and on which we have come to rely.

Life today revolves around these networks. From global communications to the movement of money, we rely on the fast and secure transmission of quintillions of bits of data every day. And as technological and economic progress is made, there are ever more demands for capacity in these networks, and for ever greater speed, efficiency and security.

The possibilities created by increased connectedness have led to simple but profound challenges. In network terms, how to send the greatest amount of data in the shortest time, while reducing the power requirement and cost, is chief among them.

Internet of things (IoT) networks are helping to address major societal challenges. Water regulation in agriculture in drought regions such as California, and dyke and canal infrastructure management in the Netherlands, are just two examples. These systems, underpinned by networks of sensors and microprocessors capable of wireless connectivity and energy scavenging, have vastly improved efficiency and delivered numerous benefits.

We are looking to even more advanced applications of these technologies, such as autonomously driven vehicles and robotic surgery. We are designing technology that could either completely replace humans or watch and take over when the driver or surgeon gets too tired or distracted.

We are envisaging vehicles that can communicate among themselves and a traffic coordinator to ensure smooth traffic flow with no need for traffic lights. We are preparing for autonomous operating rooms where robotic surgeons can be directed remotely by human surgeons in another country.

This is technology that could deliver superior and safer performance than error-prone human operation, but which is entirely dependent on unimpeachable network speed, efficiency and security that has not yet been achieved.

It is predicted that connected autonomously driven vehicles will eliminate traffic and accidents. We can imagine insurance premiums going down substantially. Of course, we can also imagine an utter disaster if a hacker was able to sneak into these networks, or if an uplink failed at the wrong moment while crucial information was being transmitted.

Hence, the network must be super fast, super secure and have enough bandwidth.

The view from the core of this technology offers a unique perspective on these challenges. Like physicists and geneticists, electrical and electronics engineers look for answers in ever smaller parts inside our networks.

Energy supply and consumption is at the heart of big societal challenges and so too is it one of the most critical considerations for IoT applications. My colleagues and I in the IoE2 Lab at University College Dublin are currently tackling this problem using the latest nanoscale CMOS (complementary metal oxide semiconductor) technologies, in pursuit of a common ultra-low-power system-on-chip hardware platform.

This means an integrated computer and electronics system containing a CPU, memory, and digital, analog, mixed-signal and radio frequency signal processing functions all on one microchip.

Prof Bogdan Staszewski. Image: UCD

As an aside, there's a lot of interest in radio frequency integrated circuit (RFIC) research now because it offers a huge cost benefit for system-on-chip solutions, and this will only grow along with the pervasiveness of wireless capabilities in electronics.

Success in this research knows no pinnacle; it is just constantly evolving. We started with 1G and 2G wireless communication. Then came 3G and 4G. Nowadays the carriers are installing 5G networks, but researchers are working on 6G even though there is no agreement about what it will be. That's the journey that makes us all excited.

The focus will remain on reducing power consumption and increasing performance, so that we can move towards IoT network applications that can perform more and more complex tasks. Power and capacity are key.

The need to economise power consumption is well understood, for a variety of practical, environmental and socio-economic reasons. Data, however, is a less familiar commodity in our world, in spite of the volume we generate on a daily basis, almost universally. And IoT is also greatly accelerating the demands for bandwidth in our networks, which in turn creates issues around equality of access and the enabling of future technology.

At IoE2, we're looking at the problem of so many wireless devices coexisting in extremely congested networks, and the solution is cooperative wireless.

Like physicists and geneticists, electrical and electronics engineers look for answers in ever smaller parts inside our networks

Cooperative networks are at the foundation of IoT. At the system level, this means algorithms, components and software needed to make them energy and bandwidth-efficient. But at the physical layer beneath, we need hugely flexible nodes that can operate in an intelligent and cooperative manner.

To put this in context, a single ant cannot possibly do anything useful but the whole colony of ants are physically able to lift an elephant if they work in collaboration. Even a simple IoT node can do wonders if connected to a large network.

For instance, Swarm's constellation of nanosatellites has helped harness the potential of IoT networks and their thousands of devices and billions of bits of data. Each nanosatellite is small and rather dumb but, in collaboration with others, they can execute quite sophisticated tasks and at a fraction of the cost of existing networks linked to broadband internet satellites.

Of course, enhancing capacity and enabling technology also requires enhanced security, especially as our networks become capable of storing more and more data.

We have found ways to increase security at the sub-system level, by creating tamper-proof ROM (read-only memory) and microchips that cannot be reverse engineered. We make increasingly sophisticated chips and memory that are perfected to be error-free and operable throughout their lifetime without updates or patches.

But the journey to advance and secure our networks has passed beyond the world of microelectronics, into the quantum world a world of the sub-atomically small. It would be fair to say this is the next real game-changer for ICT and will even surpass the invention of the integrated circuit itself.

While quantum computing will probably remain aloof from most people, the technology arising from its development will have major implications for society and for the evolution of communications and future networks.

In simple terms, by exploiting quantum mechanics, a quantum computer takes mere seconds or minutes to crack an algorithm that classical computers would take lifetimes to crack. The power of this technology is transformational. It underpins the only form of communication that is provably unhackable and uninterceptable, heralding a new age of data security.

However, the development of quantum technologies will drive quantum communication and destabilise traditional networks. While only the military and the proverbial Swiss banks need these super-secure communications for now, the eventual growing use of quantum computing will render normal encryption virtually useless, creating the need for a global rewrite of our networks' security.

This technology is only a few years away. And even though the major hype of research remains on quantum computing rather than its application in other fields such as communications, its arrival will profoundly change the world as we know it.

Until then, all the possibilities of our future networks will rely on us building upon current technologies to make the communication pipe bigger and cheaper, making our networks better and faster, with less power.

By Prof Bogdan Staszewski

Prof Bogdan Staszewski is a professor of electronic circuits at the UCD School of Electrical and Electronic Engineering and Delft University of Technology in the Netherlands. He is part of the IoE2 Lab within the UCD Centre for Internet of Things Engineering and co-founder of Equal1 Labs, conducting research to build the world's first practical single-chip CMOS quantum computer.

View post:

Faster, better, stronger: The next stage of global communications networks - Siliconrepublic.com