World High Performance Computing (HPC) Markets to 2025 – AI, IoT, and 5G will be Major Drivers for HPC Growth as they Facilitate the Need to Process…

DUBLIN, Jan. 9, 2020 /PRNewswire/ -- The "High Performance Computing (HPC) Market by Component, Infrastructure, Services, Price Band, HPC Applications, Deployment Types, Industry Verticals, and Regions 2020-2025" report has been added to ResearchAndMarkets.com's offering.

This report evaluates the HPC market, including companies, solutions, use cases, and applications. Analysis includes HPC by organizational size, software and system type, server type, price band, and industry vertical. The report also assesses the market for integrating various artificial intelligence technologies into HPC, and evaluates the exascale-level HPC market by component, hardware type, service type, and industry vertical.

High Performance Computing (HPC) may be provided via a supercomputer or via parallel processing techniques such as leveraging clusters of computers to aggregate computing power. HPC is well-suited for applications that require high performance data computation such as certain financial services, simulations, and various R&D initiatives.

The demand side of the market is currently dominated by large corporations, universities, and government institutions, whose HPC capabilities are typically applied to very specific problems. Examples include financial services organizations, government R&D facilities, and university research programs.

However, the cloud-based as-a-Service model allows HPC offerings to be extended via HPC-as-a-Service (HPCaaS) to a much wider range of industry verticals and companies, thereby providing computational services to solve a much broader array of problems. Industry use cases that benefit from HPC-level computing are increasingly emerging, many of which split processing between a localized device/platform and HPCaaS.

In fact, HPCaaS is poised to become much more commonly available, partly due to new on-demand supercomputer service offerings and partly as a result of emerging AI-based tools for engineers. Accordingly, up to 45% of market revenue is projected to be directly attributable to the cloud-based business model via HPCaaS.

In a recent study, we conducted interviews with major players in the market as well as smaller, lesser-known companies believed to be influential in terms of innovative solutions likely to drive adoption and usage of both cluster-based HPC and supercomputing.

In an effort to identify growth opportunities for the HPC market, we investigated market gaps including unserved and underserved markets and submarkets. The research and advisory firm uncovered a market situation in which HPC currently suffers from an accessibility problem as well as inefficiencies and supercomputer skill gaps.

Stated differently, the market for HPC as a Service (i.e., access to high-performance computing services) currently suffers from problems related to utilization, scheduling, and the set-up time required to run jobs on a supercomputer. We identified start-ups and small companies working to solve these problems.

One of the challenge areas identified is low utilization combined, ironically, with high wait times on most supercomputers. Scheduling is difficult largely because workload run times are hard to estimate: about 20% of jobs are computationally heavy, and roughly 30% of jobs cannot be estimated accurately in terms of how long they will take (to within a three-minute window at best). In many instances, users request substantial resources and then do not actually use the computing time.
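
The utilization penalty from over-requested walltimes can be sketched with a toy calculation (a hypothetical illustration, not from the report; the job numbers and the single-node first-come-first-served assumption are mine):

```python
# Toy single-node FCFS schedule: the scheduler reserves each job's
# *requested* walltime, but the node only does useful work for the
# *actual* runtime, so over-estimates become idle reserved time.
def utilization(jobs):
    """jobs: list of (requested_hours, actual_hours) pairs."""
    reserved = sum(req for req, _ in jobs)
    used = sum(act for _, act in jobs)
    return used / reserved

# One well-estimated job and two that vastly over-request.
jobs = [(10, 10), (10, 3), (10, 2)]
print(f"{utilization(jobs):.0%}")  # prints "50%"
```

This is the mechanism behind the paradox: reserved-but-unused time inflates queue waits for everyone else while measured utilization stays low.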

In addition to the scheduling challenge, we also identified a company focused on solving additional problems such as computational planning and engineering. We spoke with the principal of a little-known company called Microsurgeonbot, Inc. (doing business as MSB.ai), which is developing a tool for setting up computing jobs for supercomputers.

The company is working to remove major obstacles to the accessibility and usability of HPC resources, focusing on a very important problem in HPC: the supercomputer job set-up and skills gap. Its solution, known as "Guru", is poised to make supercomputing much more accessible, especially to engineers at small and medium-sized businesses that do not have the resources or expertise of large corporate entities.

Key Topics Covered

1 Executive Summary
1.1 Companies in Report
1.2 Target Audience
1.3 Methodology

2 Introduction
2.1 Next Generation Computing
2.2 High Performance Computing
2.2.1 HPC Technology
2.2.1.1 Supercomputers
2.2.1.2 Computer Clustering
2.2.2 Exascale Computation
2.2.2.1 United States
2.2.2.2 China
2.2.2.3 Europe
2.2.2.4 Japan
2.2.2.5 India
2.2.2.6 Taiwan
2.2.3 High Performance Technical Computing
2.2.4 Market Segmentation Considerations
2.2.4.1 Government, NGOs, and Universities
2.2.4.2 Small Companies and Middle Market
2.2.5 Use Cases and Application Areas
2.2.5.1 Computer Aided Engineering
2.2.5.2 Government
2.2.5.3 Financial Services
2.2.5.4 Education and Research
2.2.5.5 Manufacturing
2.2.5.6 Media and Entertainment
2.2.5.7 Electronic Design Automation
2.2.5.8 Bio-Sciences and Healthcare
2.2.5.9 Energy Management and Utilities
2.2.5.10 Earth Science
2.2.6 Regulatory Framework
2.2.7 Value Chain Analysis
2.2.8 AI to Drive HPC Performance and Adoption

3 High Performance Computing Market Analysis and Forecast 2020-2025
3.1 Global High Performance Computing Market 2020-2025
3.1.1 Total High Performance Computing Market 2020-2025
3.1.2 High Performance Computing Market by Component 2020-2025
3.1.2.1 High Performance Computing Market by Hardware and Infrastructure Type 2020-2025
3.1.2.1.1 High Performance Computing Market by Server Type 2020-2025
3.1.2.2 High Performance Computing Market by Software and System Type 2020-2025
3.1.2.3 High Performance Computing Market by Professional Service Type 2020-2025
3.1.3 High Performance Computing Market by Deployment Type 2020-2025
3.1.4 High Performance Computing Market by Organization Size 2020-2025
3.1.5 High Performance Computing Market by Server Price Band 2020-2025
3.1.6 High Performance Computing Market by Application Type 2020-2025
3.1.6.1 High Performance Technical Computing Market by Industry Vertical 2020-2025
3.1.6.2 Critical High Performance Business Computing Market by Industry Vertical 2020-2025
3.1.1 High Performance Computing Deployment Options: Supercomputer vs. Clustering 2020-2025
3.1.2 High Performance Computing as a Service (HPCaaS) 2020-2025
3.1.3 AI Powered High Performance Computing Market
3.1.3.1 AI Powered High Performance Computing Market by Component
3.1.3.2 AI Powered High Performance Computing Market by AI Technology
3.2 Regional High Performance Computing Market 2020-2025
3.3 Exascale Computing Market 2020-2025
3.3.1 Exascale Computing Driven HPC Market by Component 2020-2025
3.3.2 Exascale Computing Driven HPC Market by Hardware Type 2020-2025
3.3.3 Exascale Computing Driven HPC Market by Service Type 2020-2025
3.3.4 Exascale Computing Driven HPC Market by Industry Vertical 2020-2025
3.3.1 Exascale Computing as a Service 2020-2025

4 High Performance Computing Company Analysis
4.1 HPC Vendor Ecosystem
4.2 Leading HPC Companies
4.2.1 Amazon Web Services Inc.
4.2.2 Atos SE
4.2.3 Advanced Micro Devices Inc.
4.2.4 Cisco Systems
4.2.5 DELL Technologies Inc.
4.2.6 Fujitsu Ltd.
4.2.7 Hewlett Packard Enterprise (HPE)
4.2.8 IBM Corporation
4.2.9 Intel Corporation
4.2.10 Microsoft Corporation
4.2.11 NEC Corporation
4.2.12 NVIDIA
4.2.13 Rackspace Inc.
4.1 Companies to Watch
4.1.1 Braket Inc.
4.1.1 MicroSurgeonBot Inc. (MSB.ai)

5 Conclusions and Recommendations
5.1 AI to Support Adoption and Usage of HPC
5.2 5G and 6G to Drive Increased Demand for HPC

6 Appendix: Future of Computing
6.1 Quantum Computing
6.1.1 Quantum Computing Technology
6.1.2 Quantum Computing Considerations
6.1.3 Market Challenges and Opportunities
6.1.4 Recent Developments
6.1.5 Quantum Computing Value Chain
6.1.6 Quantum Computing Applications
6.1.7 Competitive Landscape
6.1.8 Government Investment in Quantum Computing
6.1.9 Quantum Computing Stakeholders by Country
6.1 Other Future Computing Technologies
6.1.1 Swarm Computing
6.1.2 Neuromorphic Computing
6.1.3 Biocomputing
6.2 Market Drivers for Future Computing Technologies
6.2.1 Efficient Computation and High Speed Storage
6.2.2 Government and Private Initiatives
6.2.3 Flexible Computing
6.2.4 AI-enabled, High Performance Embedded Devices, Chipsets, and ICs
6.2.5 Cost Effective Computing powered by Pay-as-you-go Model
6.3 Future Computing Market Challenges
6.3.1 Data Security Concerns in Virtualized and Distributed Cloud
6.3.2 Funding Constrains R&D Activities
6.3.3 Lack of Skilled Professionals across the Sector
6.3.4 Absence of Uniformity among NGC Branches including Data Format

For more information about this report visit https://www.researchandmarkets.com/r/xa4mit

Research and Markets also offers Custom Research services providing focused, comprehensive and tailored research.

Media Contact:

Research and Markets
Laura Wood, Senior Manager
press@researchandmarkets.com

For EST Office Hours, call +1-917-300-0470
For U.S./CAN Toll Free, call +1-800-526-8630
For GMT Office Hours, call +353-1-416-8900

U.S. Fax: 646-607-1907
Fax (outside U.S.): +353-1-481-1716

SOURCE Research and Markets

http://www.researchandmarkets.com


IBM Becomes the Next Big Threat to Crypto after Google – CryptoVibes

Tech giant Google announced a few days ago that it had reached quantum supremacy. Now another tech heavyweight, IBM, is announcing its own lofty quantum ambitions that could be detrimental to cryptocurrencies.

At the CES 2020 conference yesterday, IBM announced that it had used its 28-qubit quantum computer, called Raleigh, to achieve a Quantum Volume of 32. While that is not a very significant number as far as breaking cryptographic codes is concerned, it is important to note that IBM has been doubling its Quantum Volume every year.

Quantum Volume is a number used to describe the level of complexity of problems that a quantum computer can solve. A higher Quantum Volume means a more powerful computer. While the world keeps talking about AI, cryptocurrencies, blockchain, IoT and other emerging technologies, it is quantum computing that could become the most important innovation of this century. It has the ability to touch almost every industry and walk of life and can impact other emerging technologies significantly.
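
As a back-of-the-envelope illustration of that doubling trend (a projection assuming the yearly doubling from QV 32 in 2020 simply continues, which is far from guaranteed; the function is my own sketch):

```python
# Project IBM's Quantum Volume forward, assuming it keeps doubling
# every year from the QV 32 reported for 2020.
def quantum_volume(year, base_year=2020, base_qv=32):
    return base_qv * 2 ** (year - base_year)

print(quantum_volume(2025))  # five more doublings: prints 1024
```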

One of the first practical quantum computers was demonstrated by Jonathan Home in 2009, but since then tech giants like IBM and Google have taken the lead in creating the next generation of powerful computing systems. Bitcoin has long been considered vulnerable to attack by quantum computers, so Google's and IBM's developments could pose a significant threat to the existence of the crypto sector.

Authors of a June 2017 paper on cryptography suggest that a quantum computer with the processing power of 2,500 qubits would be powerful enough to break the 256-bit encryption used on the Bitcoin blockchain. The most powerful quantum computer today, at 72 qubits, holds only a fraction of that processing power.

Crypto godfather David Chaum has already started warning the community to brace for impact and begin working on an answer to Google's and IBM's quantum capabilities now. While their processing power looks inconsequential today, the day may not be far off when they actually start creating ripples in the crypto community.


CES 2020: IBM and Daimler teamed up to make a quantum leap in battery tech – Roadshow – CNET

Sure, it looks like a very fancy chandelier, but it's actually a quantum computer and it's helping Daimler develop new EV battery chemistries.

Right now, we're living in a time where electric cars are really, genuinely good. They have long range capability, can charge in reasonable amounts of time, and are being marketed by automakers as serious vehicles, not novelties or something to sell only to stay in compliance with government regulations.

Still, genuinely good isn't good enough. Thus, people are looking for ways to improve the EV experience. Motors are already superpowerful and relatively efficient, so the next meaningful jump forward will likely come on the energy storage side of things, and many companies are banking on that jump being in the form of solid-state batteries.

Why solid-state batteries? Because, in theory, they will be lighter and more compact, more energy-dense and faster charging. Oh, and they'll likely be safer, too, with less possibility of the dangerous thermal runaway that lithium-ion cells can suffer. Only, here's the thing: they don't really exist yet.

Enter IBM (yes, that IBM), which at the CES 2020 show in Las Vegas on Tuesday announced that it had partnered with Daimler to leverage its considerable resources and research into quantum computing to help lick this solid-state battery problem once and for all.

How exactly are quantum computers helping to solve the complex problems that will lead to solid-state battery technology? Well, as the patron saint of grumpy people who swear a lot (Samuel L. Jackson) said in Jurassic Park, "Hold onto your butts."

In the most basic sense, the quantum computers from IBM have modeled the behavior of three different lithium-containing molecules. This, in turn, allows researchers to better understand how they will affect the energy storage and discharge properties that manufacturers are looking for in batteries. Specifically, simulating these molecules will enable scientists to find their "ground state" or most stable configuration.

This simulation of simple molecules is possible on traditional supercomputers, but it takes vast amounts of computing power and time, and as the molecules being simulated get more complex, the likelihood of errors gets bigger. Quantum computing gets around this by using the ideas of superposition (think Schrodinger's cat) and entanglement (aka Einstein's "spooky action at a distance") to much more efficiently evaluate much, much more data than a traditional computer.
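
The classical cost alluded to above is easy to quantify: a full simulation must store one complex amplitude per basis state, and the number of basis states doubles with every added qubit. A rough sketch (assuming 16 bytes per amplitude, i.e. double-precision complex numbers):

```python
# Memory needed to hold a full n-qubit state vector on a classical
# machine: 2**n complex amplitudes at 16 bytes each.
def statevector_bytes(n_qubits):
    return 16 * 2 ** n_qubits

for n in (10, 30, 53):
    print(n, "qubits:", statevector_bytes(n) / 1e9, "GB")
```

At 53 qubits (the size of Google's Sycamore) this is roughly 144 petabytes, which is why brute-force classical simulation of even modest quantum systems breaks down.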

Right now, the most promising of these new quantum computer-assisted potential battery chemistries -- according to IBM and Daimler, of course -- is lithium-sulfur. According to the research, lithium-sulfur batteries would be more powerful, longer-lasting and cheaper (the battery holy trinity) than today's lithium-ion cells.

So does this mean that we'll be seeing electric Benzes rolling around with sweet new lithium-sulfur batteries in the next year or two? Not really -- currently, neither company has offered an ETA on the tech -- but what it does mean is that researchers now have a leg up on developing the future of energy storage, and that's pretty damned cool.



Podcast: The Overhype and Underestimation of Quantum Computing – insideHPC

https://radiofreehpc.com/audio/RF-HPC_Episodes/Episode260/RFHPC260_QuantumQuantum.mp3

In this podcast, the Radio Free HPC team looks at how quantum computing is overhyped and underestimated at the same time.

The episode starts out with Henry being cranky. It also ends with Henry being cranky. But between those two events, we discuss quantum computing and Shahin's trip to the Q2B quantum computing conference in San Jose.

Not surprisingly, there is a lot of activity in quantum, with nearly every country pushing the envelope outward. One of the big concerns is that existing cryptography could become vulnerable to quantum cracking. Shahin assures us that this isn't the case today and is probably a decade away, which is another way of saying nobody knows, so it could be next week, but probably not.

We also learn the term NISQ, a descriptive acronym for the current state of quantum systems: Noisy Intermediate-Scale Quantum computing. The conversation touches on the various ways quantum computing is used now and where it's heading, plus the main reason everyone seems to be kicking the tires on quantum: the fear of missing out. It's a very exciting area, but to Shahin it seems like AI did maybe 8-10 years ago, so still early days.



Year 2019 in Science: History of Humans, Ebola Treatment and Quantum Computing – NewsClick

Image Courtesy: Smithsonian Magazine. Image depicts some of the skull caps excavated from Ngandong.

In the development of science, what should matter most are the findings that help humanity, that have the potential to open up new paradigms, or that change our understanding of the past or open our eyes to the future. The year 2019 witnessed several such findings in the science world.

HUMAN HISTORY THROUGH GENETICS

Human history is now also being traced through genetics research. The year 2019 witnessed several breakthroughs in the study of human history based on analysis of ancient DNA recovered from fossils and other sources.

One such important finding makes a claim about the origin of modern humans: anatomically modern humans first appeared in the southern part of Africa. A wetland that covered present-day Botswana, Namibia, and Zimbabwe was where the first humans lived some 200,000 years ago. Eventually, humans migrated out of this region. How was the study conducted? Researchers gathered blood samples from 200 living people in groups whose DNA is poorly known, including foragers and hunter-gatherers in Namibia and South Africa. The authors analyzed mitochondrial DNA (mtDNA), a type of DNA inherited only from mothers, and compared it to mtDNA in databases from more than 1,000 other Africans, mostly from southern Africa. The researchers then sorted how all the samples were related to each other on a family tree. The data reveal that one mtDNA lineage in the Khoisan speakers, L0, is the oldest known mtDNA lineage in living people. The work also tightens the date of origin of L0 to about 200,000 years ago.

Another very important and interesting finding in this field is that Homo erectus, a close ancestor of modern humans, made its last stand on the island of Java, Indonesia. The team of scientists estimated that the species survived at a site known as Ngandong, near the Solo River, based on dating of animal fossils from a bone bed where Homo erectus skull caps and leg bones were found earlier. Scientists used to believe that Homo erectus migrated out of Africa into Asia some two million years ago, and that this early human ancestor went extinct around 400,000 years ago. But the new findings indicate that the species was still living at Ngandong about 117,000 to 108,000 years ago.

Until recently, everything known about the Denisovans, the mysterious archaic human species, came from Denisova Cave in the Altai Mountains of Siberia, the only site where remnants of the species had been found. But a recent report published in Nature on the discovery of a Denisovan jawbone in a cave on the Tibetan Plateau has revealed many interesting facts about these archaic humans. The fossil has been dated to 160,000 years old, with a powerful jaw and unusually large teeth resembling those of the most primitive Neanderthals. Protein analysis of the fossil revealed that it is closest to the Siberian Denisovans.


QUANTUM COMPUTING AND SUPREMACY


Computer scientists are now concentrating on going far beyond the speeds the present generation of computing can achieve, by trying to incorporate the principles of quantum mechanics into next-generation computing. There have been some advances, but the issue in this realm that has sparked controversy is Google's claim to have achieved quantum supremacy.

Sycamore, Google's 53-qubit computer, solved in 200 seconds a problem that would reportedly have taken even a supercomputer 10,000 years. It is in fact a first step: it has shown that a quantum computer can perform a functional computation and that quantum computing does indeed solve a special class of problems much faster than conventional computers.

On the other hand, IBM researchers have countered, saying that Google hadn't done anything special. This clash highlights the intense commercial interest in quantum computing.

NATURE, CLIMATE AND AMAZON FOREST


Man-made climate change has already reached a critical state. Climate research has shown how crossing critical thresholds would bring irreversible changes to the global climate and an accompanying disaster for humanity.

In 2019, too, the world witnessed widespread devastation in the form of storms, floods, and wildfires.

Apart from the extreme weather events that climate change is driving, nature itself is in its most perilous state ever, and the reason is human-made environmental destruction.

The global report submitted by the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES) reviewed some 15,000 scientific papers and other data sources on trends in biodiversity and its ability to provide people with everything from food and fiber to clean water and air.

The report notes that out of 8 million known species of animals and plants, almost 1 million are threatened with extinction, including more than 40% of amphibian species and almost a third of marine mammals.

The month of August witnessed unprecedented wildfires in the Amazon rainforest, the biggest in the world. The fires were so large that their smoke covered nearby cities in dark clouds. Brazil's National Institute for Space Research (INPE) reportedly recorded over 72,000 fires this year, an increase of about 80% over last year. More worrisome is the fact that more than 9,000 of these fires had taken place in the preceding week alone.

The fires engulfed several large Amazon states in northwestern Brazil. NASA noted on August 11 that the fires were large enough to be spotted from space.

The main cause of the Amazon fires is wide-scale deforestation driven by policy changes made by the Bolsonaro regime. Many parts of the forest, even its deeper reaches, have been opened up for companies to set up business ventures, leading to massive deforestation.

NEW DIMENSION TO THE TREATMENT OF EBOLA


Until recently, there were no drugs that could cure Ebola.

However, two of four experimental treatments trialled in the Democratic Republic of Congo were found to be highly effective at saving patients' lives. The new approach combined existing drugs with newly developed ones. Known as the PALM trial, it uses monoclonal antibodies and antiviral agents.

Monoclonal antibodies are antibodies made by identical immune cells that are all clones of a unique parent cell. They bind to specific cells or proteins, and the objective is for the treatment to stimulate the patient's immune system to attack those cells.

KILOGRAM REDEFINED


The kilogram, the unit of mass, was long defined by a hunk of metal in France. This hunk of metal, known as the International Prototype Kilogram or "Big K", is a platinum-iridium alloy cylinder with a mass of exactly 1 kilogram, housed at the International Bureau of Weights and Measures in France since 1889. The IPK has many copies around the world, which are used to calibrate scales so that the whole world follows a standard system of measurement.

But the definition of the kilogram is no longer the same. On World Metrology Day this year, the way the kilogram has been defined for more than a century was changed completely. Now the kilogram is defined using the Planck constant, something that does not change.
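
Concretely, the redefinition works by fixing the Planck constant at h = 6.62607015e-34 J·s exactly; since J·s = kg·m²/s, and the metre and second are already defined via the speed of light and the caesium clock, the kilogram follows. A small sketch (the photon-mass example, via E = mc² = hf, is my own illustration, not part of the SI text):

```python
# The 2019 SI defining constants (exact by definition).
H = 6.62607015e-34   # Planck constant, J*s = kg*m^2/s
C = 299_792_458      # speed of light, m/s

def photon_mass_equivalent(freq_hz):
    """Mass equivalent of a photon of the given frequency, in kg (m = h*f/c^2)."""
    return H * freq_hz / C ** 2
```

Because h is now a fixed number rather than a property of an artefact, any suitably equipped lab can realize the kilogram without reference to Big K.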


Perspective: End Of An Era | WNIJ and WNIU – WNIJ and WNIU

David Gunkel's "Perspective" (January 8, 2020).

The holiday shopping is over and everyone is busy playing with their new toys. But what was remarkable about Christmas 2019 might have been the conspicuous absence of such toys.

Previous holiday seasons saw the introduction of impressive technological wonders -- tablet computers, the iPhone, Nintendo Wii and the X-box. But this year, there was no stand-out, got-to-have technological object.

On the one hand, this may actually be a good thing. The amount of waste generated by discarded consumer electronics is a massive global problem that we are not even close to managing responsibly. On the other hand, however, this may be an indication of the beginning of the end of an era -- the era of Moore's Law.

In 1965, Gordon Moore, who would later co-found Intel, observed that the number of transistors on a microchip doubles roughly every two years, meaning that computer chip performance would grow at a near-exponential rate. But even Moore knew there was a physical limit to this dramatic escalation in computing power, and we are beginning to see it top out. That may be one reason why there were no new, got-to-have technological gizmos and gadgets this holiday season.
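
Moore's observation is just a doubling rule, and the arithmetic is easy to sketch (the 2,300-transistor baseline is the Intel 4004 of 1971, used here purely for illustration; real chips deviate from the clean curve):

```python
# Transistor count after a given number of years, assuming a clean
# doubling every two years from a 1971 baseline of ~2,300 transistors.
def transistors(years_elapsed, baseline=2_300):
    return baseline * 2 ** (years_elapsed / 2)

print(round(transistors(40)))  # 1971 -> 2011: prints 2411724800 (~2.4 billion)
```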

Sure, quantum computing is already being positioned as the next big thing. But it will be years, if not decades, before it finds its way into consumer products. So for now, do not ask Santa to fill your stocking with a brand-new quantum device; it will, for now at least, continue to hold lumps of increasingly disappointing silicon.

Im David Gunkel, and thats my perspective.


Tech Pros Share Their Predictions For 2020’s Most Impactful Tech Trends – Forbes

Tech changes fast: trends can spring up almost without warning, quickly sweeping everyone up in a viral frenzy. Not every development has staying power, though. Businesses and consumers are always looking for inside information on what's to come, and what will have a lasting impact, in the tech world.

Below, 14 experts from Forbes Technology Council outline their picks for tech trends that will emerge in 2020 and have a long-term effect on how we live and work.

1. 5G Applications

As the cloud gains more 5G and edge computing capabilities, 2020 will be the first year the developer community gets meaningful access to 5G features within leading cloud platforms. Such access will enable the creation of new mobile applications that benefit from low latency, as low as a few milliseconds. Areas worth mentioning include live gaming, digital payments and access control. - Ahmad (Al) Fares, Celitech Inc.

2. Quantum AI

The real-world impact of quantum computing will still be in its infancy, but we will start to see where it will be impactful. Real-world applications will not yet be seen, but companies at the tip of the spear will begin to discuss how this will impact their business. - José Morey, Liberty BioSecurity

3. Edge Computing

Traditional cloud computing architecture is centralized, which makes it more vulnerable to attacks. Edge computing distributes processing, storage and applications across devices, thus making a security breach more difficult. In more regulated regions, such as Europe with its General Data Protection Regulation, businesses can meet regulatory requirements by securing personal data on the edge. - Christopher Yang, Corporate Travel Management

4. Threat Modeling

Threat modeling will become more necessary as organizations transition to a DevSecOps approach. Threat modeling enables organizations to identify security threats as far left as possible, e.g., during the planning stages of the systems development life cycle. Automated tools will introduce increased time and cost savings on threat modeling while enabling organizations to scale more efficiently. - Archie Agarwal, ThreatModeler Software, Inc.

5. Human Augmentation

Human augmentation refers to the use of technology to enhance a person's physical or mental capabilities. We're already beginning to see more human augmentation products on the market, and I can only imagine we'll see more of them in 2020 and beyond. For example, there are products like eSight, a wearable, glasses-like device that enables legally blind individuals to see their environment. - Thomas Griffin, OptinMonster

6. Facial Recognition Payment

I believe the most impactful tech trend coming in 2020 is the use of facial recognition payment. We already see it in China, and it is making its way here. It will absolutely diminish the need for cash and bank cards. - Elaine Montilla, CUNY Graduate Center

7. Identity As A New Perimeter In Cloud Security

In 2020, security professionals will realize that identity and access management (IAM) is an area they can quickly lose control of in the cloud due to the rapid rate of change, and the repercussions of doing so are substantial. Strategies from the data center world don't transfer well, and companies will need to invest in the proper supporting tools to stay ahead in that complex landscape. - Chris Deramus, DivvyCloud

8. Businesses Taking Data Privacy Seriously

One of the most powerful shifts well see in 2020 is businesses and organizations taking data privacy and entitlements seriously. With GDPR maturing, the California Consumer Privacy Act going into effect and regulators paying special attention, businesses ignore these at their own peril. Leveraging customer data across multiple systems is imperative for digital transformations, and this just added constraints. - Ganesh Padmanabhan, Molecula Corp.

9. AI-Driven Personalization

As consumers, we increasingly want and expect experiences that speak directly to us in our current situation. Well-applied AI can now enable the one-on-one mass personalization that we have been talking about for years in our digital experiences. We can now adjust, in real time, to the needs and behaviors of our prospects and customers to meet them where they are in their journey with us. - Guy Yalif, Intellimize

10. Blockchain Finding More Use Cases

I believe blockchain will continue to find new use cases in 2020 that expand on trusted identity management, documents, and business-to-government, business-to-consumer and business-to-business information exchange. We're seeing all the cloud providers standing up and maturing blockchain platform-as-a-service offerings to enable these solutions to be developed in the coming year and beyond. - David Torres, Feedme Inc.

11. Industrial Internet Of Things

IIoT is changing the ways in which maintenance professionals perform their work with more data-driven insights. We're seeing more people using technology to gather data on their assets and equipment, which allows technicians in the field to proactively predict or prevent errors before they happen, as opposed to firefighting problems as they occur in the facility. - Ryan Chan, UpKeep Maintenance Management

12. Rise Of Kubernetes

Okay, no one is actually going to run Kubernetes on their desktop, but this is going to be the cloud technology that makes as big a wave as Linux did for servers. Were right at the very beginning of this revolution, and 2020 is the year the biggest companies are going to adopt what has become a default technology for SMEs. - Kendall Miller, Fairwinds Ops, Inc.

13. Growing Importance Of Automotive Software

The automotive industry is about to be turned on its head. The days of exclusively buying for the size of a vehicle or efficiency are gone. Now it's a question of Apple CarPlay, Android Auto or something unique and innovative that Tesla is doing. We now live in a connected world; the value, in addition to efficiency and size, is the lastability and enjoyability of a car. That's software. - WaiJe Coler, InfoTracer

14. Psychographic Targeting

Psychographic targeting will have a profound impact on the outcomes of the 2020 presidential elections. Extracting people's personality traits; learning their attitudes, interests and motivations; and blending them into appropriate messaging through streamlined ad serving pipelines will greatly change the way public discourse is being shaped and maintained. - Pawel Rzeszucinski, Codewise

Visit link:
Tech Pros Share Their Predictions For 2020's Most Impactful Tech Trends - Forbes

AI, edge computing among Austin tech trends to watch in 2020 – KXAN.com

AUSTIN (KXAN) Technology companies in Austin will continue to integrate tech into the physical world in 2020, making the city smarter and more connected, analysts say.

The Austin Forum on Technology and Society will dive into the top tech trends for the coming year at its first event of 2020 Tuesday night at the Austin Central Library downtown.

"We'll talk about both technologies that will really become mainstream next year, even more so than now, and others that the buzz will continue, but maybe they're not ready to become mainstream," said Jay Boisseau, the Forum's founder and executive director.

Boisseau gave KXAN a preview of some of the trends to watch, including artificial intelligence, edge computing and quantum computing.

A lot of companies are already deeply involved with AI, but Boisseau believes it will move into more real-world applications this year.

Companies like SparkCognition and Valkyrie Intelligence are already experimenting in the AI space in Austin.

KXAN profiled Valkyrie last year. The company developed a way to identify and track cars on Austin roads and hoped the technology would have an application with Army Futures Command.

Austin is also primed to capitalize on advances in edge computing, Boisseau said.

"Not all computing will be in data centers and clouds, but much of it will start to move out to the real world where the data actually occurs, where things happen," he explained.

More of the physical world will be equipped with sensors and data processors that can act on data in real time as they get it.

This is especially important for self-driving cars, he said, allowing vehicles to communicate with their environment to keep people safe inside and outside the car.

Ford announced last year Austin will serve as a test market for its self-driving vehicles. The car company plans to map out the city's roads this year.

Standard computing uses binary code (0s and 1s) to process data, imposing limits on the amount of processing power traditional computers can generate.

Quantum computing exploits characteristics of atoms and other tiny particles to vastly expand the abilities of processors, allowing researchers to tackle problems in fields like medicine that computers currently can't.

"We'll hear a lot of buzz about that," Boisseau said, "even though it's probably going to be three to five years before we see a lot of business adoption of quantum computing."

The Austin Forum on Technology and Society will dig into those topics and others more deeply at Tuesday's event, "Top Tech Trends for 2020 (And Beyond)." It starts at 6:15 p.m. at Austin's Central Library.


Charles Hoskinson Predicts Economic Collapse, Rise of Quantum Computing, Space Travel and Cryptocurrency in the 2020s – The Daily Hodl

The new decade will unfurl a bag of seismic shifts, predicts Charles Hoskinson, the creator of Cardano and a co-founder of Ethereum. And these changes will propel cryptocurrency and blockchain solutions to the forefront as legacy systems buckle, transform or dissolve.

In an ask-me-anything session uploaded on January 3rd, the 11th birthday of Bitcoin, Hoskinson acknowledges how the popular cryptocurrency gave him an eye-opening introduction to the world of global finance, and he recounts how dramatically official attitudes and perceptions have changed.

"Every central bank in the world is aware of cryptocurrencies and some are even taking positions in cryptocurrencies. There's really never been a time in human history where one piece of technology has obtained such enormous global relevance without any central coordinated effort, any central coordinated marketing. No company controls it and the revolution is just getting started."

And he expects its emergence to coalesce with other epic changes. In a big picture reveal, Hoskinson plots some of the major events he believes will shape the new decade.

2020 Predictions

Hoskinson says the consequences of these technologies will reach every government service and that cryptocurrencies will gain an opening once another economic collapse similar to 2008 shakes the markets this decade.

"I think that means it's a great opening for cryptocurrencies to be ready to start taking over the global economy."

Hoskinson adds that hes happy to be alive to witness all of the changes he anticipates, including a reorganization of the media.

"This is the last decade of traditional organized media, in my view. We're probably going to have less CNNs and Fox Newses and Bloombergs and Wall Street Journals and more Joe Rogans, especially as we enter the 2025s and beyond. And I think our space in particular is going to fundamentally change the incentives of journalism. And we'll actually move to a different way of paying for content, curating content."




Inside the race to quantum-proof our vital infrastructure – www.computing.co.uk

"We were on the verge of giving up a few years ago because people were not interested in quantum at the time. Our name became a joke," said Andersen Cheng, CEO of the UK cybersecurity firm Post-Quantum. After all, he continued, how can you be post- something that hasn't happened yet?

But with billions of pounds, renminbi, euros and dollars (US, Canadian and Australian) being pumped into the development of quantum computers by both governments and the private sector and with that research starting to bear fruit, exemplified by Google's achievement of quantum supremacy, no-one's laughing now.

One day, perhaps quite soon, the tried and trusted public-key cryptography algorithms that protect internet traffic will be rendered obsolete. Overnight, a state in possession of a workable quantum computer could start cracking open its stockpiles of encrypted secrets harvested over the years from rival nations. Billions of private conversations and passwords would be laid bare and critical national infrastructure around the world would be open to attack.

The situation is often compared with the Y2K problem, and the impact could be disastrous. Like Y2K, no-one can be quite sure what the exact consequences will be; unlike Y2K, the timing is unclear. But with possible scenarios ranging from massive database hacks to unstoppable cyberattacks on the military, transport systems, power generation and health services, clearly, this is a risk not to be taken lightly.

Critical infrastructure including power generation would be vulnerable to quantum computers

Post-quantum cryptography uses mathematical theory and computer science to devise algorithms that are as hard to crack as possible, even when faced with the massive parallel processing power of a quantum computer. However, such algorithms must also be easy to deploy and use or they will not gain traction.

In 2016, the US National Institute of Standards and Technology (NIST) launched its competition for Public-Key Post-Quantum Cryptographic Algorithms, with the aim of arriving at quantum-safe standards across six categories by 2024. The successful candidates will supplement or replace the three standards considered most vulnerable to quantum attack: FIPS 186-4 (digital signatures), plus NIST SP 800-56A and NIST SP 800-56B (public-key cryptography).

Not all types of cryptography are threatened by quantum computers. Symmetric algorithms (where the same key is used for encryption and decryption) such as AES, which are often deployed to protect data at rest, and hashing algorithms like SHA, used to prove the integrity of files, should be immune to the quantum menace, although they will eventually need larger keys to withstand increases in classical computing power. But the asymmetric cryptosystems like RSA and elliptic curve cryptography (ECC) which form the backbone of secure communications are certainly in danger.

Asymmetric cryptography and public-key infrastructure (PKI) address the problem of how parties can exchange encryption keys where there's a chance that an eavesdropper could intercept and use them. Two keys (a keypair) are generated at the same time: a public key for encrypting data and a private key for decrypting it. These keys are related by a mathematical function that's trivial to perform in one direction (as when generating the keys) but very difficult in the other (trying to derive the private key from the corresponding public key). One example of such a 'one-way' function is factorising very large integers into primes. This is used in the ubiquitous RSA algorithms that form the basis of the secure internet protocols SSL and TLS. Another such function, deriving the relationship between points on a mathematical elliptic curve, forms the basis of ECC which is sometimes used in place of RSA where short keys and reduced load on the CPU are required, as in IoT and mobile devices.

It is no exaggeration to say that in the absence of SSL and TLS the modern web with its ecommerce and secure messaging could not exist. These protocols allow data to be transmitted securely between email correspondents and between customers and their banks with all the encryption and decryption happening smoothly and seamlessly in the background. Unfortunately, though, factorising large integers and breaking ECC will be a simple challenge for a quantum computer. Such a device running something like Shor's algorithm will allow an attacker to decrypt data locked with RSA-2048 in minutes or hours rather than the billions of years theoretically required by a classical computer to do the same. This explains NIST's urgency in seeking alternatives that are both quantum-proof and flexible enough to replace RSA and ECC.
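The scale of that asymmetry shows up even at toy size. The sketch below is illustrative only (real RSA uses 2048-bit moduli and proper padding, and the primes here are tiny): multiplying the two primes is a single operation, while recovering them by naive trial division already takes roughly 100,000 divisions for a ten-digit modulus, and the cost explodes as the modulus grows.

```python
import math

# Toy illustration of the RSA 'one-way' function (not real RSA):
# multiplying two primes is trivial; factorising the product is not.
p, q = 104723, 104729            # small primes; real RSA uses ~1024-bit primes
n = p * q                        # the 'easy' direction: one multiplication

def factor(n):
    """Naive trial division: the 'hard' direction, feasible only for tiny n."""
    for d in range(2, math.isqrt(n) + 1):
        if n % d == 0:
            return d, n // d
    return None

print(factor(n) == (p, q))       # True, but only after ~100,000 divisions
```

At 2048 bits the same classical search becomes utterly infeasible, which is precisely the gap Shor's algorithm closes on a quantum computer.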

NIST is not the only organisation trying to get to grips with the issue. The private sector has been involved too. Since 2016 Google has been investigating post-quantum cryptography in the Chrome browser using NewHope, one of the NIST candidates. Last year Cloudflare announced it was collaborating with Google in evaluating the performance of promising key-exchange algorithms in the real world on actual users' devices.

Of the original 69 algorithms submitted to NIST in 2016, 26 have made it through the vetting process as candidates for replacing the endangered protocols; this number includes NewHope in the 'Lattice-based' category.

One of the seven remaining candidates in the 'Code-based' category is Post-Quantum's Never-The-Same Key Encapsulation Mechanism (NTS-KEM), which is based on the McEliece cryptosystem. First published in 1978, McEliece never really took off at the time because of the large size of the public and private keys (100kB to several MB). However, it is a known quantity to cryptographers, who have had plenty of time to attack it, and it's agreed to be 'NP-hard' (a mathematical term that in this context translates very roughly as 'extremely difficult to break in a human timescale - even with a quantum computer'). This is because it introduces randomisation into the ciphertext with error correction codes.

"We actually introduce random errors every time we encrypt the same message," Cheng (pictured) explained. "If I encrypt the letters ABC I might get a ciphertext of 123. And if I encrypt ABC again you'd expect to get 123, right? But we introduce random errors so this time we get 123, next time we get 789."

The error correction codes allow the recipient of the encrypted message to cut out the random noise added to the message when decrypting it, a facility not available to any eavesdropper intercepting the message.
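The principle can be sketched with a toy repetition code (nothing like the binary Goppa codes McEliece actually uses, and with no cryptographic strength at all): inject one deliberate error per codeword, and the decoder still recovers the message exactly, while two encodings of the same message almost never match.

```python
import random

# Toy analogue of McEliece-style randomised encryption (NOT the real
# cryptosystem): protect each bit with an error-correcting code, then
# deliberately inject a random error. The decoder removes it, and the
# same plaintext yields a different ciphertext almost every time.

def encode(bits):
    """3-repetition code with one deliberate random flip per codeword."""
    out = []
    for b in bits:
        word = [b, b, b]
        word[random.randrange(3)] ^= 1   # the injected random error
        out.extend(word)
    return out

def decode(code):
    """Majority vote cancels the single injected error per codeword."""
    return [1 if sum(code[i:i + 3]) >= 2 else 0
            for i in range(0, len(code), 3)]

msg = [1, 0, 1, 1, 0]
c1, c2 = encode(msg), encode(msg)
print(decode(c1) == msg, decode(c2) == msg)   # True True
```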

With today's powerful computers, McEliece's large key size is much less of an issue than in the past. Indeed, McEliece has some advantages of its own - encryption/decryption is quicker than RSA, for example - but it still faces implementation challenges compared with RSA, particularly for smaller devices. So for the past decade, Cheng's team has been working on making the technology easier to implement. "We have patented some know-how in order to make our platform work smoothly and quickly to shorten the keys to half the size," he said.

Post-Quantum has open-sourced its code (a NIST requirement so that the successful algorithms can be swiftly distributed) and packaged it into libraries to make it as 'drop-in' as possible and backwards-compatible with existing infrastructure.

Nevertheless, whichever algorithms are chosen, replacing the incumbents like-with-like won't be easy. "RSA is very elegant," Cheng admits. "You can do both encryption and signing. For McEliece and its derivatives because it's so powerful in doing encryption you cannot do signing."

An important concept in quantum resistance is 'crypto-agility' - the facility to change and upgrade defences as the threat landscape evolves. Historically, industry has been the very opposite of crypto-agile: upgrading US bank ATMs from insecure DES to 3DES took an entire decade to complete. Such leisurely timescales are not an option now that a quantum computer capable of cracking encryption could be just three to five years away.

Because of the wide range of environments, bolstering defences for the quantum age is not as simple as switching crypto libraries. In older infrastructure and applications encryption may be hard-coded, for example. Some banks and power stations still rely on yellowing ranks of servers that they dare not decommission but where the technicians who understand how the encryption works have long since retired. Clearly, more than one approach is needed.

It's worth pointing out that the threat to existing cryptosystems comes not only from quantum computers. The long-term protection afforded by encryption algorithms has often been wildly overestimated even against 'bog-standard' classical supercomputers. RSA 768, introduced in the 1970s, was thought to be safe for 7,000 years, yet it was broken in 2010.

For crypto-agility algorithms need to be swappable

Faced with the arrival of quantum computers and a multiplicity of use cases and environments, cryptographers favour a strength-in-depth or hybridised approach. Cheng uses the analogy of a universal electrical travel plug which can be used in many different countries.

"You can have your RSA, the current protocol, with a PQ [post-quantum] wrapper and make the whole thing almost universal, like a plug with round pins, square pins or a mixture of both. Then when the day comes customers can just turn off RSA and switch over to the chosen PQ algorithm".
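The 'wrapper' idea reduces to hybrid key establishment: derive the session key from both a classical shared secret and a post-quantum one, so traffic stays protected as long as either scheme remains unbroken. A minimal sketch of the combining step (illustrative names and structure, not Post-Quantum's actual product):

```python
import hashlib
import secrets

# Hybrid key derivation sketch: combine a classical shared secret
# (e.g. from RSA or ECDH) with a post-quantum one (e.g. from a PQ KEM).
# Recovering the session key requires breaking BOTH exchanges.
classical_secret = secrets.token_bytes(32)   # stand-in for an RSA/ECDH secret
pq_secret = secrets.token_bytes(32)          # stand-in for a PQ KEM secret

session_key = hashlib.sha256(classical_secret + pq_secret).digest()
print(len(session_key))   # 32: a full-strength symmetric key
```

When the day comes to "turn off RSA", only the classical input is swapped out; the derivation and everything downstream of the session key are untouched, which is the crypto-agility being described.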

Code-based systems like NTS-KEM are not the only type being tested by NIST. The others fall into two main categories: multivariate cryptography, which involves solving complex polynomial equations, and lattice-based cryptography, which is a geometric approach to encrypting data. According to Cheng, the latter offers advantages of adaptability but at the expense of raw encryption power.

"Lattice is less powerful but you can do both encryption and signing,

but it has not been proven to be NP-hard," he said, adding: "In the PQ world everyone's concluded you need to mix-and-match your crypto protocols in order to cover everything."

Professor Alan Woodward (pictured) of Surrey University's Department of Computing said that it's still too early to guess which will ultimately prove successful.

"Lattice-based schemes seem to be winning favour, if you go by numbers still in the race, but there is a lot of work being done on the cryptanalysis and performance issues to whittle it down further," he said. "If I had to bet, I'd say some combination of lattice-based crypto and possibly supersingular isogeny-based schemes will emerge for both encryption and signature schemes."

Quantum mechanics can be an aid in the generation of secure classical encryption keys. Because of their deterministic nature, classical computers cannot generate truly random numbers; instead they produce pseudo-random numbers that are predictable, even if only to a tiny degree. One of Edward Snowden's revelations was that the NSA had cracked the random number generator used by RSA. More recently, weaknesses in RSA's random number generation were discovered in some IoT devices, where one in 172 were found to use the same factor to generate keys. However, a quantum random number generator (QRNG) produces numbers that are truly random, according to quantum theory, resolving this key area of vulnerability.
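The determinism is easy to demonstrate: seed a classical PRNG twice with the same value and it reproduces the entire 'random' stream, which is why key material whose seed an attacker can guess is recoverable. A sketch using Python's non-cryptographic `random` module (real systems would use a CSPRNG; a QRNG removes the seed entirely):

```python
import random

# Two PRNGs seeded identically produce identical 'random' key bytes:
# the stream is a pure function of the seed, not random at all.
rng1, rng2 = random.Random(42), random.Random(42)
key1 = bytes(rng1.getrandbits(8) for _ in range(16))
key2 = bytes(rng2.getrandbits(8) for _ in range(16))
print(key1 == key2)   # True: guess the seed, recover the key
```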

QKD commonly uses polarised photons to represent ones and zeros

Whereas post-quantum cryptography is based on maths, the other major area of research interest, quantum key distribution (QKD), is rooted in physics, specifically the behaviour of subatomic particles. QKD is concerned with key exchange, using quantum-mechanics to ensure that eavesdroppers cannot intercept the keys without being noticed.

In BB84, the first proposed QKD scheme and still the basis for many implementations, a quantum mechanical property of a subatomic particle, such as the polarisation of a photon, is manipulated to represent either a zero or a one. A stream of such photons, polarised at random, is then sent by one party to a detector controlled by the other.

Before they reach the detector, each photon must pass through a filter. One type of filter will allow 'ones' to pass, the other 'zeros'; as with the polarisation process, the filters are selected at random, so we'd expect half of the photons to be blocked by the filtering process. Counterintuitively, however, their quantum mechanical properties mean that even those photons that are 'blocked' by a filter still have a 50 per cent chance of passing their correct value to the detector. Thus, we'd expect an overall agreement between transmission and detection of 75 per cent (50 per cent that pass straight through plus 25 per cent that are 'blocked' but still communicate their correct value).

Once enough photons have been transmitted to produce a key of the required length, the parties compare, over a separate channel, the sequence of emitted ones and zeros with the filter used for each, discarding the individual results where they disagree. A classical symmetric encryption key is then created from the remaining string of ones and zeros. This key can be used as an uncrackable 'one-time pad' which is then used to encrypt data such as a message or a login.

Should a man-in-the-middle intercept the stream of photons, the parties will be alerted because of the observer effect: measuring the state of a quantum particle will change it. Statistically, the number of photons registered as 'correct' by the detector will drop from 75 per cent to around 62.5 per cent and this will be noticed when the two parties compare a random sample of their results at the end of the process. Any such discrepancy will cause the key to be rejected. Properly implemented, QKD can be considered as a provably unbreakable method of exchanging keys.
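Those statistics (75 per cent agreement normally, around 62.5 per cent under an intercept-and-resend attack) can be checked with a small Monte-Carlo sketch of this idealised model. It is a statistical toy, not a physical simulation:

```python
import random

# Idealised BB84 rule: measuring in the matching basis returns the
# prepared bit; measuring in the wrong basis returns a random bit.
def measure(bit, prep_basis, meas_basis):
    return bit if prep_basis == meas_basis else random.randint(0, 1)

def agreement(n, eavesdrop):
    """Fraction of photons whose detected value matches the sent bit."""
    agree = 0
    for _ in range(n):
        sent = random.randint(0, 1)
        basis = random.randint(0, 1)            # sender's polarisation basis
        bit = sent
        if eavesdrop:                           # intercept-and-resend attack
            eve_basis = random.randint(0, 1)
            bit = measure(bit, basis, eve_basis)
            basis = eve_basis                   # Eve re-emits in her own basis
        receiver_basis = random.randint(0, 1)   # the random filter choice
        agree += measure(bit, basis, receiver_basis) == sent
    return agree / n

print(agreement(100_000, False))   # ~0.75 without an eavesdropper
print(agreement(100_000, True))    # ~0.625 under attack
```

The drop from 0.75 to 0.625 is exactly the discrepancy the two parties detect when they compare a sample of their results.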

Switzerland is a QKD pioneer, deploying the technology to secure electoral votes as far back as 2007. The company that helped to achieve this feat, Geneva University spin-off ID Quantique (IDQ), has since become one of the main manufacturers of QKD and QRNG hardware. CEO Grégoire Ribordy (pictured) has seen a recent upsurge of interest beginning in 2016 when the European Commission unveiled its €1 billion, ten-year Quantum Flagship programme. The market is now starting to mature, he said, adding that his company boasts customers in government, finance and "other organisations that have high-value IP to protect".

There's a certain rivalry between physics and maths, between QKD and post-quantum encryption, not least because funding has been hard to come by. Being hardware-based, QKD has so far gobbled up the lion's share of the research grants, but it's possible that when NIST returns its verdicts more money will flow into PQ. Arguments also rage over the practical limits of security.

"The physicists tend to talk about QKD as being 'perfectly secure' which sets the cryptographers on edge as there is no such thing in practice," Woodward said.

Ribordy is adamant that both techniques will be required. As with the hybrid approach to adopting algorithms, it's not an either-or situation; it all depends on the use case.

"I think they're actually complementary. Quantum crypto [another name for QKD] will provide a higher security and should be used maybe in backbone networks where there's a lot at stake, big pipes must be protected with more security, and then the quantum-resistant algorithms can find an application in areas where security is not as critical or maybe where there's less data at stake."

One company that's looking to scale up QKD on a national basis is the startup Quantum Xchange. Based in Bethesda, Maryland, USA, it was founded in 2018 with VC funding to provide ultra-secure data networks. President and CEO John Prisco (pictured) bemoaned the fact that his country, while forging ahead with quantum computers, is behind the curve when it comes to defending against them. It's possible that by 2024, when NIST selects its winning algorithms, the game will already be up.

"Everybody is saying, OK, let's fight quantum with quantum, and I subscribe to that," he said. "We've got quantum computers that are offensive weapons and quantum keys that are the defensive counterpart to that. The rest of the world outside of the United States is embracing this a lot more quickly - Europe, Japan and China."

Quantum particles are uniquely sensitive to any kind of disturbance, so while China may have successfully transmitted quantum keys between Earth and the Micius satellite, this was only possible because of ideal weather conditions at the time (although, interestingly, Woodward believes it could ultimately be the winning approach).

Particles transmitted through the more common fibreoptic cable are also limited by the tendency of the polarised photons to react with the medium. Even with the most pristine fibre, this limits real-world transmission distance to around 100km. After that, you need intermediary repeaters and 'trusted nodes' to relay the signal. Since it's not possible to directly clone quantum states, the quantum signal must be converted to classical and then back to quantum again, representing a weak point in the otherwise unbreakable chain. So trusted nodes must be very thoroughly secured, which inevitably increases costs and limits current applications. It is also possible for an attacker to interfere with emitters and detectors to corrupt the key generation process.

Other issues? Well, there's a lack of standards and certifications and the equipment is costly. Also, without some sort of secure signature process, how can parties exchanging keys be sure who they are exchanging them with? In addition, it's restricted to point-to-point communications and it's also incompatible with existing networks.

The theory is sound, said Woodward, but the engineering is still a challenge.

"It's in practice that QKD is encountering difficulties. For example, QKD is not yet at a stage where it is using single photons - it uses pulses of light. Hence, the very basis of not being able to clone the quantum state of a photon is put in question as there is more than one of them."

Woodward added that even after the kinks in QKD - be that via satellite, fibreoptic cables or over the airwaves - have been ironed out, the technology will still likely be confined to highly sensitive data and backbone networks because PQ cryptography will be easier to slot into existing infrastructure.

"Whichever [QKD] scheme proves most reliable and robust they all require that expensive infrastructure over what we have now, and so I can envisage it being used for, possibly, government communications but not for home users whose machines are picking a means to communicate securely with their bank's website," he said.

"The post-quantum schemes in the NIST competition would simply replace the software we already have in places such as TLS so the cost would be much lower, and the level of disruption needed for adoption by end-users would be far less."

However, Quantum Xchange is working on overcoming some of these limitations. The firm already operates a small number of high security QKD connections between financial institutions in New York and datacentres in nearby New Jersey over dedicated fibreoptic cables using trusted nodes to extend the reach of its QKD infrastructure. But it is also working on a hybrid system called Phio TX. This will allow the transmission of electronic quantum keys (i.e. keys created using a QRNG) or classical symmetric keys created from the quantum key via a secure channel separate from that used for the encrypted data. The idea is to make the technology more widely applicable by straddling the QKD-PQ divide and removing the point-to-point restrictions.

"The point is to be crypto-agile," Prisco said. "If a company is trying to come up with a quantum-safe strategy they can implement this product that has quantum-resistant algorithms, electronic quantum keys and optical quantum keys, so it becomes a level-of-service discussion. If you have a link that absolutely has to be protected by the laws of physics, you'd use an optical quantum key. If there's virtually no chance of someone intercepting the data with your key you could use a trusted exchange and the combination of the quantum-resistant algorithm with the quantum random number generated key is very powerful."

Edit: the original article stated the $1.2 billion National Quantum Initiative Act was passed by the House of Representatives in December 2019, whereas this took place in December 2018.


Tucson Morning Blend Top 5 Tech Trends you’ll love this year. Heather Rowe 1:27 – KGUN

NEW TECH STUFF TO MAKE OUR LIVES BETTER IN 2020 In the decade now drawing to a close, every part of our lives (our personal lives, our businesses and careers) became fully digital. And with the 2020s now upon us, we're going to see even more massive changes as the tech we use gets further refined and as technology that was dreamed up only recently becomes part of our daily routines! Here are five of the top technologies that IBM says will revolutionize the year and decade ahead:

1. Artificial Intelligence will turbo-charge productivity both personally, and professionally.

While artificial intelligence probably won't take your job, it will change how you work. In the coming decade, expect to see AI making its way into all sorts of workplaces around the world, automating routine tasks and freeing up your time to concentrate on the parts of your job that are more satisfying and meaningful. And there will be lots of new jobs and career possibilities for those who gain the skills to work in technology fields.

2. Blockchain will help to make the food you eat safer than ever.

Food recalls keep consumers constantly on their toes, affecting their shopping habits and calling produce and pantry items into question. But blockchain networks like IBM Food Trust (which is used by a growing number of retailers including Walmart, Albertsons and Carrefour, as well as major food suppliers like Dole) are helping to trace foods from the farm to your fork. What is blockchain? It's a digital ledger that means consumers now have unprecedented insight into exactly where their food has come from. And it doesn't stop with food: blockchain now tracks global shipments, marriages and more. Right now we're able to track food shipments on the blockchain via apps, and in the next decade we'll see this cutting-edge technology become a part of everyday life.

3. Edge Computing will have a big impact on retail, and on the tech you use on your cell phone.

Today's consumer electronics, cars and electric vehicles, and all sorts of other digital devices are equipped with sensors that collectively generate tons of data. Today there's an estimated 15 billion intelligent devices operating on the outer edges of the network, and by 2022 that number is expected to reach 55 billion. In order to make sense of all of the information from these devices, we'll see massive growth in what's called edge computing: the use of compact, efficient computer servers located at the network's edge, near these smart devices, that can process data locally instead of sending it all back to a data center via the cloud. The next decade will see a surge in edge computing, aided by the rollout of 5G technology. And while consumers won't see edge computing directly, it will transform the way retailers stock the latest goods you buy, and it will affect how cellphone carriers support mobile gaming, augmented reality and more.

4. From cloud computing to the Hybrid Cloud: what you need to know.

You know how, when you're getting ready to pack for a big trip, you need to gather stuff from all over the place to make your vacation work? You might have clothes and shoes spread out between multiple closets, your suitcase is in the basement, and your passport (which needs to stay super secure) is in a safe. Well, businesses with lots of data are the same way: they might have some info in one type of cloud, some info in another, and more stuff on three servers in two different states. That's why more and more businesses are turning to hybrid cloud: it's a technology infrastructure that makes it easy for companies to quickly access data wherever it's stored, to make it usable and easy to analyze. For consumers, this means they're being helped by retailers and companies more quickly, all with their data being safer than ever.

5. Quantum computing moves from the realm of the theoretical (and from being a sci-fi movie plotline!) into the world of practical experiments and applications.

It's not necessary to be a quantum physicist to grasp the main point of quantum computing: it seeks to solve complex problems that have been considered unsolvable using classical computers alone. IBM is a leader in making quantum technology available to industry, academia and anyone else inspired by quantum computing's potential. As the next decade unspools, we'll see quantum computing moving from the lab to the mainstream, and it will start to solve problems in chemistry, medicine and more.


January 9th: France will unveil its quantum strategy. What can we expect from this report? – Quantaneo, the Quantum Computing Source

It is eagerly awaited! The "Forteza" report, named after its rapporteur, Paula Forteza, Member of Parliament for La République en Marche (the political party of current President Emmanuel Macron), should finally be officially revealed on January 9th. The three rapporteurs are Paula Forteza, Member of Parliament for French Latin America and the Caribbean, Jean-Paul Herteman, former CEO of Safran, and Iordanis Kerenidis, researcher at the CNRS. Announced last April, this report was initially due at the end of August, then in November, then... No doubt the complex agenda - between the social movements in France and the MP's active participation in the Parisian election campaign of Cédric Villani, mathematician and dissident of La République en Marche - forced the timetable to slip. In any case, it is thus finally on January 9th that this report, entitled "Quantum: the technological shift that France will not miss", will be unveiled.

"Entrusted by the Prime Minister in April 2019, the mission on quantum technologies ends with the submission of the report by the three rapporteurs Paula Forteza, Jean-Paul Herteman, and Iordanis Kerenidis. Fifty proposals and recommendations are thus detailed in order to strengthen France's role and international position in these complex but highly strategic technologies. The in-depth work carried out over the last few months, fueled by numerous consultations with scientific experts in the field, has led the rapporteurs to the conclusion that France's success in this field will be achieved by making quantum technologies more accessible and more attractive. This is one of the sine qua non conditions for the success of the French strategy," explains the French National Assembly in the invitation to the official presentation ceremony of the report.

The presentation, by the three rapporteurs, will be made in the presence of the ministers for the army, the economy and finance, and higher education and research. The presence of the Minister of the Armed Forces, as well as the co-signature of the report by the former president of Safran, already indicates that military applications will be one of the main areas of proposals, and possibly of funding. Just as is the case in the United States, China or Russia.

Of course, the report will go into detail about the role of research, and of the CNRS, in advances in quantum computing and communication. Of course, the excellent work of French researchers, in collaboration with their European peers, will be highlighted. And of course, France's excellence in these fields will be explained. France is a pioneer in this field, but the important questions are precisely what the next steps will be. The National Assembly indicates that this report will present 50 "proposals and recommendations". Are we to conclude that it will be just a list of proposals? Or will we know how to move from advice to action?

These are our pending questions:

- The United States is announcing an investment of USD 1.2 billion, China perhaps USD 10 billion, Great Britain about 1 billion euros, while Amazon's R&D budget alone is USD 18 billion... how can a country like France position itself regarding the scale of these investments? To sum up, is the amount of funds allocated to this research and development in line with the ambitions?

- Mastering quantum technologies is becoming a geopolitical issue between the United States and China. Should Europe master its own technologies so as not to depend on these two major powers? On the other hand, is this not the return of a quantum "Plan calcul" from the 60s? How can we avoid repeating the same mistakes?

- Cecilia Bonefeld-Dahl, Managing Director of DigitalEurope, recently wrote that Europe risks being deprived of the use of quantum technologies if it does not develop them itself. Christophe Jurczak, the head of Quantonation, stated that it is not certain that France will have access to quantum technologies if it does not develop them itself. Is this realistic? Do we have the resources?

- French companies currently invest very little in research in the field of quantum computing. With the exception of Airbus, the main feedback that we know of comes from Canada, Australia, Spain, Germany, etc. Should we also help companies to embrace these technologies, or should we only finance research and development on the part of universities and startup founders? Is there a support component for companies, so that technologies are not simply developed in France and sold elsewhere, but France becomes the leading market for local developments?

See you on January 9th on Decideo for more details and our objective analysis of the content of this document.


Superconductor or Not? Exploring the Identity Crisis of This Weird Quantum Material – SciTechDaily

Northeastern researchers have used a powerful computer model to probe a puzzling class of copper-based materials that can be turned into superconductors. Their findings offer tantalizing clues for a decades-old mystery, and a step forward for quantum computing.

The ability of a material to let electricity flow comes from the way electrons within its atoms are arranged. Depending on these arrangements, or configurations, all materials are either insulators or conductors of electricity.

But cuprates, a class of mysterious materials that are made from copper oxides, are famous in the scientific community for having somewhat of an identity issue that can make them both insulators and conductors.

Under normal conditions, cuprates are insulators: materials that inhibit the flow of electrons. But with tweaks to their composition, they can transform into the world's best superconductors.

The discovery of this kind of superconductivity in 1986 won its discoverers a Nobel Prize in 1987, and fascinated the scientific community with a world of possibilities for improvements to supercomputing and other crucial technologies.

But with fascination came 30 years of bewilderment: Scientists have not been able to fully decipher the arrangement of electrons that encodes for superconductivity in cuprates.

Arun Bansil, University Distinguished Professor of physics, and Robert Markiewicz, professor of physics, are part of a team of researchers who are describing the mechanism by which copper-oxide materials turn from insulators to superconductors. Credit: Matthew Modoono/Northeastern University

"Mapping the electronic configuration of these materials is arguably one of the toughest challenges in theoretical physics," says Arun Bansil, University Distinguished Professor of physics at Northeastern. And, he says, because superconductivity is a weird phenomenon that only happens at temperatures as low as -300 F (or about as cold as it gets on Uranus), figuring out the mechanisms that make it possible in the first place could help researchers make superconductors that work at room temperature.

Now, a team of researchers that includes Bansil and Robert Markiewicz, a professor of physics at Northeastern, is presenting a new way to model these strange mechanisms that lead to superconductivity in cuprates.

In a study published in Proceedings of the National Academy of Sciences, the team accurately predicted the behavior of electrons as they move to enable superconductivity in a group of cuprates known as yttrium barium copper oxides.

In these cuprates, the study finds, superconductivity emerges from many types of electron configurations. A whopping 26 of them, to be specific.

"During this transition phase, the material will, in essence, become some kind of a soup of different phases," Bansil says. "The split personalities of these wonderful materials are now being revealed for the first time."

The physics within cuprate superconductors is intrinsically weird. Markiewicz thinks of that complexity in terms of the classical Indian myth of the blind men and the elephant, which has been a running joke for decades among theoretical physicists who study cuprates.

According to the myth, blind men meet an elephant for the first time, and try to understand what the animal is by touching it. But because each of them touches only one part of its body (the trunk, tail, or legs, for example), they all have a different (and limited) concept of what an elephant is.

"In the beginning, we all looked [at cuprates] in different ways," Markiewicz says. "But we knew that, sooner or later, the right way was going to show up."

The mechanisms behind cuprates could also help explain the puzzling physics behind other materials that turn into superconductors at extreme temperatures, Markiewicz says, and revolutionize the way they can be used to enable quantum computing and other technologies that process data at ultra-fast speeds.

"We're trying to understand how they come together in the real cuprates that are used in experiments," Markiewicz says.

The challenge of modeling cuprate superconductors comes down to the weird field of quantum mechanics, which studies the behavior and movement of the tiniest bits of matter, and the strange physical rules that govern everything at the scale of atoms.

In any given material (say, the metal in your smartphone), the electrons contained within just the space of a fingertip could amount to the number one followed by 22 zeros, Bansil says. Modeling the physics of such a massive number of electrons has been extremely challenging ever since the field of quantum mechanics was born.

Bansil likes to think of this complexity as butterflies inside a jar flying fast and cleverly to avoid colliding with each other. In a conducting material, electrons also move around. And because of a combination of physical forces, they also avoid each other. Those characteristics are at the core of what makes it hard to model cuprate materials.

"The problem with the cuprates is that they are at the border between being a metal and an insulator, and you need a calculation that is so good that it can systematically capture that crossover," Markiewicz says. "Our new modeling can capture this behavior."

The team includes researchers from Tulane University, Lappeenranta University of Technology in Finland, and Temple University. The researchers are the first to model the electronic states in the cuprates without adding parameters by hand to their computations, which physicists have had to do in the past.

To do that, the researchers modeled the energy of atoms of yttrium barium copper oxides at their lowest levels. Doing that allows researchers to trace electrons as they excite and move around, which in turn helps describe the mechanisms supporting the critical transition into superconductivity.

That transition, known as the pseudogap phase in the material, could be described simply as a door, Bansil says. In an insulator, the structure of the material is like a closed door that lets no one through. If the door is wide open, as it would be for a conductor, electrons pass through easily.

But in materials that experience this pseudogap phase, that door would be slightly open. The dynamics of what transforms that door into a really wide-open door (that is, a superconductor) remain a mystery, but the new model captures 26 electron configurations that could do it.

"With our ability to now do this first-principles, parameter-free type of modeling, we are in a position to actually go further, and hopefully begin to understand this pseudogap phase a bit better," Bansil says.

Reference: "Competing stripe and magnetic phases in the cuprates from first principles" by Yubo Zhang, Christopher Lane, James W. Furness, Bernardo Barbiellini, John P. Perdew, Robert S. Markiewicz, Arun Bansil, and Jianwei Sun, 8 November 2019, Proceedings of the National Academy of Sciences. DOI: 10.1073/pnas.1910411116


The World Keeps Growing Smaller: The Reinvention Of Finance – Seeking Alpha

In the prominent headlines, we keep reading about attempts to keep the world fragmented by imposing tariffs and constraining the exchange of ideas in many ways, but information keeps spreading, and with the continued spread of information the world progresses. John Thornhill writes in the Financial Times about how China is completely redesigning finance.

Yes, the United States is working through the FinTech era, where efforts are being made to use evolving finance and technology to deliver familiar services more efficiently, but the Chinese effort, writes Mr. Thornhill, is trying to do something entirely different.

China wants to change the platform.

In the past, I have written about how the United States banking industry has lagged behind the rest of the world in moving toward a more electronic and integrated finance platform. Even in some less developed countries, payment systems have been evolving at a faster pace than in the United States because of the need to reduce the impact of geographical distances.

Only in the past year or two have some of the larger US banks moved forward, trying to develop a more advanced system.

Commercial banks in the United States have been the biggest and most important banks in the world and have concentrated upon the more sophisticated areas of finance, rather than the basic payments systems that are the foundation of the whole financial system. And, although there have been efforts to advance the financial platforms of the American banks, it is somewhat ironic that several of the largest banks have moved toward quantum computers to revolutionize activities like risk management and trading.

Richard Waters writes about how JPMorgan Chase & Co., Goldman Sachs and Citigroup have entered this space in the last couple of years.

For example, Mr. Waters quotes Paul Burchard, a senior researcher at Goldman Sachs: "We think there's a possibility this becomes a critical technology."

And: "Despite the challenges, advances in quantum hardware have persuaded the banks the time has come to leap."

One can smile at this leap, but what about the basics of banking?

Here Mr. Thornhill writes that "The speed at which China has moved from a cash to a digital-payments economy is staggering: some $17 trillion of transactions were conducted online in 2017. China's mobile payment volumes are more than 50 times those in the US."

The growth has come from two corporate sources, Alibaba and Tencent. The number of users is staggering.

However, the biggest potential lies ahead. As Mr. Thornhill states, "the most enticing opportunities lie abroad. About 1.7 billion people in the world remain unbanked. When they come online they will be looking for cheap, convenient, integrated digital financial services, such as China has pioneered."

China has the chance to rewire 21st-century finance.

The implication here is that United States banks will have to adjust to this payment system that China is spreading to the rest of the world.

In other words, information spreads and even though the spread of information may be constrained in certain parts of the world, it will expand in the areas where there are fewer constraints. This is the way it has always worked throughout history. Quantum computing is currently not the answer for the US banking system.

Oh, yes, it will be fun to design new types of algorithms for quantum computers, as Mr. Waters writes, and the first of these involves a class of optimization problems that takes advantage of the probabilistic nature of quantum computing to analyze a large number of possible outcomes and pick the most desirable.

But, who is going to own the payments platform?

Mr. Thornhill believes that the trend in finance over the next decade will be led by the Chinese and the payments system that is being developed within China.

This has all sorts of implications for the US banking system, the US economy, and the US political system. A question coming from this conclusion concerns whether or not the US dollar can maintain its position within the world financial system.

When we start trying to insulate ourselves from the world and try and control little pieces of it for ourselves, we tend to lose our place in the bigger picture. This is just another one of the unintended consequences we find in the field of economics.

But it has huge implications for American banks and the United States banking system, and consequently for investors in the commercial banking industry. And it should be put within the context of what is happening right now in the United States.

I guess that banking in 2030 will not look at all like what is going on right now.

Disclosure: I/we have no positions in any stocks mentioned, and no plans to initiate any positions within the next 72 hours. I wrote this article myself, and it expresses my own opinions. I am not receiving compensation for it (other than from Seeking Alpha). I have no business relationship with any company whose stock is mentioned in this article.


Honeywell names Top 11 Innovations of 2019 – wingsmagazine.com

Honeywell published an online post of what it sees as the Top 11 breakthrough technologies that will shape the future, with a primary emphasis on aviation as well as the manufacturing and processes helping to drive the industry forward. The following Top 11 list was produced by Honeywell, with the company first describing what the innovation is and then why it will be impactful. Honeywell notes many of these technologies already had a major influence over the past year.

1. Power for air taxis

What: This was a major year for advancements in Urban Air Mobility (UAM), and air taxis will soon be a mode of transportation. This means the airspace will be more crowded than ever. A new Compact Fly-By-Wire system, used in traditional aircraft, has been redesigned for air taxis. It is about the size of a paperback book.

Why it's innovative: The compact computer system packs the brains of an aircraft's flight controls into one system. Operating as though the autopilot is always on, it brings agility, stability and safety to future electric vertical takeoffs and landings.

2. Surveillance cameras foresee buyer behavior

What: Security cameras, which traditionally monitor for theft, can now be used to help retailers make decisions about product displays, operating hours and staffing.

Why it's innovative: Surveillance systems can predict future trends by monitoring buyer behavior and store patterns. This comes in handy for retailers, who can analyze that data and influence how shoppers experience stores, ultimately boosting sales.

3. Access to Quantum Computing

What: This long-awaited technology goes from theory to impact with a new partnership with Microsoft's Azure Quantum that will give organizations around the world access to quantum computing through an open cloud system.

Why it's innovative: Quantum computing is a step closer to becoming a more common reality. Businesses and organizations will be able to use it to tackle problems they never would have attempted before.

4. Intelligent hearing protection

What: The VeriShield headset and cloud-based technology monitor the noise levels that workers are exposed to, providing real-time alerts when noise exceeds safe levels.

Why it's innovative: Managers can remotely monitor sounds affecting workers with a smartphone or mobile computer and alert employees to potential issues. The first-of-its-kind headset collects data on noise patterns and gives insights into long-term exposure. That helps companies develop an effective noise conservation program to protect workers' hearing.

5. Robotic cargo unloading

What: Robots can now unload tractor trailers full of inventory at distribution centers. The Robotic Unloader eliminates the need for people to work inside the heat of a tractor trailer, which can be strenuous and unsafe.

Why it's innovative: Artificial intelligence gets the job done without an operator. That improves safety, offsets shortages in staffing and minimizes damage to goods.

6. Predictive airplane maintenance

What: With Honeywell Forge for Airlines, software that combines individual aircraft and overall airline data into one dashboard, airlines can predict aircraft maintenance needs and fix parts before they break.

Why it's innovative: Because it's predictive and not just preventative, the technology helps reduce flight delays caused by unexpected repairs. That helps airlines maximize profits, improve efficiency and safety, and protect passengers.

7. Real-time data makes work more efficient

What: Most of today's global workforce does not work at a desk. These deskless workers in airports, hospitals and other industries often rely on clipboard methods to do their jobs. With Honeywell Forge technology, pen-and-paper methods can be replaced with mobile computers that input data immediately. Software analyzes that data and gives immediate insight.

Why it's innovative: Reducing the inefficient steps of inputting data from paper saves time and money. It also gives visibility into worker productivity and the ability to harness institutional knowledge, a key priority as workforces get older.

8. Digital twins get smart about maintenance

What: Businesses that depend on equipment can use digital twin technology to mirror the physical assets of a company. The digital version can use data from the physical equipment to predict machine availability, inefficient operations and maintenance needs.

Why it's innovative: The ability to predict maintenance can optimize efficiency. Now, instead of having to stop operations or shut down for maintenance, plants can protect uptime and save money.

9. Fast communication during emergencies

What: Every second counts in a crisis. Traditional emergency communications may include relatively slow paging or color-code signaling. Now, staff at hospitals, schools, airports and other high-density buildings can use the Command and Control Suite to customize communications between specific teams, based on the severity of the situation.

Why it's innovative: The Command and Control Suite provides enhanced facility visualization, enhanced map navigation and broader editing capabilities.

10. Virtual engineering and control

What: A new generation of control system technology (the hardware and software that operate industrial plants) no longer relies on sequential project flows. With Experion Process Knowledge System (PKS) Highly Integrated Virtual Environment (HIVE), the virtualization approach unchains controllers and control applications from physical equipment and shifts day-to-day management of servers to a centralized data center. This allows operators to make late changes without the risks and re-work traditionally inherent in them.

Why it's innovative: The technology simplifies control system design, implementation and lifecycle management. That enables plants to execute projects in less time, at lower cost and lower risk, while improving throughput, quality and operational reliability.

11. Machine learning to fight cyberattacks

What: Anomaly-detection algorithms immediately identify risks to systems in industrial control environments.

Why it's innovative: Detecting risk adds an additional layer of protection against cyberattacks. The algorithms analyze for risks that can be missed by common cybersecurity threat detectors, including threats like polymorphic malware, which changes constantly to avoid detection, and emerging types of threats. The system operates on real-time data to immediately identify new and emerging dangers to industrial control systems and the Industrial Internet of Things.


The Race To Find A Cure For Aging – Medium

We want to look & feel young again, and every year we spend hundreds of billions of dollars on beauty serums, cosmetic surgery, and exotic supplements in the hopes of appearing more vibrant, healthy, and desirable.

All of those products, procedures & pills only cover up the symptoms of aging; they do nothing to address the cause. While medicine does help us live longer, at best it has only slowed the ravages of time, and an aging population is driving demand for alternatives to the gradual decline into senescence.

Aging, once thought to be inevitable, is being challenged. For the first time in history, biomedical innovators are starting to view it through a disease model, not as an inevitability of life, and medical science is working to find a cure.

Here are three stories of people from different walks of life who share a singular goal: they're actively working to extend their own lifespans, and sharing what they've learned about how to achieve it:

Dr. David Sinclair says the solution is to get your NAD+ levels up, and he's offering detailed, practical advice on how to do it. In lengthy interviews with Joe Rogan & Rich Roll, as well as his recent book, he discusses the health benefits of intermittent fasting, limiting sugar & red meat, and eating plenty of vegetables, but for Sinclair, that's only the beginning.

Sinclair is an award-winning Australian biologist, professor of genetics, and Founding Director of the Paul F. Glenn Laboratory for the Biological Mechanisms of Aging at Harvard University.

His team of 30+ scientists is deeply engaged in studying the mechanisms involved in aging & senescence, and treatments to potentially reverse them. One of the promising life-extension supplements they've identified is Metformin, an inexpensive blood sugar medication that may extend the human lifespan by as much as 10%.

In addition to Metformin, Sinclair is bullish on the prospects of NMN (nicotinamide mononucleotide) for life extension. This vitamin B-3 derivative converts easily into NAD+ inside your cells, which is claimed to improve cellular function and offer the rejuvenating effects seen in human clinical trials.

Sinclair claims to have reversed aging in lab mice, and also claims to have knocked more than two decades off his biological age, as well as boasting online that he has the lung capacity, cholesterol and blood pressure of a young adult and the heart rate of an athlete.

If he's right, aging can be reversed with NAD+-boosting supplements, and that's a big step toward a cure for aging and the diseases that come with it.

Others, like Elizabeth Parrish, the CEO of BioViva Sciences, have taken a different route: she underwent experimental gene therapy to lengthen her telomeres & reduce muscle wasting back in 2016, and claims her health has improved since the treatment.

According to Wikipedia, independent testing by SpectraCell Laboratories revealed Elizabeth Parrish's leukocyte telomere length had been extended from 6.71kb to 7.33kb; in 2018, she reported further lengthening of her telomeres, up to 8.12kb, along with an overall growth in muscle mass.

A telomere is a region of repetitive nucleotide sequences at the end of each chromosome that protects it from damage. Telomeres get shorter as we age, leading to a variety of aging-related diseases. The initial 10% increase in Parrish's telomere length has been roughly compared to her cells becoming 20 years younger.

However, critics such as Dr. Bradley Johnson at the University of Pennsylvania have questioned her results, stating, "Telomere length measurements typically have low precision with variation in measurements of around 10 percent, which is in the range of the reported telomere lengthening apparently experienced by Elizabeth Parrish."
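A quick back-of-the-envelope check, using only the figures quoted in this article, makes the critics' point concrete: the first reported change is just under the roughly 10 percent measurement variation they cite.

```python
# Back-of-the-envelope check of the telomere figures quoted in the article.
initial_kb = 6.71    # 2016 baseline leukocyte telomere length
followup_kb = 7.33   # post-treatment measurement
later_kb = 8.12      # 2018 follow-up reported by Parrish

first_change = (followup_kb - initial_kb) / initial_kb * 100
total_change = (later_kb - initial_kb) / initial_kb * 100
print(f"first reported change: {first_change:.1f}%")   # ~9.2%, within the ~10% assay variation
print(f"total reported change: {total_change:.1f}%")   # ~21.0%
```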

Meet Jim Green, patient zero in a one-man experiment in radical anti-aging. He lacks the Sinclair team's funding and can't bioengineer retroviral delivery systems like the Parrish team, but what he lacks in budget he makes up for in courage, innovation & perseverance.

A few years back, Jim decided to tackle aging head-on, and started doing intense research into published scientific papers on aging, cellular senescence, and supplements. That research led him to a rigorous health regime that he claims has literally reversed his aging.

Jim's published a collection of links and notes to all of his papers online, and from talking with him personally several times I can tell you that he's been more than diligent about his research. Josh Mitteldorf also interviewed him recently, and in that interview Jim talked at length about his use of first a nutraceutical called TA-65 and later Astragalus Root Extract as a telomerase activator to give new life to old cells.

Jim has taken the hard road: consuming copious amounts of Astragalus extract along with countless other supplements, plus a daily exercise routine that's visibly reversed most signs of his aging, including his gray hair regaining its youthful color (no, he doesn't dye it; that's natural).

Rather than trying to hide the signs of aging with makeup or plastic surgery, innovators like Sinclair, Parrish & Green have taken action to turn back the clock in the hopes of not only living longer but also living better.

Sinclair has spoken numerous times about aging leading to a tragic loss of human capital & potential that up until now we've taken for granted, but if the research these innovators are pursuing bears fruit, then it may no longer be our inevitable fate.

Whatever the results of their experiments may ultimately be, their research alone is a testament to our shared desire to stop the sands of time from passing & make the most of every moment that we have.


OZY Takes You Ahead of the Curve in Science and Technology – OZY

As we approach the last days of 2019, OZY is proud to celebrate being first: first to bring you stories about scientific breakthroughs, life-changing tech and researchers working at the forefront of their fields. From virtual reality to robots, blockchain to breast cancer, science and tech are racing forward at a breakneck pace, and OZY is right there to keep you informed. Today we're devoting OZY's Daily Dose to recent articles in which we were ahead of the curve in science and technology.

As part of our Robots of Tomorrow series, we reported on how fitness firms are turning to artificial intelligence to offer affordable, personalized at-home training, relying on technological advances unavailable at the start of the decade.

Venturing farther into the health and wellness space, we introduced you to David Sinclair, a genetics professor at Harvard whose lab is working to develop a drug that interrupts the aging process, with an eye toward preventing age-related diseases such as cancer, dementia and osteoporosis. OZY was the first to show you a new fabric that promises to slim your body, and to consider what impact AI might have on reversing the climbing rates of suicide (which hit a 50-year high in the United States in 2017). Now, artificial intelligence, machine learning and natural language processing are spawning a growing number of startups that are tailoring mental health care to an individual's needs and circumstances in ways unimaginable just five years ago.

The next AI frontier? Academia. With evidence that graduation rates at U.S. universities have been plummeting for half a century, colleges are turning to artificial intelligence and data crunching to help turn the tide by using predictive tools to reach students and address their concerns faster, at times even before the students approach college authorities with their problems. AI has even found its way onto your plate via apps and personalization platforms that use artificial intelligence to give restaurant brands and their customers the option to customize their menus and food choices.

And for those who prefer to cook at home but don't relish the drive to the grocery store? OZY was the first to report on a growing number of designers working to bring the grocery store (or office or retail shop) to you. Think of it as a future where spaces for retail, play and work will deliver whatever you order, like autonomous cars, but bigger.

In our global coverage of science and tech, we wrote about China's turn to robot policing; Brazil, where leading researchers and academics are fleeing the nation in record numbers, hobbling the country's sciences while helping those abroad; and Togo, where entrepreneurial youth are using rudimentary engineering skills to develop printers, robots, computers and games, all from electronic waste.

As another year comes to a close, we celebrate the advances and innovations that science and technology make possible. There's much more to come in 2020, so stay tuned, OZY fans.


Quantum Computers Finally Beat Supercomputers in 2019 – Discover Magazine

In his 2013 book, Schrödinger's Killer App, Louisiana State University theoretical physicist Jonathan Dowling predicted what he called "super-exponential growth." He was right. Back in May, during Google's Quantum Spring Symposium, computer engineer Hartmut Neven reported the company's quantum computing chip had been gaining power at breakneck speed.

The subtext: We are venturing into an age of quantum supremacy, the point at which quantum computers outperform the best classical supercomputers in solving a well-defined problem.

Engineers test the accuracy of quantum computing chips by using them to solve a problem, and then verifying the work with a classical machine. But in early 2019, that process became problematic, reported Neven, who runs Google's Quantum Artificial Intelligence Lab. Google's quantum chip was improving so quickly that his group had to commandeer increasingly large computers, and then clusters of computers, to check its work. It's become clear that eventually, they'll run out of machines.

Case in point: Google announced in October that its 53-qubit quantum processor had needed only 200 seconds to complete a problem that would have required 10,000 years on a supercomputer.
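Taking those figures at face value, the claimed speedup is easy to put a number on; a quick back-of-the-envelope check:

```python
# Scale of the claimed speedup: 200 seconds on the 53-qubit chip versus an
# estimated 10,000 years on a classical supercomputer.
SECONDS_PER_YEAR = 365.25 * 24 * 3600          # 31,557,600 seconds
classical_seconds = 10_000 * SECONDS_PER_YEAR  # the 10,000-year estimate
speedup = classical_seconds / 200              # versus 200 quantum seconds
print(f"{speedup:.2e}")  # 1.58e+09, roughly 1.6 billion times faster
```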

Neven's group observed a double exponential growth rate in the chip's computing power over a few months. Plain old exponential growth is already really fast: it means that from one step to the next, the value of something multiplies. Bacterial growth can be exponential if the number of organisms doubles during an observed time interval. So can the computing power of classical computers under Moore's Law, the idea that it doubles roughly every year or two. But under double exponential growth, the exponents have exponents. That makes a world of difference: instead of a progression from 2 to 4 to 8 to 16 to 32 bacteria, for example, a double-exponentially growing colony in the same time would grow from 2 to 4 to 16 to 256 to 65,536.
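The gap between the two growth rates can be checked directly; a minimal sketch of the bacteria example above:

```python
# Exponential growth: the value doubles each step (2**n).
def exponential(steps):
    return [2 ** (n + 1) for n in range(steps)]

# Double exponential growth: the exponent itself grows exponentially (2**(2**n)).
def double_exponential(steps):
    return [2 ** (2 ** n) for n in range(steps)]

print(exponential(5))         # [2, 4, 8, 16, 32]
print(double_exponential(5))  # [2, 4, 16, 256, 65536]
```

After just five steps the double-exponential sequence is already three orders of magnitude ahead, which is why checking the quantum chip's work classically became untenable so quickly.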

Neven credits the growth rate to two factors: the predicted way that quantum computers improve on the computational power of classical ones, and quick improvement of quantum chips themselves. Some began referring to this growth rate as Neven's Law. Some theorists say such growth was unavoidable.

We talked to Dowling (who suggests a more fitting moniker: the Dowling-Neven Law) about double exponential growth, his prediction and his underappreciated Beer Theory of Quantum Mechanics.

Q: You saw double exponential growth on the horizon long before it showed up in a lab. How?

A: Anytime there's a new technology, if it is worthwhile, eventually it kicks into exponential growth in something. We see this with the internet, we saw this with classical computers. You eventually hit a point where all of the engineers figure out how to make this work, miniaturize it, and then you suddenly run into exponential growth in terms of the hardware. If it doesn't happen, that hardware falls off the face of the Earth as a nonviable technology.

Q: So you weren't surprised to see Google's chip improving so quickly?

A: I'm only surprised that it happened earlier than I expected. In my book, I said within the next 50 to 80 years. I guessed a little too conservatively.

Q: You're a theoretical physicist. Are you typically conservative in your predictions?

A: People say I'm fracking nuts when I publish this stuff. I like to think that I'm the crazy guy that always makes the least conservative prediction. I thought this was far-out wacky stuff, and I was making the most outrageous prediction. That's why it's taking everybody by surprise. Nobody expected double exponential growth in processing power to happen this soon.

Q: Given that quantum chips are getting so fast, can I buy my own quantum computer now?

A: Most of the people think the quantum computer is a solved problem. That we can just wait, and Google will sell you one that can do whatever you want. But no. We're in the [prototype] era. The number of qubits is doubling every six months, but the qubits are not perfect. They fail a lot and have imperfections and so forth. But Intel and Google and IBM aren't going to wait for perfect qubits. The people who made the [first computers] didn't say, "We're going to stop making bigger computers until we figure out how to make perfect vacuum tubes."

Q: Whats the big deal about doing problems with quantum mechanics instead of classical physics?

A: If you have 32 qubits, it's like you have 2^32 parallel universes that are working on parts of your computation. Or like you have a parallel processor with 2^32 processors. But you only pay the electric bill in our universe.
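Dowling's 2^32 figure is the size of the state space: an n-qubit register is described by 2^n complex amplitudes, which is also why checking a quantum chip's work classically becomes infeasible so fast. A rough illustration:

```python
# An n-qubit register is described by 2**n complex amplitudes; simulating it
# on a classical machine means storing every one of them explicitly.
def state_vector_size(n_qubits):
    return 2 ** n_qubits

print(state_vector_size(32))  # 4294967296 amplitudes for the 32-qubit example
# At 16 bytes per complex amplitude, that is already 64 GiB of memory:
print(state_vector_size(32) * 16 / 2 ** 30)  # 64.0
```

Each additional qubit doubles the memory needed, so a straightforward simulation of Google's 53-qubit chip would require petabytes.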

Q: Quantum mechanics gets really difficult, really fast. How do you deal with that?

A: Everybody has their own interpretation of quantum mechanics. Mine is the Many Beers Interpretation of Quantum Mechanics. With no beer, quantum mechanics doesn't make any sense. After one, two or three beers, it makes perfect sense. But once you get to six or 10, it doesn't make any sense again. I'm on my first bottle, so I'm in the zone.

[This story originally appeared in print as "The Rules of the Road to Quantum Supremacy."]

Read more from the original source:

Quantum Computers Finally Beat Supercomputers in 2019 - Discover Magazine

Quantum computing : Solving problems beyond the power of classical computing – Economic Times

Weather forecasting today is good. Can it get better? Sure it can, if computers get better. This is where quantum computers come into the picture. They possess computing capacity beyond anything that today's classical computers can ever achieve, because quantum computers can run certain calculations exponentially faster than today's conventional binary computers. That makes them powerful enough to bridge gaps that exist in today's weather forecasting, drug discovery, financial modelling and many other complex areas.

Classical computing has been the backbone of modern society. It gave us satellite TV, the internet and digital commerce. It put robots on Mars and smartphones in our pockets.

"But many of the world's biggest mysteries and potentially greatest opportunities remain beyond the grasp of classical computers," says Stefan Filipp, quantum scientist at IBM Research. "To continue the pace of progress, we need to augment the classical approach with a new platform, one that follows its own set of rules. That is quantum computing."

Classical computing is based on the binary system, where the fundamental carriers of information, bits, can take on a value of either 0 or 1.

All information is stored and read as a sequence of 0s and 1s. A state of 0 is off (or false) and a state of 1 is on (or true). Unlike bits, quantum bits, or qubits, can exist in superpositions of 0 and 1, enabling them to store different types of information.

Superposition and entanglement are two fundamental properties of quantum objects. The ability to manipulate these properties is what makes quantum algorithms fundamentally different from classical algorithms.
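Both properties can be made concrete with a toy state-vector calculation. This is a minimal sketch in plain Python (not any particular quantum library): a Hadamard gate puts the first of two qubits into superposition, and a CNOT gate then entangles the pair into a Bell state, where measurement yields 00 or 11 with equal probability and never 01 or 10.

```python
from math import sqrt

# Two-qubit state as four amplitudes, ordered [|00>, |01>, |10>, |11>].
state = [1.0, 0.0, 0.0, 0.0]  # start in |00>

# Hadamard on the first qubit: |00> -> (|00> + |10>)/sqrt(2), a superposition.
h = 1 / sqrt(2)
state = [h * (state[0] + state[2]), h * (state[1] + state[3]),
         h * (state[0] - state[2]), h * (state[1] - state[3])]

# CNOT with the first qubit as control: flips the second qubit when the
# first is 1, i.e. swaps the |10> and |11> amplitudes. The result is the
# entangled Bell state (|00> + |11>)/sqrt(2).
state = [state[0], state[1], state[3], state[2]]

# Measurement probabilities: the squared magnitude of each amplitude.
probs = [round(a * a, 3) for a in state]
print(probs)  # [0.5, 0.0, 0.0, 0.5]
```

The outcomes of the two qubits are now perfectly correlated even though neither qubit alone has a definite value, which is exactly the resource quantum algorithms manipulate.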

Quantum computers working with classical systems have the potential to solve complex real-world problems such as simulating chemistry, modelling financial risk and optimising supply chains.

For example, Exxon Mobil plans to use quantum computing to better understand catalytic and molecular interactions that are too difficult to calculate with classical computers. Potential applications include more predictive environmental models and highly accurate quantum chemistry calculations to enable the discovery of new materials for more efficient carbon capture.

JP Morgan Chase is focusing on use cases for quantum computing in the financial industry, including trading strategies, portfolio optimisation, asset pricing and risk analysis.

In India, the government has launched two initiatives in the emerging field a networked programme on Quantum Information Science and Technology (QuST) and the National Mission on Quantum Technologies & Applications (NMQTA).

Despite all the progress, practical, working quantum systems might take most of the 2020s to arrive. And you won't see or need a quantum machine on your desk. These will be used by governments and large enterprises, unless you want to find aliens or figure out and execute ways to boil the ocean while sitting at home.

This story is part of the 'Tech that can change your life in the next decade' package

View post:

Quantum computing : Solving problems beyond the power of classical computing - Economic Times

How This Breakthrough Makes Silicon-Based Qubit Chips The Future of Quantum Computing – Analytics India Magazine

Quantum computing has come a long way since its introduction in the 1980s. Researchers have always been on the lookout for better ways to enhance quantum computing systems, whether by making them cheaper or by making present-day quantum computers last longer. Now, alongside the latest advancements in superconducting qubits, a new way of improving silicon quantum computing has come to light, one that uses silicon spin qubits for better communication.

Until now, communication between different qubits was relatively slow: a message had to be passed from one neighbouring qubit to the next to reach a qubit at any appreciable distance on the chip.

Now, researchers at Princeton University have demonstrated two quantum computing components, known as silicon spin qubits, interacting across a relatively large distance. The study was published in the journal Nature on December 25, 2019.

Silicon spin qubits give quantum hardware the ability to interact and transmit messages across a certain distance, which opens up new capabilities. With signals transmitted over a distance, multiple quantum bits can be arranged in two-dimensional grids that perform more complex calculations than existing quantum hardware can manage. The study will help qubits communicate better, not only within a chip but also from one chip to another, which will have a massive impact on speed.

To take full advantage of quantum computing's capabilities, these machines need as many qubits as possible, all communicating effectively with each other. The quantum computers used by Google and IBM contain around 50 qubits and rely on superconducting circuits. Many researchers believe that silicon-based qubit chips are the future of quantum computing in the long run.

The quantum state of silicon spin qubits lasts longer than that of superconducting qubits, which is one of their significant advantages. In addition to lasting longer, silicon, which is used throughout everyday computers, is cheaper, another advantage over superconducting qubits, which cost a great deal of money: a single qubit runs around $10,000, and that's before research and development costs are considered. With these costs in mind, the hardware alone for a universal quantum computer would come to at least $10bn.
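Those two cost figures imply a rough machine size; this is a back-of-the-envelope reading of the quoted numbers, not a figure from the article itself:

```python
# At the quoted $10,000 per qubit, a $10bn hardware budget corresponds to
# roughly a million qubits (ignoring R&D and per-qubit control electronics).
cost_per_qubit = 10_000
hardware_budget = 10_000_000_000
print(hardware_budget // cost_per_qubit)  # 1000000
```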

But silicon spin qubits have their own challenges, stemming from the fact that they are incredibly small: each is made from a single electron. That tiny size is a huge obstacle when it comes to establishing interconnects between multiple qubits in a large-scale computer.

To counter the problem of interconnecting these extremely small silicon spin qubits, the Princeton team linked them with a wire that carries light, similar to the fibre-optic cables that deliver internet to homes. A photon travelling along this wire picks up the message from one qubit and transmits it to the next. To put the achievement in perspective, the half-centimetre separation between the qubits is, proportionally, like the qubits being around 750 miles apart in the everyday world.

The next step was to get the qubits and photons to speak the same language by tuning both qubits and the photon to the same frequency. Where previously the device's architecture allowed tuning only one qubit to one photon at a time, the team succeeded in tuning both qubits independently of each other while still coupling them to the photon.

"You have to balance the qubit energies on both sides of the chip with the photon energy to make all three elements talk to each other," says Felix Borjans, a graduate student and first author on the study, describing what he calls the challenging part of the work.

The researchers demonstrated the entangling of electron spins in silicon separated by distances greater than the devices housing them. This is a significant development for wiring these qubits and laying them out in silicon-based quantum microchips.

The communication between distant silicon-based qubit devices builds on the Petta research team's earlier work: a 2010 study showing how to trap a single electron in quantum wells; a 2012 paper in the journal Nature on transferring quantum information from electron spins; a 2016 paper in Science demonstrating the ability to transmit information from a silicon-based charge qubit to a photon; a 2017 paper in Science on nearest-neighbour exchange of information between qubits; and a 2018 paper in Nature showing that a silicon spin qubit can exchange information with a photon.

This demonstration of interactions between two silicon spin qubits is essential for the further development of quantum technology, and it will help enable technologies like modular quantum computers and quantum networks. The team employed silicon and germanium, both of which are widely available in the market.


Sameer is an aspiring Content Writer. Occasionally writes poems, loves food and is head over heels with Basketball.

More here:

How This Breakthrough Makes Silicon-Based Qubit Chips The Future of Quantum Computing - Analytics India Magazine