Unspinning the secrets of spider webs – Australian Geographic

By Esme Mathis, October 19, 2022

Stronger than steel and more elastic than rubber, spider silk has the potential to transform medicine, engineering, and materials science if only we learn how to produce it.

A new global study, involving University of New South Wales scientists, has analysed the silk properties of spiders across Oceania, Asia, Europe and the USA to better understand how this natural wonder can be emulated in future biomaterials. The research, published in Science Advances, catalogued the silk gene sequences of 1098 species from 76 families.

"Up until now, there was a pretty good literature set of how spider silk performs," says Dr Sean Blamires, an evolutionary ecological biologist from UNSW Sydney's School of Biological, Environmental and Earth Sciences. "But what has been lacking is a way to generalise across spiders and find out what causes specific properties. Is there a link between genes, protein structures and fibres?"

According to Sean, the large data set collected over five years allows scientists to create complex models, using machine learning to understand how and why specific silk properties vary between species, and even between individual spiders.

"Just like the Human Genome Project has given researchers the ability to identify specific gene sequence mutations that cause specific diseases, this database gives biologists and material scientists the ability to derive direct genetic causes for the properties of spider silk," he says.

There are seven types of spider silk, secreted from different glands within a spider. Out of these, dragline silk is the crowning glory. Known for its strength, durability and flexibility, dragline silk has captured scientists' imagination for decades with its tantalising potential.

"In a spiderweb, the dragline silk makes up the framework and the radials. It's also the silk that the spider uses when it drops off a web," says Sean. "Non-web building spiders might use it to make retreats or use it for signalling with each other, while trapdoor spiders use something very similar."

In Australia, the dragline silk produced by orb-weaving spiders is so tough that it outperforms Kevlar and steel. It's tough, but also flexible.

"Most materials are either one or the other," says Sean.

The study measured the mechanical, thermal, structural and hydration properties of dragline silks.

It's hoped this research will provide a blueprint for renewable, biodegradable and sustainable biopolymers. Suggested uses for this lightweight material range from bulletproof vests, flexible building materials and biodegradable bottles to a non-toxic biomaterial in regenerative medicine that can be used as a scaffold to grow and repair damaged nerves or tissues.

View original post here:
Unspinning the secrets of spider webs - Australian Geographic

A sound approach for effective gene therapy delivery to brain – The Source – Washington University in St. Louis

Researchers have been experimenting with different ways to deliver genes to the brain to treat central nervous system diseases and tumors. One of the obstacles, however, is penetrating the blood-brain barrier while having minimal effect on the other organs in the body.

Hong Chen, associate professor of biomedical engineering at the McKelvey School of Engineering and of radiation oncology at the School of Medicine, both at Washington University in St. Louis, and her team found an effective method to overcome that obstacle using focused ultrasound intranasal delivery (FUSIN). In new research, they found that the intranasally delivered gene therapy had comparable or better outcomes than existing methods while having minimal effect on the body's other organs.

Results of the research, led by Chen and Dezhuang Ye, a postdoctoral research associate, and collaborators, were published online in the journal eBioMedicine on Sept. 21. It is the first study to evaluate the potential of FUSIN to deliver adeno-associated viral vectors, small viruses used to deliver gene therapy, in a mouse model.

Read more on the engineering website.

Visit link:
A sound approach for effective gene therapy delivery to brain - The Source - Washington University in St. Louis

NIST Cloud Computing Program – NCCP | NIST

Cloud computing is a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction. This cloud model promotes availability and is composed of five essential characteristics (On-demand self-service, Broad network access, Resource pooling, Rapid elasticity, Measured Service); three service models (Cloud Software as a Service (SaaS), Cloud Platform as a Service (PaaS), Cloud Infrastructure as a Service (IaaS)); and, four deployment models (Private cloud, Community cloud, Public cloud, Hybrid cloud). Key enabling technologies include: (1) fast wide-area networks, (2) powerful, inexpensive server computers, and (3) high-performance virtualization for commodity hardware.
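
The NIST definition above is essentially a small taxonomy, and it can be captured directly in code. The following Python sketch is purely illustrative (the class names and the example workload are ours, not NIST's); it simply encodes the five essential characteristics, three service models and four deployment models listed in the definition.

```python
from dataclasses import dataclass
from enum import Enum

class ServiceModel(Enum):
    SAAS = "Cloud Software as a Service"
    PAAS = "Cloud Platform as a Service"
    IAAS = "Cloud Infrastructure as a Service"

class DeploymentModel(Enum):
    PRIVATE = "Private cloud"
    COMMUNITY = "Community cloud"
    PUBLIC = "Public cloud"
    HYBRID = "Hybrid cloud"

ESSENTIAL_CHARACTERISTICS = [
    "On-demand self-service",
    "Broad network access",
    "Resource pooling",
    "Rapid elasticity",
    "Measured service",
]

@dataclass
class CloudDeployment:
    """Describes one workload in terms of the NIST cloud model."""
    name: str
    service_model: ServiceModel
    deployment_model: DeploymentModel

# Example: an internal HR application consumed as SaaS from a public cloud.
hr_app = CloudDeployment("hr-portal", ServiceModel.SAAS, DeploymentModel.PUBLIC)
print(hr_app)
```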

The Cloud Computing model offers the promise of massive cost savings combined with increased IT agility. It is considered critical that government and industry begin adoption of this technology in response to difficult economic constraints. However, cloud computing technology challenges many traditional approaches to datacenter and enterprise application design and management. Cloud computing is currently being used; however, security, interoperability, and portability are cited as major barriers to broader adoption.

The long-term goal is to provide thought leadership and guidance around the cloud computing paradigm to catalyze its use within industry and government. NIST aims to shorten the adoption cycle, which will enable near-term cost savings and increased ability to quickly create and deploy enterprise applications. NIST aims to foster cloud computing systems and practices that support interoperability, portability, and security requirements that are appropriate and achievable for important usage scenarios.

Read the original here:

NIST Cloud Computing Program - NCCP | NIST

cloud-computing GitHub Topics GitHub

Here are 1,464 public repositories matching this topic...

Learn and understand Docker&Container technologies, with real DevOps practice!

A curated list of software and architecture related design patterns.

Pulumi - Universal Infrastructure as Code. Your Cloud, Your Language, Your Way

High-Performance server for NATS.io, the cloud and edge native messaging system.

A curated list of Microservice Architecture related principles and technologies.

Cloud Native application framework for .NET

A curated list of awesome services, solutions and resources for serverless / nobackend applications.

Cloud Native Control Planes

A comprehensive tutorial on getting started with Docker!

Rules engine for cloud security, cost optimization, and governance, DSL in yaml for policies to query, filter, and take actions on resources

Service Fabric is a distributed systems platform for packaging, deploying, and managing stateless and stateful distributed applications and containers at large scale.

Open, Multi-Cloud, Multi-Cluster Kubernetes Orchestration

The open-source data integration platform for security and infrastructure teams

Web-based Cloud Gaming service for Retro Game

A list of resources in different fields of Computer Science

Example of a cinema microservice

This repository consists of the code samples, assignments, and notes for the DevOps bootcamp of Community Classroom.

Awesome Cloud Security Resources

Source code accompanying book: Data Science on the Google Cloud Platform, Valliappa Lakshmanan, O'Reilly 2017

TFHE: Fast Fully Homomorphic Encryption Library over the Torus

Original post:

cloud-computing GitHub Topics GitHub

Benefits of Cloud Computing That Can Help You With Your Business 2023 – ReadWrite

Today, businesses are looking to operate more flexibly and cost-effectively. This has led to the rise of cloud computing as a viable solution for almost every business. Cloud computing uses a network of remote servers hosted on the Internet and accessible through standard web browsers or mobile apps.

It enables users to store data remotely, exchange files, and access software applications from anywhere with an internet connection. In addition, individuals and businesses that use the cloud can access their data from any computer or device connected to the internet, allowing them to sync their settings and files wherever they go.

There are many advantages to using cloud services for your business. Here are 10 benefits of cloud computing that can help you with your business.

One of the most significant benefits of cloud computing is its security. If you run your business on the cloud, much of the work of protecting your data from hackers and other threats is handled for you.

Cloud providers use industry-standard security practices to keep your data safe, including firewalls, encryption, and authentication systems.

You can further customize your business's security settings if your business uses a private cloud. For example, if an employee loses or misplaces a device that has access to your data, you can remotely disable that device without putting your data at risk.

You can also encrypt your data to protect it against cyber threats. Businesses can also use multi-factor authentication (MFA) to protect their data further. MFA requires users to input a one-time passcode sent to their phone to log in and confirm their identity.
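
As an illustration of how a one-time passcode of the kind described above is typically generated, here is a minimal sketch of the standard TOTP algorithm (RFC 6238) using only Python's standard library. It is a simplified example, not the code any particular cloud provider uses, and the secret shown is a placeholder.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Derive a time-based one-time passcode from a shared Base32 secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval            # 30-second time step
    msg = struct.pack(">Q", counter)                  # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Placeholder secret; in practice each user enrols their own secret with the MFA system.
print("Current passcode:", totp("JBSWY3DPEHPK3PXP"))
```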

Another advantage of cloud computing is its scalability. Cloud providers offer scalable cloud solutions that you can adjust to meet your business's needs.

You can scale up or down your system on demand to deal with seasonal traffic or unexpected spikes in usage. This allows you to avoid buying too much computing power and resources upfront and allows your business to adjust to changes in demand quickly.

You can also try out a cloud solution before you commit to it by renting a smaller instance for a trial period. Cloud solutions are also flexible enough for you to upgrade or downgrade your solutions as your business scales up or down.

This means that you don't have to buy more computing power than you need upfront, and you don't have to upgrade your systems again if your business starts to slow down.
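
The "scale up or down on demand" idea can be reduced to a very simple control loop. The sketch below is a toy illustration of threshold-based autoscaling logic; the thresholds and function names are invented for the example, and real cloud providers expose this as managed autoscaling services rather than code you write yourself.

```python
def desired_instance_count(current: int, cpu_utilisation: float,
                           scale_up_at: float = 0.75, scale_down_at: float = 0.25,
                           minimum: int = 1, maximum: int = 20) -> int:
    """Return how many instances to run, given average CPU utilisation (0.0-1.0)."""
    if cpu_utilisation > scale_up_at:
        target = current + 1          # add capacity for a traffic spike
    elif cpu_utilisation < scale_down_at:
        target = current - 1          # release capacity when demand drops
    else:
        target = current              # utilisation is in the comfortable band
    return max(minimum, min(maximum, target))

# Seasonal spike: 90% utilisation with 4 instances -> scale out to 5.
print(desired_instance_count(current=4, cpu_utilisation=0.9))
# Quiet period: 10% utilisation with 4 instances -> scale in to 3.
print(desired_instance_count(current=4, cpu_utilisation=0.1))
```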

Cloud computing can help you achieve greater flexibility and mobility if your business relies on people working remotely. With cloud solutions, you can access your data and run your applications from any computer or device connected to the internet.

When you can access all your data from anywhere, employees can work from home, in coffee shops, or other locations without sacrificing productivity. In addition, cloud providers offer a wide range of collaboration and communication tools that work with their services.

You can also use these tools to collaborate and communicate with clients and vendors who don't need access to your company's data.

Another advantage of cloud computing is its consistency. While different people and departments may use different devices and software, cloud solutions ensure everyone has a consistent experience.

This prevents miscommunications and ensures that everyone is on the same page. Whether you use Office 365, Google G Suite, Salesforce, or another cloud service, your business will have a consistent experience across platforms.

You can also use tools like identity integration to access information from different applications without switching between them.

Cloud solutions offer significant cost reductions over the long run compared to other IT solutions. You can save money on hardware, upgrades, and software licenses while enjoying a flexible and scalable solution.

Cloud providers handle all the maintenance and upgrade of their systems, so you dont have to worry about keeping up with the latest trends in IT.

If your business uses multiple cloud services, you can easily integrate them to streamline your workflows.

Many cloud services have a wide range of integrations with other services that you can use to enhance your business processes. For example, you can use Salesforce to manage your leads and close rates and Zapier to link it with other business tools like Gmail, Mailchimp, and Google Calendar.

You can also use a hybrid cloud solution that lets you keep your data close to home while accessing additional IT services through the cloud.

Cloud solutions offer virtually unlimited storage, unlike other data storage solutions such as on-premise computers. So while you can scale down your cloud solution if you don't need as much storage for your data, you can also increase your storage later.

You can also use a hybrid solution to keep some of your data local while storing other data in the cloud.

Another advantage of cloud computing is faster performance. If you use the cloud, you aren't limited by your hardware, and your systems are more scalable.

This means that your website and other business applications will perform faster without you having to make hardware upgrades.

You can also use a hybrid solution to improve your performance by keeping your most critical data close to home while accessing other data in the cloud.

Cloud solutions offer a collaborative online environment that lets you share important information with clients and vendors. You can use collaboration tools like wikis, blogs, and forums to work with team members and manage your projects.

You can also use collaboration tools to communicate with clients and vendors who don't need access to your company's data. These tools let you share documents, collaborate on tasks, and manage your workflow from a single platform.

Even though control is an essential aspect of a company's success, some things are simply out of your control, whether or not your organization manages its own procedures. In today's market, even a small amount of downtime has a significant impact.

Business downtime leads to lost productivity, revenue, and reputation. Although you can't prevent or foresee every catastrophe, there is something you can do to speed up your recovery. Cloud-based data recovery services provide quick recovery in emergency situations, such as natural disasters and electrical outages.

The range of benefits of cloud computing makes it a viable solution for almost every business. It offers many advantages that can help you streamline your workflow, achieve better performance, and operate more efficiently.

Suvigya Saxena is the Founder & CEO of Exato Software, a global mobile, cloud computing, and web app development company. With 15+ years of experience in IT, he is known for delivering out-of-the-box IT solutions across the domain.

Go here to see the original:

Benefits of Cloud Computing That Can Help You With Your Business 2023 - ReadWrite

The Top 5 Cloud Computing Trends In 2023 – Forbes

The ongoing mass adoption of cloud computing has been a key driver of many of the most transformative tech trends, including artificial intelligence (AI), the internet of things (IoT), and remote and hybrid working. Going forward, we can expect to see it becoming an enabler of even more technologies, including virtual and augmented reality (VR/AR), the metaverse, cloud gaming, and even quantum computing.

Cloud computing makes this possible by removing the need to invest in buying and owning the expensive infrastructure required for these intensive computing applications. Instead, cloud service providers make it available "as-a-service," running on their own servers and data centers. It also means companies can, to some extent, avoid the hassle of hiring or training a highly specialized workforce if they want to take advantage of these breakthrough technologies.

In 2023 we can expect to see companies continuing to leverage cloud services in order to access new and innovative technologies as well as drive efficiencies in their own operations and processes. Here's a rundown of some of the trends that I believe will have the most impact.

Increased investment in cloud security and resilience

Migrating to the cloud brings huge opportunities, efficiencies, and convenience but also exposes companies and organizations to a new range of cybersecurity threats. On top of this, the growing pile of legislation around how businesses can store and use personal data means that the risk of fines or (even worse) losing the trust of their customers is a real problem.

As a result, spending on cyber security and building resilience against everything from data loss to the impact of a pandemic on global business will become even more of a priority during the coming year. However, as many companies look to cut costs in the face of a forecasted economic recession, the emphasis is likely to be on the search for innovative and cost-efficient ways of maintaining cyber security in order to get the most "bang for the buck." This will mean greater use of AI and predictive technology designed to spot threats before they cause problems, as well as an increase in the use of managed security-as-a-service providers in 2023.

Multi-cloud is an increasingly popular strategy

If 2022 was the year of hybrid cloud, then 2023 could be the year that businesses come to understand the advantages of diversifying their services across a number of cloud providers. This is a strategy known as taking a multi-cloud approach, and it offers a number of advantages, including improved flexibility and security.

It also prevents organizations from becoming too tied in to one particular ecosystem - a situation that can create challenges when cloud service providers change the applications they support or stop supporting particular applications altogether. And it helps to create redundancy that reduces the chance of system errors or downtime from causing a critical failure of business operations.

Adopting a multi-cloud infrastructure means moving away from potentially damaging business strategies such as building applications and processes solely around one particular cloud platform, e.g., AWS, Google Cloud, or Microsoft Azure. The growing popularity of containerized applications means that in the event of changes to service levels, or more cost-efficient solutions becoming available from different providers, applications can be quickly ported across to new platforms. While back in 2020, most companies (70 percent) said they were still tied to one cloud service provider, reports have found that 84% of mid-to-large companies will have adopted a multi-cloud strategy by 2023, positioning it as one of the year's defining trends in cloud computing.

The AI and ML-powered cloud

Artificial intelligence (AI) and machine learning (ML) are provided as cloud services because few businesses have the resources to build their own AI infrastructure. Gathering data and training algorithms require huge amounts of computing power and storage space that are generally more cost-efficient to rent as-a-service. Cloud service providers are increasingly relying on AI themselves for a number of tasks. This includes managing the vast, distributed networks needed to provide storage resources to their customers, regulating the power and cooling systems in data centers, and powering cyber security solutions that keep their data safe. In 2023, we can expect to see continued innovation in this field as hyperscale cloud service providers like Amazon, Google, and Microsoft continue to apply their own AI technology to create more efficient and cost-effective cloud services for their customers.

Low-code and no-code cloud services

Tools and platforms that allow anybody to create applications and use data to solve problems, without getting their hands dirty writing computer code, are increasingly popular. This category of low-code and no-code solutions includes tools for building websites and web applications and for designing just about any kind of digital solution that companies may need. Low-code and no-code solutions are even becoming available for creating AI-powered applications, drastically lowering the barriers to entry for companies wanting to leverage AI and ML. Many of these services are provided via the cloud, meaning users can access them as-a-service without having to own the powerful computing infrastructure needed to run them themselves. Tools like Figma, Airtable, and Zoho allow users to carry out tasks that previously would have required coding experience, such as designing websites, automating spreadsheet tasks, and building web applications, and I see providing services like this as an area where cloud technology will become increasingly useful in 2023 and beyond.

Innovation and consolidation in cloud gaming

The cloud has brought us streaming services like Netflix and Spotify, which have revolutionized the way we consume movies, TV, and music. Streaming video gaming is taking a little longer to gain a foothold but is clearly on its way, with Microsoft, Sony, Nvidia, and Amazon all offering services in this field. It hasn't all been plain sailing, however: Google spent millions of dollars developing its Stadia streaming gaming service only to retire it this year due to lack of commercial success. One of the problems is the networks themselves: streaming video games clearly requires higher bandwidth than music or videos, meaning it's restricted to those of us with good quality high-speed internet access, which is still far from all of us. However, the ongoing rollout of 5G and other ultra-fast networking technologies should eventually solve this problem, and 2023 could be the year that cloud gaming makes an impact. Google has said that the technology that powers Stadia will live on as the backbone of an in-development B2B game streaming service that will allow game developers to provide streaming functionality directly to their customers. If, as many predict, cloud gaming becomes the killer app for 5G in the same way that streaming video was for 4G and streaming music was for 3G, then 2023 could be the year when we start to see things fall into place.

To stay on top of the latest business and tech trends, make sure to subscribe to my newsletter, follow me on Twitter, LinkedIn, and YouTube, and check out my books Tech Trends in Practice and Business Trends in Practice, which just won the 2022 Business Book of the Year award.

Visit link:

The Top 5 Cloud Computing Trends In 2023 - Forbes

Alibaba Invests in Cloud Computing Business With New Campus – ETF Trends

Chinese tech giant Alibaba Group Holding Ltd. has opened a new campus for its cloud computing unit, Alibaba Cloud, in its home city of Hangzhou. Per the South China Morning Post, which Alibaba owns, the 10-building, 2.1 million-square-foot campus is roughly the size of the 2 million-square-foot campus for Google's Silicon Valley headquarters, aka the Googleplex, in Mountain View, California.

Alibaba Cloud also highlighted the campus's eco-friendly designs in a video, including a photovoltaic power generation system, flowerpots made from recycled plastic, and high-efficiency, low-energy devices in the on-site coffee shop, according to SCMP.

The new campus signals the firm's commitment to investing in its growing cloud computing business. While Alibaba's net income dropped 50% year-over-year in the second quarter to 22.74 billion yuan ($3.4 billion), Alibaba Cloud experienced the fastest growth among all of Alibaba's business segments in Q2, making up 9% of total revenue.

The new facilities also come at a time when China's economy has been facing a slowdown. While China's economy is slowing down, Alibaba's cloud computing unit has been eyeing expansion opportunities overseas. For example, Alibaba Cloud announced last month a $1 billion commitment to upgrading its global partner ecosystem.

Alibaba is currently the third-largest holding in EMQQ Global's flagship exchange-traded fund, the Emerging Markets Internet & Ecommerce ETF (NYSEArca: EMQQ), with a weighting of 7.01% as of October 14. EMQQ seeks to offer investors exposure to the growth in internet and e-commerce activities in the developing world as middle classes expand and affordable smartphones provide unprecedentedly large swaths of the population with access to the internet for the first time, according to the issuer.

EMQQ tracks an index of leading internet and e-commerce companies that includes online retail, search engines, social networking, online video, e-payments, online gaming, and online travel.

For more news, information, and strategy, visit our Emerging Markets Channel.

Read more from the original source:

Alibaba Invests in Cloud Computing Business With New Campus - ETF Trends

Cloud Computing: A catalyst for the IoT Industry – SiliconIndia

Cloud computing is a great enabler for today's businesses for a variety of reasons. It helps companies, particularly small and medium enterprises, jumpstart their operations sooner as there is very little lead time needed to stand up a full-fledged in-house IT infrastructure. Secondly, it eases the financial requirements by avoiding heavy capex and turning the IT costs into an opex model. Even more advantageous is that the opex costs can be scaled up and down dynamically based on demand, thus optimizing IT costs.

I think Cloud computing became a catalyst for the IoT industry, and the proliferation that is seen today probably may not have happened in the absence of Cloud integration. Typically, IoT devices like sensors generate huge amounts of data that require both storage and processing, thus making Cloud platforms the perfect choice for building IoT-based solutions. In an IoT implementation, apart from data assimilation, there are some fundamental aspects like security, managing devices, etc. that need to be considered, and Cloud platforms take over some of these implementation aspects, enabling the solution provider to focus on the core problem.
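
As a concrete, if simplified, picture of the data path described above, the sketch below shows a sensor-side loop posting JSON telemetry to a cloud ingestion endpoint over HTTPS using only Python's standard library. The endpoint URL, device name and payload fields are hypothetical placeholders (the URL will not actually resolve); real IoT platforms typically use MQTT or vendor SDKs with device authentication, which are omitted here for brevity.

```python
import json
import time
import urllib.request

INGEST_URL = "https://example-iot-platform.invalid/v1/telemetry"  # hypothetical endpoint

def send_reading(device_id: str, temperature_c: float) -> None:
    """POST one JSON telemetry record to the cloud ingestion endpoint."""
    payload = json.dumps({
        "device_id": device_id,
        "temperature_c": temperature_c,
        "timestamp": int(time.time()),
    }).encode("utf-8")
    request = urllib.request.Request(
        INGEST_URL, data=payload,
        headers={"Content-Type": "application/json"}, method="POST",
    )
    with urllib.request.urlopen(request, timeout=5) as response:
        response.read()  # the platform would acknowledge and queue the record

if __name__ == "__main__":
    # One reading per minute; storage and processing happen on the cloud side.
    while True:
        send_reading("river-sensor-001", temperature_c=24.7)
        time.sleep(60)
```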

An interesting case study of how IoT and Cloud technologies can help to create innovative solutions was presented at a Microsoft conference a few years back. It's a solution developed to monitor the pollution levels in the Ganges, a project sponsored by the Central Pollution Control Board. For more information, readers can go to this link: https://azure.microsoft.com/en-us/blog/cleaning-up-the-ganges-river-with-help-from-iot/

Digital technology in the financial services

When we talk about disruptive digital technologies in the Financial Services industry, perhaps Blockchain is the one that stands out immediately. The concept of DLT (Distributed Ledger Technology) has been around for some time, and there's lots of interest in leveraging this technology primarily for transparency and efficiency reasons. After an article by the Reserve Bank of India in 2020, many Indian banks responded to this initiative by starting to look at opportunities that involve DLT. For example, State Bank of India tied up with JP Morgan to use their Blockchain technology.

Adoption of Blockchain could simplify Inter-bank payment settlement and perhaps could be extended in future to cross-border payment settlements across different DLT platforms. It could also be used for settlement of securitized assets by putting them on a common ledger. Another application is using DLT for KYC whereby multiple agencies (like banks) can access customer data from a decentralized and secure database. In fact, EQ uses Blockchain in its product offering to privately funded companies and PEs for Cap table management.

The next one is probably Artificial Intelligence (AI) and Machine Learning (ML), which are predominantly being applied in the Financial Services industry in managing internal and external risks. AI-based algorithms now underpin risk-based pricing in the Insurance sector and help reduce NPAs in the Banking sector. The technology helps banks predict defaults and take proactive measures to mitigate that risk.

In the Indian context, Unified Payments Interface (UPI) and Aadhar-enabled Payment Service (AePS) are classic examples of disruptive products in financial services industry.

Effective Network Security acts as a gatekeeper

In today's connected world, where much of the commerce happens online, it's imperative that businesses focus on security to safeguard themselves from threats in cyberspace. The recent approach to Network security is the Zero Trust model, which basically means never trust any user/device unless verified. In this model, mutual authentication happens between the two entities in multiple ways, for example using user credentials followed by a second factor like an OTP, and sometimes application authentication happens through a digital certificate. The process also uses analytics and log analysis to detect abnormalities in user behaviour and enforce additional authenticating measures while sending alerts at the same time. This is something many of us might have come across when we try to connect to an application from a new device that the application is not aware of. The security mechanism might enforce additional authentication whilst sending an alert to us. Nowadays, businesses also use innovative methods of authentication like biometrics, voice recognition, etc., and some of these are powered by AI/ML.
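
A minimal sketch of the "unknown device triggers extra authentication and an alert" behaviour described above might look like the following; the device registry, alerting function and policy strings here are invented for illustration and stand in for the analytics and log analysis a real Zero Trust platform would use.

```python
known_devices = {
    "alice": {"laptop-7F3A", "phone-19C2"},   # devices previously verified for this user
}

def send_alert(user: str, message: str) -> None:
    print(f"[alert to {user}] {message}")      # placeholder for a real notification channel

def handle_login(user: str, device_id: str) -> str:
    """Decide which authentication policy to apply for this login attempt."""
    trusted = device_id in known_devices.get(user, set())
    if trusted:
        return "password + OTP"                # normal multi-factor flow
    # Unfamiliar device: step up authentication and notify the user.
    send_alert(user, f"Sign-in attempt from unrecognised device {device_id}")
    return "password + OTP + email confirmation"

print(handle_login("alice", "laptop-7F3A"))    # trusted device -> standard MFA
print(handle_login("alice", "tablet-0001"))    # new device -> additional verification
```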

Fintech players leverage Artificial Intelligence to bridge the gap in MSME lending

I think MSME lending (maybe Retail Lending too) is one of the segments significantly disrupted by technology. In a way, it has opened unconventional options for MSMEs to attract capital, both for capex and working capital requirements. There are products ranging from P2P lending to Invoice Discounting offered by Fintech companies, which is opening up a new marketplace. There are Fintech players interested in lending in this space, and they use AI/ML models to predict the probability of default, assess credit risk, and appropriately hedge against it.
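
As a rough illustration of the kind of model such lenders might use, here is a minimal default-probability sketch built on scikit-learn's logistic regression. The features and the tiny synthetic dataset are invented for the example and bear no relation to any real credit model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy features: [monthly revenue (lakh INR), years in business, overdue invoices]
X = np.array([
    [12.0, 5, 0],
    [3.5, 1, 4],
    [20.0, 8, 1],
    [2.0, 2, 6],
    [8.0, 3, 2],
    [1.5, 1, 7],
])
y = np.array([0, 1, 0, 1, 0, 1])  # 1 = defaulted on a past loan

model = LogisticRegression()
model.fit(X, y)

applicant = np.array([[6.0, 2, 3]])
probability_of_default = model.predict_proba(applicant)[0, 1]
print(f"Estimated probability of default: {probability_of_default:.2f}")
```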

See the original post here:

Cloud Computing: A catalyst for the IoT Industry - SiliconIndia

Revitalising data and infrastructure management through cloud – ETCIO South East Asia

The Cloud has been a significant contributor to the digital optimisation and transformation of businesses and institutions globally since the 2010s. It seems almost an eternity ago, when the IT department was essentially a support function, with KRAs around design and delivery of Information Technology Architecture encompassing Infrastructure, Data Centres and constituent servers, personal computers, software, networking and security systems, along with the associated vendor evaluation, outsourcing, contracting, commissioning and finally aligning with business systems and goals, as this pre-millennium Research Gate paper indicates.

The one and a half decades since the advent of the millennium saw the rise of many trends besides the cloud, such as the shift from integrated to business-specific applications and the resulting data management and insights, globalisation, adoption of Infrastructure as a Service (IaaS), Platform as a Service (PaaS), the explosion of telecom, mobility and Mobile Backend as a Service (MBaaS), other technologies such as social media, E-commerce, Extended Reality, Digital Twins, AI/ML, RPA, Internet of Things, Blockchain and Chatbots and, lastly, the growing skill-gaps and demands of talent.

The cloud has now taken over a major chunk of responsibilities pertaining to infrastructure, data centre, hosting, SaaS and architectural applications, platforms, networking and security functions thus freeing up the IT and Business Teams for leveraging technology for more strategic tasks related to operations, customers, R&D, supply chain and others. The cloud hence enabled companies and institutions to leverage the amalgamation of technology, people and processes across their extended enterprises to have ongoing digital programmes for driving revenue, customer satisfaction, and profitability. The cloud can potentially add USD 1 Trillion of Economic Value across the Fortune 500 band of companies by 2030, as this research by McKinsey estimates.

Even before the pandemic, although the initial adoption of Cloud was in SaaS applications, IaaS and PaaS were surely catching up, thus shifting away from the traditional Data Centre and On-premise Infrastructure. Gartner's research way back in 2015 predicted a 30%-plus increase in IaaS spending, with public cloud IaaS workloads finally surpassing those of on-premise loads. In the same year, Gartner's similar paper highlighted significant growth in PaaS as well: both for Infrastructure and Application iPaaS.

The Cloud is adding significant value across Industry verticals and business functions, right from remote working with online meetings & collaboration tools, automated factory operations, extended reality, digital twins, remote field services and many others. The cloud has also been adopted as the platform for deploying other new technologies such as RPA and Artificial Intelligence/Machine Learning (AI/ML). Depending on industry best practices, business use cases and IT strategies, it became feasible to leverage infrastructure, assets, applications and software in a true Hybrid/Multi/Industry Cloud scenario with separate private and public cloud environments covering IaaS, PaaS, SaaS and MBaaS. As Platforms were maturing, organisations were furthermore transitioning from Virtual Machine and even from IaaS based solutions to PaaS based. Gartner had predicted in this research that by 2021, over 75% of enterprises and mid-sized organisations would adopt a hybrid or multi-cloud strategy.

There was also a clear transition from the traditional lift and shift to the cloud native approach, which makes full use of cloud elasticity and optimisation levers and moreover minimises technical debt and inefficiencies. This approach makes use of cloud computing to build and run microservices based scalable applications running in virtualised containers, orchestrated in the Container-as-a-service applications and managed and deployed using DevOps workflows. Microservices, container management, infrastructure as a code, serverless architectures, declarative code and continuous integration and delivery (CI/CD) are the fundamental tenets of this cloud native approach. Organisations are balancing use of containerization along with leveraging the cloud hosting provider capabilities especially considering the extent of hybrid cloud, efforts and costs of container infrastructure and running commodity applications.
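
To make the microservices idea concrete, here is a minimal stateless Python service of the kind typically packaged into a container and managed by an orchestrator. The Flask framework and the endpoint names are our illustrative choices, not a prescription from the article.

```python
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/health")
def health():
    # Orchestrators poll an endpoint like this to decide whether to route
    # traffic to this instance or replace it.
    return jsonify(status="ok"), 200

@app.route("/orders/<order_id>")
def get_order(order_id: str):
    # A real microservice would call its own datastore here; the payload is
    # a placeholder so the example stays self-contained.
    return jsonify(order_id=order_id, status="processing"), 200

if __name__ == "__main__":
    # Bind to 0.0.0.0 so the service is reachable from outside its container.
    app.run(host="0.0.0.0", port=8080)
```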

From the architecture standpoint, cloud-based composable architectures such as MACH (Microservices-based, API-first, Cloud-native SaaS and Headless) and Packaged Business Capabilities (PBCs) are increasingly being used in organisations for enhancing Digital Experience Platforms, enabling customers, employees and the supply chain with the new age Omnichannel experience. These architectures facilitate faster deployment and time to market through quick testing by sample populations and subsequent full-fledged implementations. These composable architectures help organisations in future-proofing their IT investments, and improve business resilience and recovery with the ability to decouple and recouple the technology stacks. At the end of the 1st year of the pandemic, Gartner here highlighted the importance of composable architecture in its Hype Cycle of 2021, especially in business resilience and recovery during a crisis.

Intelligently deploying Serverless Computing in the architecture also enhances cloud native strategies immensely, thus enabling developers to focus on triggers and running function/event-based computing, also resulting in more optimised cloud economics. Also, access to the cloud service provider's Function-as-a-Service (FaaS) and Backend-as-a-Service (BaaS) models significantly reduces the IT environment transformation costs. This Deloitte Research illustrates the advantages that Serverless computing can bring about to retail operations.
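
The FaaS model mentioned above boils down to writing a single function that the platform invokes per event. The sketch below follows the handler signature used by AWS Lambda's Python runtime; the event shape and business logic are invented for illustration, and other FaaS providers use similar but not identical conventions.

```python
import json

def handler(event, context):
    """Entry point invoked by the FaaS platform for each event.

    `event` carries the trigger payload (an HTTP request, a queue message,
    an object-storage notification, ...); `context` carries runtime metadata.
    """
    body = json.loads(event.get("body") or "{}")
    name = body.get("name", "world")
    # No servers to manage: the platform provisions, scales and bills per invocation.
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```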

To enhance their cloud native strategies, to further encourage citizen development, reducing over reliance on IT and bridging the IT-Business Gap, organisations are also making use of Low Code No Code (LCNC) tools and assembling, clearly shifting from application development. Citizen developers are making use of LCNC functionalities such as Drag and drop, pre-built User Interfaces, APIs and Connectors, one-click delivery and others to further augment their containerisation and microservices strategies. This Gartner Research predicts that 70% of new applications developed by organisations will use LCNC by 2025, well up from less than 25% in 2020.

Infrastructure and Data Management in the cloud are being immensely powered up by Automation and Orchestration, especially in minimising manual effort and errors in processes such as provisioning, configuring, sizing and auto-scaling, asset tagging, clustering and load balancing, performance monitoring, deploying, DevOps and CI/CD testing and performance management. Further efficiencies are brought to fruition through automation, especially in areas such as shutting down unutilised instances, backups, workflow version control, and establishing Infrastructure as Code (IaC). This hence further value-adds to a robust cloud native architecture by enhancing containerisation, clustering, network configuration, storage connectivity, load balancing and managing the workload lifecycle, besides highlighting vulnerabilities and risks. Enterprises pursuing hybrid cloud strategies are hence driving automation in private clouds as well as integrating with public clouds by creating automation assets that perform resource codification across all private and public clouds and offer a single API. This McKinsey research highlights that companies that have adopted end-to-end automation in their cloud platforms and initiatives report a 20-40% increase in the speed of releasing new capabilities to market. A similar report by Deloitte mentions that intelligent automation in the cloud enables scale in just 4-12 months, compared to the earlier 6-24 months period, through streamlined development and deployment processes.
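
One of the automation wins mentioned above, shutting down unutilised instances, can be scripted in a few lines. The sketch below uses AWS's boto3 SDK purely as an example; the tag name, region and the assumption that credentials come from the environment are ours, and equivalent APIs exist for other providers.

```python
import boto3

ec2 = boto3.client("ec2", region_name="ap-south-1")  # region chosen for illustration

def stop_tagged_dev_instances() -> list:
    """Stop running EC2 instances tagged environment=dev (e.g. outside office hours)."""
    response = ec2.describe_instances(
        Filters=[
            {"Name": "tag:environment", "Values": ["dev"]},
            {"Name": "instance-state-name", "Values": ["running"]},
        ]
    )
    instance_ids = [
        instance["InstanceId"]
        for reservation in response["Reservations"]
        for instance in reservation["Instances"]
    ]
    if instance_ids:
        ec2.stop_instances(InstanceIds=instance_ids)
    return instance_ids

if __name__ == "__main__":
    stopped = stop_tagged_dev_instances()
    print(f"Stopped {len(stopped)} idle development instance(s): {stopped}")
```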

CIOs are also increasingly turning to distributed cloud models to address edge or location-based cloud use cases, especially across Banks and Financial Institutions, Healthcare, Smart Cities, and Manufacturing. It is expected that decentralised and distributed cloud computing will move from the initial private cloud substation deployments to eventually Wi-Fi-like distributed cloud substation ecosystems, especially considering the necessary availability, bandwidth and other operational and security aspects.

These rapid developments in the cloud ecosystems especially for hybrid and multi cloud environments have necessitated infrastructure and data management to encompass dashboards for end-to-end visibility of all the cloud resources and usage across providers, business functions, and departments. Governance and Compliance, Monitoring, Inventory Management, Patches and Version Control, Disaster Recovery, Hybrid Cloud Management Platforms (HCMP), Cloud Service Brokers (CSB) and other Tools aid companies in better Infrastructure Management in the Cloud, while catering to fluctuating demands and corresponding under and over utilisation scenarios, while continuously identifying pockets for optimisation and corrections. For companies with customers spread across diverse geographies, it is important to have tools for infrastructure management, global analytics, database engines and application architectures across these global Kubernetes clusters and Virtual Machines.

The vast increase in attack surfaces and potential breach points has necessitated CIOs and CISOs incorporating robust security principles and tools within the Cloud Native ecosystem itself, through Cloud Security Platforms such as Cloud Access Security Broker (CASB), Cloud Security Posture Management (CSPM), Secure Access Service Edge (SASE), DevSecOps and the incorporation of AI and ML in their proactive threat hunting and response systems. This is also critical in adhering to Governance, Risk and Compliance (GRC) and Regulatory compliances, in line with Zero Trust Architecture and Cyber Resilient frameworks and strategy. This McKinsey article highlights the importance of Security as Code (SaC) in cloud native strategies and its reliance on architecture and the right automation capabilities.

This EY article highlights the importance of cybersecurity in cloud native strategies as well as the corresponding considerations in processes, cyber security tools, architecture, risk management, skills and competencies and controls. Data Encryption and Load Protection, Identity and Access Management, Extended Detection and Response (XDR), Security Information and Event Management (SIEM), and Security Orchestration, Automation and Response (SOAR) tools that incorporate AI/ML capabilities ensure a more proactive rather than reactive response. Considering the vast information to ingest, store and analyse, organisations are also considering/deploying Cyber Data Lakes either as alternatives to or in conjunction with their SIEM ecosystems.

There is an increasing popularity of Financial Operations (FinOps), which is helping organisations to gain the maximum value from the cloud through the cross-functional involvement of business, finance, procurement, supply chain, engineering, DevOps/DevSecOps and cloud operational teams. Augmented FinOps has been listed by Gartner in the 2022 Hype Cycle for emerging technologies here. FinOps immensely value-adds to infrastructure and data management in the cloud through dynamic and continuous sourcing and managing of cloud consumption, demand mapping, and crystallising the Total Cost of Ownership and Operations with end-to-end cost visibility and forecasting to make joint decisions and monitor comprehensive KPIs. Besides the cloud infrastructure management strategies listed in this section, FinOps also incorporates vendor management strategies and leveraging cloud carbon footprint tools for their organisation's Sustainability Goals.

What about Data Management through the Cloud?

The 2nd half of the 2010s and especially the COVID-19 period have also resulted in an explosion of IoT, social media, E-commerce and other Digital Transformation. This has made organisations deal with diverse data sources residing on cloud, on-premise and on the edge, diversity in data sets across sensor, text, image, Audio-Visual, Voice, E-commerce, social media and others, and the volume of data that is now required to be ingested, managed and delivered in real-time and batch mode. Even before the pandemic, this explosion of unstructured data necessitated companies to leverage Hadoop and other Open Source based Data Lakes besides their structured data residing in Data Warehouses. According to this Deloitte Article, for their surveyed CXOs, Data Modernisation is an even more critical aspect than cost and performance considerations for migrating to the cloud.

This research by Statista estimated that the total worldwide data amount rose from 9 Zettabytes in 2013 to over 27 Zettabytes in 2021, and it is predicted to grow to well over 180 Zettabytes in 2025. Decentralised and distributed cloud computing, Web 3.0, the Metaverse and the rise in Edge Computing will further contribute to this data growth.

Many organisations are looking at the Cloud as the future of data management, as this article by Gartner states. As the cloud encompasses more and more data sources, it becomes more pivotal for data architects to have a deeper understanding of metadata and schema; the end-to-end data lifecycle pipeline of ingestion, cleaning, storage, analysis, delivery and visualisation; APIs; cloud automation and orchestration; Data Streaming; AI/ML models; Analytics; Data Storage and Visualisation; as well as Governance and Security.

Data Architects are hence leveraging cloud computing in their strategies, including scaling, elasticity and decoupling, ensuring high availability and optimal performance in relation to bursts and shutdowns, while optimising cost at the same time. The Data Teams are also deploying Automated and Active Cloud Data Management covering classification, validation and governance with extensibility and decoupling. There is also careful consideration for ensuring security for data at rest and in motion, as well as seamless data integration and sharing.

It is also important to choose the right Metadata Strategy and consider options of tiered apps with push-based services, pull-based ETL, and event-based Metadata. It is also worthwhile to stress the importance of having a robust Data Architecture/DataOps culture as well, especially considering business and technology perspectives of the end-to-end data lifecycle right from data sources and ingestion, metadata and active data management, streaming, storage, analytics and visualisation. Deploying elasticity, AI/ML and automation brings about immense benefits to cloud native strategies.
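
A stripped-down version of the ingest-clean-store step in that lifecycle might look like the following pandas sketch. The file paths, column names and validation rules are placeholders, it assumes a parquet engine such as pyarrow is installed, and a production pipeline would add schema checks, metadata capture and orchestration around it.

```python
import pandas as pd

def ingest_clean_store(source_path: str, destination_path: str) -> int:
    """Read raw records, drop obviously bad rows, and persist a columnar copy."""
    raw = pd.read_csv(source_path)                             # ingestion
    cleaned = raw.dropna(subset=["customer_id", "amount"])     # basic cleaning
    cleaned = cleaned[cleaned["amount"] >= 0]                  # simple validation rule
    cleaned.to_parquet(destination_path, index=False)          # storage for analytics
    return len(cleaned)

if __name__ == "__main__":
    rows = ingest_clean_store("raw/transactions.csv", "curated/transactions.parquet")
    print(f"{rows} validated rows written to the curated zone")
```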

Considering these aspects of Data Management, organisations have been looking at ML- and API-powered Data Fabrics along with Data Lakes, Warehouses and Layers to manage this end-to-end data lifecycle by creating, maintaining and providing outputs to the consumers of this data, as this Gartner article on technology trends for 2022 highlights.

This article by McKinsey summarises the major pivot points in the data architecture ethos which are fundamentally based on Cloud with containerization and serverless data. These cover hybrid real time and batch data processing, shift from end-to-end COTS applications to modular best in function/ industry, move to APIs and decoupling, shift from centralised Data Warehousing to domain-based architecture and lastly from proprietary predefined datasets to data schema that is light and flexible, especially the NoSQL family.

For BFSI, Telecoms and other industry verticals which need customer data to reside locally, CXOs have been deploying Hybrid Data Management environments, that leverage Cloud Data Management tools to also automate, orchestrate, and re-use the on-premise data, thus providing a unified data model and access interface to both cloud and on-premise datasets.

Application of Automation and Orchestration in Data Storage also ensures prioritisation of processes, tasks and resources to balance speed, efficiency, usage and cost along with eliminating security vulnerabilities. This is especially applicable for tasks such as provisioning and configuration, capacity management, workflows and data migration, resource optimisation, software updates and data protection and disaster recovery. This World Economic Forum report right before the pandemic highlighted the fact that conventional optical/magnetic storage systems will be unable to handle this phenomenon for more than a century. CIOs and Leaders are hence leveraging automation and cloud, Storage-as-a-Service (STaaS), decentralised Blockchain-powered data storage and storage on the Edge, besides alternatives to conventional electromagnetic/optical data storage mechanisms.

What is the role of people and culture in this cloud powered data and infrastructure management ecosystem?

People, talent pool and organisation culture play a pivotal part in successful FinOps, cloud native and cloud data management strategies. In this dynamic and uncertain world, it is of paramount importance to have uniformity, alignment and resonance of business KPIs with best practices for Enterprise and Data Architecture and DevOps, as well as those of Engineering, Finance, and Procurement. This environment of continuous evolution and optimisation can only be brought about by an ethos of Communication, Trust, Change Management and Business-Finance-IT alignment, which are equally important to cloud native strategies, Architecture, DevOps, DataOps, Security and other Engineering Talent Pools.

The continuing trends of the Great Resignation, Quiet Quitting and Moonlighting necessitate a combination of having the best employee and vendor engagement strategies, a readily available talent pool of architects, analysts, engineers and other skillsets, as well as upskilling.

Winding up?

The Cloud has deeply impacted and revitalised Infrastructure and Data Management in all aspects in the workplace. As per this Deloitte research, it is ideal to leverage an equal mix of people, tools and approaches to address cloud complexity, and have a powerful, agile, elastic, secure and resilient virtual business infrastructure deriving maximum value from the cloud.

Cloud-centric digital infrastructure is a bedrock in the post COVID world, aligning technology with business to support digital transformation, resilience, governance along with business outcomes through a combination of operations, technology and deployment as mentioned in this IDC paper. This is so important in the increasing complexity of todays world across Public Cloud Infrastructure, On-Premises and on the Edge.

With continuing business uncertainty, competitiveness, customer, supplier and employee pressures and stringent IT budgets, organisations are looking at the Cloud to revitalise their Infrastructure and Data Management and gain maximum value.

See the article here:

Revitalising data and infrastructure management through cloud - ETCIO South East Asia

High-Performance Computing (HPC) Market is expected to generate a revenue of USD 65.12 Billion by 2030, Globally, at 7.20% CAGR: Verified Market…

The growing demand for high-efficiency computing across a range of industries, including financial, medical, research, government, and defense, as well as geological exploration and analysis, is a significant growth driver for the HPC Market.

JERSEY CITY, N.J., Oct. 17, 2022 /PRNewswire/ -- Verified Market Research recently published a report, "High-Performance Computing (HPC) Market" By Component (Solutions, Services), By Deployment Type (On-Premise, Cloud), By Server Price Band (USD 250,000–500,000 And Above, USD 100,000–250,000 And Below), By Application Area (Government And Defense, Education And Research), and By Geography.

According to the extensive research done by Verified Market Research experts, the High-Performance Computing (HPC) Market size was valued at USD 34.85 Billion in 2021 and is projected to reach USD 65.12 Billion by 2030, growing at a CAGR of 7.20% from 2023 to 2030.
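
For readers who want to sanity-check the headline numbers, the compound annual growth rate is just the geometric mean of year-on-year growth. The short sketch below applies the standard formula to the 2021 and 2030 figures quoted above; the exact rate depends on which base year the report uses, and it lands close to the quoted 7.20%.

```python
def cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate over `years` periods."""
    return (end_value / start_value) ** (1 / years) - 1

# Figures quoted in the press release: USD 34.85 Billion (2021) -> USD 65.12 Billion (2030).
rate = cagr(34.85, 65.12, years=2030 - 2021)
print(f"Implied CAGR: {rate:.2%}")   # roughly 7.2% per year
```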

Download PDF Brochure: https://www.verifiedmarketresearch.com/download-sample/?rid=6826

Browse in-depth TOC on "High-Performance Computing (HPC) Market"

202 Pages, 126 Tables, 37 Figures

Global High-Performance Computing (HPC) Market Overview

High-performance computing is the use of parallel processing to efficiently, consistently, and quickly run big software programmes. High-Performance Computing is a technique that makes use of a sizeable amount of computing power to offer high-performance capabilities for resolving various issues in the fields of engineering, business, and research. HPC systems refer to all types of servers and micro-servers used for highly computational or data-intensive applications. High-performance computing systems are those that can do 10¹² floating-point operations per second, or one teraflop.

One of the main factors influencing the growth of the High-Performance Computing (HPC) Market is the ability of HPC solutions to swiftly and precisely process massive volumes of data. The increasing demand for high-efficiency computing across a range of industries, including financial, medical, research, exploration and study of the earth's crust, government, and defense, is one of the primary growth factors for the high-performance computing (HPC) market.

The rising need for high precision and quick data processing in various industries is one of the major drivers of the High-Performance Computing (HPC) Market. The market will also grow exponentially over the course of the anticipated period as a result of the growing popularity of cloud computing and the government-led digitization initiatives.

The usage of HPC in cloud computing is what is driving the worldwide high-performance computing (HPC) market. Utilizing cloud computing platforms has a number of benefits, including scalability, flexibility, and availability. Cloud HPC offers a number of benefits, including low maintenance costs, adaptability, and economies of scale.

Additionally, HPC in the cloud gives businesses that are new to high-end computing the chance to form a larger community, enabling them to profit from cheap operating expenditure (OPEX) and overcome the challenges of power and cooling. As a result, it is anticipated that the growing usage of HPC in the cloud would significantly influence the High-Performance Computing (HPC) Market.

Key Developments

Partnerships, Collaborations, and Agreements

Mergers and Acquisitions

Product Launches and Product Expansions

Key Players

The major players in the market are Advanced Micro Devices Inc., Hewlett Packard Enterprise, Intel Corporation, International Business Machines (IBM) Corporation, NEC Corporation, Sugon Information Industry Co. Ltd, Fujitsu Ltd, Microsoft Corporation, Dell Technologies Inc., Dassault Systemes SE, Lenovo Group Ltd, Amazon Web Services, and NVIDIA Corporation.

Verified Market Research has segmented the Global High-Performance Computing (HPC) Market On the basis of Component, Deployment Type, Server Price Band, Application Area, and Geography.

Browse Related Reports:

Enterprise Quantum Computing Market By Component (Hardware, Software, Services), By Application (Optimization, Simulation And Data Modelling, Cyber Security), By Geography, And Forecast

Healthcare Cognitive Computing Market By Technology (Natural Language Processing, Machine Learning), By End-Use (Hospitals, Pharmaceuticals), By Geography, And Forecast

Cognitive Computing Market By Component (Natural Language Processing, Machine Learning, Automated Reasoning), By Deployment Model (On-Premise, Cloud), By Geography, And Forecast

Cloud Computing In Retail Banking Market By Product (Public Clouds, Private Clouds), By Application (Personal, Family, Small and Medium-Sized Enterprises)

Top 10 Edge Computing Companies supporting industries to become 100% self-reliant

Visualize the High-Performance Computing (HPC) Market using Verified Market Intelligence:

Verified Market Intelligence is our BI Enabled Platform for narrative storytelling in this market. VMI offers in-depth forecasted trends and accurate Insights on over 20,000+ emerging & niche markets, helping you make critical revenue-impacting decisions for a brilliant future.

VMI provides a holistic overview and global competitive landscape with respect to Region, Country, Segment, and Key players of your market. Present your Market Report & findings with an inbuilt presentation feature saving over 70% of your time and resources for Investor, Sales & Marketing, R&D, and Product Development pitches. VMI enables data delivery In Excel and Interactive PDF formats with over 15+ Key Market Indicators for your market.

About Us

Verified Market Research is a leading Global Research and Consulting firm servicing over 5000+ customers. Verified Market Research provides advanced analytical research solutions while offering information enriched research studies. We offer insight into strategic and growth analyses, Data necessary to achieve corporate goals and critical revenue decisions.

Our 250 Analysts and SMEs offer a high level of expertise in data collection and governance and use industrial techniques to collect and analyze data on more than 15,000 high-impact and niche markets. Our analysts are trained to combine modern data collection techniques, superior research methodology, expertise and years of collective experience to produce informative and accurate research.

We study more than 14 categories, including Semiconductor & Electronics, Chemicals, Advanced Materials, Aerospace & Defense, Energy & Power, Healthcare, Pharmaceuticals, Automotive & Transportation, Information & Communication Technology, Software & Services, Information Security, Mining, Minerals & Metals, Building & Construction, Agriculture, and Medical Devices, across more than 100 countries.

Contact Us

Mr. Edwyne Fernandes
Verified Market Research
US: +1 (650)-781-4080
UK: +44 (753)-715-0008
APAC: +61 (488)-85-9400
US Toll Free: +1 (800)-782-1768
Email: [emailprotected]
Web: https://www.verifiedmarketresearch.com/
Follow Us: LinkedIn | Twitter


SOURCE Verified Market Research

Continued here:

High-Performance Computing (HPC) Market is expected to generate a revenue of USD 65.12 Billion by 2030, Globally, at 7.20% CAGR: Verified Market...

7 Best Quantum Computing Stocks to Buy in 2022 | InvestorPlace

Quantum computing offers the potential to harness big data, make intricate predictions and use artificial intelligence (AI) to revolutionize business operations. Many industries such as automotive, agriculture, finance, healthcare, energy, logistics and space will be affected from the growth in this technology. As a result, Wall Street has been paying significant attention to quantum computing stocks.

Once considered science fiction, quantum computing has made significant progress in recent years to solve complex problems at lightning speed. This advanced technology uses the power of quantum mechanics to represent complex problems. These computers can take seconds to calculate equations that normally take days for machines that use a binary framework.

International Data Corporation forecasts that the global market for quantum computing should grow from about $412 million in 2020 to more than $8.5 billion in 2027. This increase would mean a compound annual growth rate (CAGR) of an eye-popping 50% between now and 2027. Given such metrics, it's understandable why investors are thrilled about the future of quantum computing stocks.
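
As a quick sanity check on those figures, the implied growth rate can be recomputed directly from IDC's two endpoints. The short Python sketch below is illustrative only and uses just the numbers quoted above.

```python
# Recompute the compound annual growth rate (CAGR) implied by IDC's forecast:
# roughly $412 million in 2020 growing to more than $8.5 billion in 2027.
start_value = 0.412      # USD billions, 2020
end_value = 8.5          # USD billions, 2027
years = 2027 - 2020

cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")   # ~54%, consistent with the ~50% figure cited
```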

While it is currently in its early days, Wall Street has already warmed up to the long-term prospects of this technology. Besides several pure-play quantum computing stocks going public in 2021, well-known tech names are pouring significant research dollars into this advanced segment.

With that information, here are the seven best quantum computing stocks to buy in 2022:

52-week range: $142.25 – $191.95

Dividend yield: 1.7%

Semiconductor group Analog Devices manufactures integrated circuits that process analog and digital signals. ADI's chips are used in data converters, high-performance amplifiers and microwave-integrated circuits.

Analog Devices issued Q4 2021 metrics on Nov. 23. Revenue increased 53% year-over-year (YOY) to $2.34 billion. Adjusted earnings soared from $1.44 per share to $1.73 per share. The company generated a free cash flow of $810 million. Cash and equivalents ended the period at $1.98 billion.

Factory automation has fueled demand for sensors and machine connectivity, which increasingly rely on Analog's chips. In addition, the automotive industry has also become a key growth driver due to the rising use of advanced electronics in electric vehicles (EVs).

In late August, the chipmaker completed the acquisition of Maxim Integrated. The multibillion-dollar transaction should increase ADI's market share in automotive and 5G chipmaking.

ADI currently trades just under $160, up 7% over the past 12 months. Shares are trading at 21.5 times forward earnings and 8.9 times trailing sales. The 12-month median price forecast for Analog Devices stock stands at $210.

52-Week Range: $42.96 – $57.15

Expense Ratio: 0.40% per year

QTUM is an exchange-traded fund (ETF) that focuses on the next generation of computing. It offers exposure to names leading the way in quantum computing, machine learning and cloud computing. The fund tracks the BlueStar Quantum Computing and Machine Learning Index.

QTUM, which started trading in September 2018, has 71 holdings. The top 10 holdings account for less than 20% of net assets of $161.5 million. Put another way, fund managers are not taking major bets on any company.

Among the leading holdings on the roster are the security and aerospace company Lockheed Martin (NYSE:LMT), French telecommunications operator Orange (NYSE:ORAN) and IBM.

For most retail investors, QTUM could potentially be a safe and diversified place to start investing in quantum computing. As portfolio companies come from a wide range of technology segments, wide swings in the price of one stock will not affect the ETF significantly.

The fund has gained 8.7% over the past year and saw an all-time high in November 2021. However, the recent selloff in tech stocks led to a 10.7% decline year-to-date (YTD). Interested readers could regard this decline as a good entry point into QTUM.

52-week range: $113.17 – $146.12

Dividend Yield: 4.8%

Technology giant International Business Machines (IBM) needs little introduction. The legacy tech name offers integrated solutions and services, including infrastructure, software, information technology (IT) and hardware.

IBM announced Q4 2021 financials on Jan. 24. The company generated revenues of $16.7 billion, up 6.5% YOY. Net income stood at $2.33 billion, or $2.57 per diluted share, up from $1.36 billion, or $1.51 per diluted share, in the prior-year quarter. Cash and equivalents ended the period at $6.65 billion.

After the announcement, CEO Arvind Krishna said, "We increased revenue in the fourth quarter with hybrid cloud adoption driving growth in software and consulting."

The company launched its Quantum System One quantum computer in 2019. Around 150 research groups and partner companies currently use IBMs quantum computing services. These names come from financial services businesses, automakers and energy suppliers.

In June 2021, IBM unveiled Europe's most powerful quantum computer in Germany. Moreover, the tech giant recently announced a deal with Raytheon Technologies (NYSE:RTX) to provide quantum computing and AI services for the aerospace, defense and intelligence industries.

IBM currently changes hands around $137, up 20% over the past 12 months. Shares are trading at 13.5 times forward earnings and 2.2 times trailing sales. The 12-month median price forecast for IBM stock is $144.50. Interested readers could consider buying IBM shares around these levels.

52-week range: $7.07 – $35.90

IonQ is one of the first publicly traded pure-play quantum computing stocks. It went public via a merger with the special purpose acquisition company (SPAC) dMY Technology Group III in late 2021.

The quantum name released Q3 2021 results on Nov. 15. Its net loss was $14.8 million, or 12 cents loss per diluted share, compared to a net loss of $3.6 million a year ago. Cash and equivalents ended the quarter at $587 million. Wall Street was pleased that at the time, YTD contract bookings came in at $15.1 million.

IonQ is currently developing a network of quantum computers accessible from various cloud services. The technology uses ionized atoms that allow IonQs machines to perform complex calculations with fewer errors than any other quantum computer available.

The start-up has the financial backing of prominent investors, including Bill Gates and the Japanese telecommunications company Softbank Group (OTCMKTS:SFTBF). In addition, IonQ has been developing strategic partnerships with Microsoft, Amazon's (NASDAQ:AMZN) Amazon Web Services and Alphabet's (NASDAQ:GOOG, NASDAQ:GOOGL) Google Cloud.

While IonQ is taking steps to become a commercialization-stage name, it is still a speculative investment. With its potential for explosive growth, it could be an attractive quantum computing stock for investors looking to take a risk.

IONQ stock hovers around $12. The recent selloff in tech stocks has led to a 26.7% decline YTD. Yet, the 12-month median price forecast for IONQ stock stands at $23.

52-week range: $224.26 – $349.67

Dividend Yield: 0.8%

Microsoft is one of the largest and most prominent technology firms worldwide. It offers software products and services, including the Azure cloud service, the Office 365 productivity suite and the customer relationship management (CRM) platform Dynamics 365.

Meanwhile, Microsoft Quantum is the world's first full-stack, open cloud quantum computing ecosystem that allows developers to create quantum applications and run them on multiple platforms. The software giant provides quantum computing services via the cloud on Azure.

Management announced robust Q2 FY22 metrics on Jan. 25. Revenue increased 20% YOY to $51.7 billion. Net income surged 21% YOY to $18.8 billion, or $2.48 per diluted share, compared to $15.5 billion, or $2.03 per diluted share, in the prior-year quarter. Cash and equivalents ended the period at $20.6 billion.

On Jan. 18, Microsoft announced plans to acquire Activision Blizzard (NASDAQ:ATVI), a leading player in game and interactive entertainment development. It will be an all-cash transaction valued at $68.7 billion. Wall Street expects this deal to provide tailwinds for Microsoft's gaming business and building blocks for the metaverse.

MSFT stock currently trades just under $310, up 27% over the past 12 months. Shares support a valuation of 32.6 times forward earnings and 12.3 times trailing sales. And the 12-month median price forecast for Microsoft stock stands at $370.

52-week range: $115.67 – $346.47

Dividend Yield: 0.06%

Santa Clara, California-based Nvidia has become an important name in advanced semiconductor design and software for next-generation computing development. InvestorPlace readers likely know the chipmaker is a market leader in the gaming and data center markets.

Nvidia announced impressive Q3 FY 2022 numbers on Nov. 17. Revenue soared 50% YOY to a record $7.1 billion, fueled by record sales in the gaming and data center businesses. Net income increased 62% YOY to $2.97 billion, or $1.17 per diluted share. Cash and equivalents ended the period at $1.29 billion.

The chipmaker provides the necessary processing power that drives the development of quantum computing. Additionally, Nvidia recently released cuQuantum, a software development kit designed for building quantum computing workflows. It has partnered with Google, IBM and other quantum computing players that rely on cuQuantum to accelerate their quantum computing work.

Given its growing addressable market in cloud computing, gaming, AI, and more recently the metaverse, NVDA stock deserves your attention. Shares are changing hands around $245, up nearly 80% over the past year. However, despite an 18% decline YTD, shares are trading at 46.5 times forward earnings and 25 times trailing sales.

Finally, the 12-month median price forecast for Nvidia stock is $350. As the company gets ready to report earnings soon, investors should expect increased choppiness in price.

52-week range: $9.62 – $12.75

Our final stock is Supernova Partners Acquisition II, a SPAC. It is merging with Rigetti Computing, a start-up focused on quantum computer development. As a result of the merger, Rigetti Computing was valued at about $1.5 billion and received $458 million in gross cash proceeds.

Rigetti designs quantum chips and then integrates those chips with a controlling architecture. It also develops software used to build algorithms for these chips.

Rigetti recently announced business highlights for the nine months ended Oct. 31, 2021. Revenue came in at $6.9 million. Net operating loss declined 3% YOY to $26.2 million.

We believe the time for quantum computing has arrived, said founder and CEO Chad Rigetti. Customer demand is increasing as Rigetti quantum computers begin to address high-impact computational problems.

The start-up launched the world's first scalable multi-chip quantum processor in June 2021. This processor boasts a proprietary modular architecture. Now Wall Street expects the company to move toward commercialization.

Rigetti collaborates with government entities and technology companies to advance its quantum processors. For instance, it boasts strategic partnerships with the National Aeronautics and Space Administration (NASA) and the U.S. Department of Energy. It also works with data analytics firm Palantir Technologies (NYSE:PLTR) and electronics manufacturer Keysight Technologies (NYSE:KEYS).

SNII stock is currently shy of $10, down about 4% YTD. As investors' interest in quantum computing names grows, shares are likely to become hot.

On the date of publication, Tezcan Gecgil holds both long and short positions in NVDA stock. The opinions expressed in this article are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.

Tezcan Gecgil has worked in investment management for over two decades in the U.S. and U.K. In addition to formal higher education in the field, she has also completed all 3 levels of the Chartered Market Technician (CMT) examination. Her passion is for options trading based on technical analysis of fundamentally strong companies. She especially enjoys setting up weekly covered calls for income generation.

More here:
7 Best Quantum Computing Stocks to Buy in 2022 | InvestorPlace

Strategic Partnership Agreement to Develop the Quantum Computing Market in Japan and Asia-Pacific – PR Newswire

TOKYO, CAMBRIDGE, England and BROOMFIELD, Colo., Oct. 18, 2022 /PRNewswire/ -- Mitsui & Co., Ltd ("Mitsui") and Quantinuum have signed a strategic partnership agreement to collaborate in the delivery of quantum computing in Japan and the Asia-Pacific region.

Mitsui, which is committed to digital transformation, and Quantinuum, one of the world's leading quantum computing companies, integrated across hardware and software, have entered this strategic partnership to develop quantum computing use cases, which are expected to drive significant business transformation and innovation in the future.

Mitsui and Quantinuum will accelerate collaboration, cooperation, and development of new business models. They will jointly pursue quantum application development and provide value-added services to organizations working across a variety of quantum computing domains, which is expected to be worth US$450B – US$850B worldwide by 2040.*

Yoshio Kometani, Representative Director, Executive Vice President and Chief Digital Information Officer of Mitsui & Co., Ltd., stated: "We are very pleased with the strategic partnership between Mitsui and Quantinuum. By combining Quantinuum's cutting-edge quantum computing expertise and diverse quantum talents with Mitsui's broad business platform and network, we will work together to provide new value to our customers and create new business value in a wide range of industrial fields."

Ilyas Khan, Founder and CEO of Quantinuum, stated: "The alliance between Mitsui and Quantinuum demonstrates our shared commitment to accelerating quantum computing across all applications and use cases in a diverse range of sectors, including chemistry, finance, and cybersecurity. Today's announcement reinforces our belief in the global quantum leadership shown by corporations and governments in Japan, pioneered by corporate leaders like Mitsui."

Details of the Strategic Partnership

Collaboration areas and applications

Recent Achievements by Quantinuum

About Mitsui & Co., Ltd.

Location: 1-2-1 Otemachi, Chiyoda-ku, Tokyo

Established: 1947

Representative: Kenichi Hori, President and Representative Director

Mitsui & Co., Ltd. (8031: JP) is a global trading and investment company with a diversified business portfolio that spans approximately 63 countries in Asia, Europe, North, Central & South America, The Middle East, Africa and Oceania.

Mitsui has about 5,500 employees and deploys talent around the globe to identify, develop, and grow businesses in collaboration with a global network of trusted partners. Mitsui has built a strong and diverse core business portfolio covering the Mineral and Metal Resources, Energy, Machinery and Infrastructure, and Chemicals industries.

Leveraging its strengths, Mitsui has further diversified beyond its core profit pillars to create multifaceted value in new areas, including innovative Energy Solutions, Healthcare & Nutrition and through a strategic focus on high-growth Asian markets. This strategy aims to derive growth opportunities by harnessing some of the world's main mega-trends: sustainability, health & wellness, digitalization and the growing power of the consumer.

Mitsui has a long heritage in Asia, where it has established a diverse and strategic portfolio of businesses and partners that gives it a strong differentiating edge, provides exceptional access for all global partners to the world's fastest growing region and strengthens its international portfolio.

For more information on Mitsui & Co's businesses visit, https://www.mitsui.com/jp/en/index.html

About Quantinuum

Location: Cambridge, U.K., Broomfield, Colorado, U.S.A.

Established: December 2021 (through the merger of Honeywell Quantum Solutions (U.S.) and Cambridge Quantum Computing (U.K.))

Representative: Ilyas Khan, CEO; Tony Uttley, COO; Shuya Kekke, CEO & Representative Director, Japan

Quantinuum is one of the world's largest integrated quantum computing companies, formed by the combination of Honeywell Quantum Solutions' world-leading hardware and Cambridge Quantum's class-leading middleware and applications. Science-led and enterprise-driven, Quantinuum accelerates quantum computing and the development of applications across chemistry, cybersecurity, finance, and optimization. Its focus is to create scalable and commercial quantum solutions to solve the world's most pressing problems in fields such as energy, logistics, climate change, and health. The company employs over 480 individuals, including 350 scientists, at nine sites across the United States, Europe, and Japan.

Selected major customers (in Japan): Nippon Steel Corporation, JSR Corporation

http://www.quantinuum.com


SOURCE Quantinuum LLC

See the article here:
Strategic Partnership Agreement to Develop the Quantum Computing Market in Japan and Asia-Pacific - PR Newswire

Quantum Leap: "The big bang of quantum computing will come in this decade" – CTech

In the few images that IBM has released, its quantum computing lab looks like the engine room of a spaceship: bright white rooms with countless cables dangling from the ceiling down to a floating floor, pierced with vents. This technological tangle is just the background for the main show: rows of metal supports on which hang what look like... white solar boilers.

There, within these boilers, a historical revolution is taking shape. IBM, a computing dinosaur more than a century old, is trying to reinvent itself by winning one of the most grueling, expensive and potentially promising scientific races ever: the race to develop the quantum computer. "We are living in the most exciting era in the history of computing," says Dario Gil, Senior Vice President of IBM and head of the company's research division, in an exclusive interview with Calcalist. "We are witnessing a moment similar to the one recorded in the 40s & 50s of the last century, when the first classic computers were built." A few weeks after this conversation, his statements were further confirmed, when the Nobel Prize Committee announced the awarding of the prize in the field of physics to three researchers whose research served as a milestone in the development of the field.

The name Dario Gil shakes a lot of quanta and cells in the brains, and maybe even in the hearts, of physicists and computer engineers all over the world. This is the person who leads the most advanced effort in the world to develop a quantum computer. In September, when Gil landed in Tel Aviv for a short visit to give the opening lecture at the IBM conference, the hall was packed with senior engineers, researchers from the top universities in Israel, and representatives of government bodies - all enthralled by what Gil had to say.


Dario Gil.

(Photo: Elad Gershgoren)

Gil (46) was born in Spain and moved to the United States to study at MIT. He completed his doctoral studies there, and immediately after graduation began working at IBM in a series of research and development positions. Since 2019, he has been leading the company's research division, which has 3,000 engineers at 21 sites, including Israel. Under his management, in 2016, IBM built the first quantum computer whose services are available to anyone: if you have a complicated question, you can go to the IBM Quantum Experience website, remotely access one of the quantum computers through the cloud - and, perhaps, receive an answer. But as with everything related to quantum computing, it only sounds simple.

"Quantum computing is not just a name for an extremely fast computer," says Gill. In fact, he explains, the quantum computer is no longer a supercomputer that uses the same binary method that is accepted in every classical computer, but a completely new machine, another step in the evolution leading from strings of shells, through beaded invoices and calculating bars, to gear-based mechanical computers, to the electronic computer and now to the quantum computer. "Essentially, the quantum computer is a kind of simulator of nature, through which it is possible to simulate natural processes, and thus solve problems that previously had no solution," explains Gil. "If the classical computer is a combination of mathematics and information, then quantum computing is a combination of physics and information."

This connection makes it possible to solve certain types of problems with unprecedented speed: Google, which is also developing a quantum computer, claimed in 2019 that it had reached "quantum supremacy," a demonstration of a calculation that a quantum computer performs more efficiently than a classical computer. The researchers at Google showed how a quantum computer performed in 200 seconds a calculation that they claim would have required a classical computer ten thousand years to complete. This claim has since been disproved by other researchers, who have presented an algorithm that allows a classical computer to perform the same calculation in a reasonable amount of time, but even this Google failure provides an idea of the enormous power a quantum computer will have.
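
To give a sense of scale for the original claim, the speedup factor Google described works out as follows (a back-of-the-envelope sketch that simply takes the 200 seconds versus 10,000 years at face value):

```python
# Rough speedup implied by Google's original "quantum supremacy" claim:
# 200 seconds on the quantum processor vs. an estimated 10,000 years classically.
seconds_per_year = 365.25 * 24 * 3600
classical_seconds = 10_000 * seconds_per_year
quantum_seconds = 200

print(f"Claimed speedup: about {classical_seconds / quantum_seconds:.1e}x")  # ~1.6e9
```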

"The quantum computer does not make the classical computer superfluous: they will live together, and each of them will solve different problems," explains Gil. "It's like asking you how to get from point A to point B: you can walk, ride a bicycle, travel by car or fly. If the distance between these points is 50 km, you won't fly between them, right? Accordingly, it is a mode suitable for a classic computer. A quantum computer allows you to fly, even to the moon, and quickly."

You will soon explain to me how it works, and in which areas exactly, but before that, let's start from the bottom line: what can we do with it?

"Quantum computing will make it possible to crack a series of problems that seemed unsolvable, in a way that will change the world. Many of these issues are related to energy. Others are related to the development of new and exciting materials. We tend to take the materials available to us for granted, but in the past there were eras that were defined by the materials that dominated them - The Stone Age', the 'Bronze Age', the 'Iron Age'. Quantum computing will help us develop materials with new properties, therefore the first sector that is already using it is industry, especially the car industry: the car manufacturers are interested in better chemistry, which will enable the production of more efficient and durable batteries for electric vehicles. For a normal computer this is a huge task, and to complete it we have to give up accuracy and settle for approximate answers only, but quantum computing can help quickly develop materials that will fit the task, even without entering the lab. The efficiency of a quantum computer when it comes to questions in chemistry is also used in the pharmaceutical industry, There they are beginning to make initial use of such computers to examine the properties of molecules, and in this way to speed up the development of new drugs; and also in the fertilizer industry, which will be able to develop substances whose production will not harm the environment.

The uses are not limited to the material world. "For the financial sector, for example, the quantum computer enables the analysis of scenarios, risk management and forecasting, and the industry is already very interested in such possible applications, which could provide the general public with dramatically improved performance in investment portfolios, for example."


IBM.

(Photo: Shutterstock)

At the same time, there are industries that quantum computing will force to recalculate their course, and the information security industry is at the forefront. The modern encryption systems (mainly RSA, one of whose developers is the Israeli Prof. Adi Shamir) are asymmetric: each recipient publishes a code that allows the information sent to them to be encrypted ("public key"), which includes the product of two large prime numbers that are kept secret. To decipher the encrypted information, this product must be broken down into factors - but without knowing what the initial numbers are, "this task would require a normal computer to calculate for many years," explains Gil. "However, for the quantum computer, such a calculation can be a matter of seconds."
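
Gil's point about the asymmetry of RSA can be illustrated with a toy example: multiplying two secret primes is trivial, while recovering them from the published product is the expensive direction. The sketch below uses deliberately tiny primes; real RSA moduli are hundreds of digits long, which is what pushes classical factoring into the "many years" territory he describes.

```python
# Toy illustration of the RSA asymmetry: easy to multiply, hard to factor.
# Tiny primes are used here; real keys use primes with hundreds of digits.
p, q = 104723, 104729          # two "secret" primes (for illustration only)
n = p * q                      # the public modulus anyone is allowed to see

def factor_by_trial_division(n):
    """Brute-force factoring; the cost grows rapidly with the size of n."""
    if n % 2 == 0:
        return 2, n // 2
    d = 3
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 2
    return n, 1

print("public modulus:", n)
print("recovered factors:", factor_by_trial_division(n))
```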

There is a real threat here to an entire industry, the logic behind which has been built since the 1970s, and now suddenly the ground is cracking under it.

"True, a normal computer needs ten thousand years to solve an encryption that a quantum computer would solve in an instant. That is why the quantum computer threatens the world of cyberspace and encryption, which are the basis of all global information security. This is an example that is not related to physics or nature, but simply to the stronger and faster computing power of the quantum computer.

The computer that works against all the rules of intuition

To understand the power of the quantum computer, this concept, "quantum computing", must first be broken down. The first step is to stop thinking in the familiar concepts of one and zero. Forget about bits and binaries. The key to understanding quantum computing is the recognition that this dichotomy is not there: instead of the bit, quantum computing relies on a basic unit of information called a qubit (short for "quantum bit"). The qubit is simultaneously one, zero and everything in between.

This is the moment to stop and explain the theory that underlies the quantum computer, and which seems to go against common sense. "Quantum theory makes it possible to explain the behavior of very, very small particles," Gil explains. "At school we are presented with a model of an atom that looks like a planet, with a nucleus and electrons moving around, but at the beginning of the 20th century, this model turned out to be not very accurate." This happened when physicists such as Max Planck and Albert Einstein realized that light, which until then physics saw as a wave, also behaves as a particle - and the energy of this particle can only be described in "quantum" jumps, that is, as discrete packets. In the decades that followed, this theory was developed more and more, and proved to be effective in describing a variety of phenomena in the world of particles. And yet, its deep meanings remain obscure even today.

Such is, for example, the idea that a particle is in more than one place. According to quantum theory, a particle moving between two points moves simultaneously in all the paths between them, a state called "superposition". It's not that we don't know its exact location: it just doesn't have one. Instead, it has a distribution of possible locations that coexist. In other words, reality is not certain, but probabilistic.

And this is not the only puzzle posed by quantum theory. Another confusing concept is "entanglement", a situation in which several particles exhibit identical physical values, and respond simultaneously to a change in one of them, even if they are at a great distance from each other. Gil suggests thinking of it as tossing two coins: anyone who has studied statistics knows that the probabilities of getting a "head" or a "tail" on each of them are independent. But in the quantum model, if the coins (representing particles here) are entangled, then tossing one of them will produce the same result in the other. "Einstein didn't believe in entanglement, and hated these patterns," Gil says with a smile.
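
The "entangled coins" picture can be made concrete with a few lines of NumPy that sample measurement outcomes from a two-qubit Bell state. This is only an illustration of the statistics involved, not a simulation of any particular hardware.

```python
import numpy as np

# Bell state (|00> + |11>) / sqrt(2): the two "coins" are entangled.
state = np.array([1, 0, 0, 1]) / np.sqrt(2)   # amplitudes for |00>, |01>, |10>, |11>
probabilities = np.abs(state) ** 2

rng = np.random.default_rng(0)
outcomes = rng.choice(["00", "01", "10", "11"], size=10_000, p=probabilities)

# Only "00" and "11" ever appear: measuring one qubit fixes the other.
print({label: int((outcomes == label).sum()) for label in ["00", "01", "10", "11"]})
```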

Measurements that affect the results? A reality that is not absolute but statistical? Particles that become twins even at infinite distance? If these ideas sound puzzling, incomprehensible or counter-intuitive to you, you are not alone: "Whoever comes across quantum theory and is not left stunned, has not understood it," said the physicist Niels Bohr, Einstein's contemporary and his great nemesis, who won the Nobel Prize for his contribution to the development of the theory (Einstein, by the way, had reservations about Bohr's interpretation of the theory's conclusions). Another physicist who won the Nobel Prize for his contribution to the theory, Richard Feynman, commented on this when he said: "If you think you have understood quantum theory, you have not."

The same Feynman is the father of quantum computing: he wanted to simulate the behavior of particles, but due to the probabilistic nature of the theory, a classical computer attempting such a simulation would require an enormous amount of calculations, making the simulation impractical. "Feynman, and like him other physicists, thought that the field of computing had focused on mathematical horizons and moved too far away from nature, and that physics could be more connected to the world of information," explains Gil. "In a historic lecture he gave in 1981, Feynman argued that there was no point in giving a classical computer the task of simulating particles, because nature is not classical. He said, 'If we want to simulate nature, we need a machine that behaves like nature, in a quantum way.'" In 1998, this vision was realized, when the first quantum computer was built at the University of Oxford in Great Britain.

A quantum computer utilizes the enigmatic properties of quantum theory, those that are not fully understood by us, to perform calculation operations. In a normal computer, the basic unit of information is a "bit", which can have one of two values, 0 or 1; Using such bits makes it possible to perform any calculation imaginable - although some of these calculations may take a very long time. In a quantum computer, the qubit, thanks to superposition, represents not one absolute value, but a distribution of values. "You can think of it as a question of more dimensions: one and zero are just the ends, the poles of a coin for example, but it can also have a sideways tilt," explains Gil. Using statistical approaches it is possible to examine the state of the qubit and obtain useful results. This probabilistic approach is not suitable for every problem, but in solving certain problems it is infinitely more efficient than the classical computer's search for an absolute answer.

"Because of the entanglement effect, it is also possible to cause the qubits to influence each other," says Gil. And since each qubit represents an entire field of possibilities, each addition of a qubit increases the number of possible connections between the qubits with exponentially increasing power (in the classical computer, on the other hand, the addition of bits grows linearly). At the moment, IBM holds the record for qubits: last year it unveiled a quantum processor with 127 qubits, and its stated goal is to launch a processor with 433 qubits this year, and a processor with 1,021 qubits next year.

Three degrees colder than outer space

This ambition is more pretentious than it seems. It turns out that "building a machine that will behave like nature" is a complex story like no other: the qubits are very sensitive to outside influences, which makes building a computer a very complicated and expensive business. "The quantum computer is very powerful, but at the same time also very delicate," explains Gil. "It utilizes physical processes that occur in the world, but such processes are a system in which everything is connected, everything affects everything, and this can disrupt the results: if energy from the outside world gets inside and connects to the qubits, it will make them behave like normal bits, and thus the unique ability of quantum computation will be lost. Therefore, a quantum computer must be very isolated from the entire environment. The big challenge is to produce a system that is sufficiently isolated from the outside world, but not too isolated."

When I try to find out what the cost of building a quantum computer is - and IBM has already built 40 of them - Gil avoids a clear answer, but it is enough to hear what this effort entails: "There are several different approaches to building a quantum computer; IBM chose a cryogenic approach, meaning deep freezing, and the use of superconductors. The temperature in the computer is close to absolute zero: at the bottom of its case the temperature is minus 273 degrees Celsius, three degrees colder than outer space, and less than one degree above absolute zero. The temperature should be close to absolute zero, but not reach it, because then there is no movement at all, not even of the atoms."

The result is a cooling and protection case that resembles a water heater in its shape; inside it sits the calculation unit, whose shape earned it the nickname "chandelier" from Gil and his team. "Inside the layers of protection there is a cylinder with the processor in it. Even if only a fraction of an energy particle enters the computer, literally a fraction of nothing, it will be enough to disrupt the results," Gil clarifies.

The great sensitivity, and the protection requirements derived from it, mean that the quantum computer is quite cumbersome: in the newest models, which try to include more and more qubits, the case already reaches a height of several meters. To some extent it is reminiscent of the first generations of classic computers, which looked like huge cabinets. Those classic computers kept getting smaller and smaller, until today we squeeze millions of times more computing power into a simple smartphone, but in the case of quantum computers, we cannot expect a similar process: "The quantum computer requires unique conditions that cannot be produced in a simple terminal device, and this will not change in the foreseeable future," Gil explains. "I believe that quantum computing will be a service that we can access remotely, as we access cloud services today. It will work similar to what IBM already enables today: the computer sits with us, and we make it possible to access the 'brain' and receive answers. Of the 40 computers we have built since 2016, today 20 are available to the public. About half a million users all over the world have already made use of the capabilities of the quantum computer we built, and based on this use, about a thousand scientific publications have already been published."

Google and Microsoft are heating up the competition

IBM is not the only company participating in the quantum computing race, but Gil exudes full confidence in its ability to lead it: according to him, most competitors only have parts of the overall system, but not a complete computer available to solve problems. Google, as mentioned, is a strong contender in this race, and it also allows remote access to its quantum computing service, Google Quantum AI; Microsoft is also working to provide a similar service on its cloud platform, Azure.

Meanwhile, quantum computing is a promise "on paper". The theoretical foundations for this revolution were laid already 40 years ago, the first proofs were presented more than 20 years ago, the industry has been buzzing around this field for several years - and we still haven't seen uses that would serve a regular person.

"If you go back to the 1940s, when the first computers were invented, you will see that even then the uses and advantages of the new invention were not clear. Those who saw the first computers said, 'Oh, great, you can use it to crack the code of encryption machines in wars, maybe even calculate routes of ballistic missiles, and that's it. Who's going to use it? Nobody,'" Gil laughs. "In the same way, the success of quantum computing will depend on its uses: how easy it will be to program, how large the community of users will be, what talents will get there. The quantum revolution will be led by a community, which is why education for this field is so important: we need more and more smart people to start to think 'how can I use quantum computing to advance my field'.

"What is beginning these days is the democratization phase of quantum computing, which will allow anyone to communicate with the computer without being an advanced programmer in the field: it will be possible to approach it with a question or a task that will be written in the classical languages of one or zero. That is why we are already seeing more use of quantum computing capacity today.

"There are also many startups that do not actually work to establish a quantum computer, but focus on various components of this world (for example, the Israeli company Quantum Machines, which develops hardware and software systems for quantum computers, and last July was selected by the Innovation Authority to establish the Israeli Quantum Computing Center). The activity of such companies creates a completely new ecosystem, thus promoting the industry and accelerating its development, just as is happening today in the field of ordinary computers. IBM will not rely only on itself either: we would like to benefit from the innovation of smart people in this field, of course also in Israel.

"I am convinced that the big bang of quantum computing will happen in this decade. Our ambition at IBM is to demonstrate 'quantum supremacy' already in the next three years. I believe that the combination of advances in artificial intelligence, together with quantum computing, will bring about a revolution in the industry of the kind that Nvidia made in its market (Nvidia developed unique processors for gaming computers, which made it the chip company that reached a billion dollar revenue the fastest.) Quantum computing can generate enormous value in the industry. It is phenomenally difficult, but it is clear to me that we will see the uses already in the current decade."

The Nobel Prize opens a new horizon for quantum computing

Quantum computing has ignited the imagination of researchers for many decades, but until now it has not left the confines of laboratories. However, the awarding of the Nobel Prize to three researchers in the field indicates that the vision is becoming a real revolution. Alain Aspect of France, the American John Clauser and Austrian Anton Zeilinger received the award for research they conducted (separately) since the 1970s, in which they examined the phenomenon of quantum entanglement (described in the article), proved its existence and laid tracks for its technological use.

The awarding of the Nobel Prize to the entanglement researchers proves that quantum computing is more than a mental exercise for a sect of physicists, and is a defining moment for companies that invest capital in the development of the field. They are pushed to this effort due to a fundamental change in the world in which they operate: in recent decades, the world of computing has operated according to "Moore's Law", which foresees that the density of transistors in computer processors will double every two years in a way that will increase the computing power of these chips. However, as the industry approaches the physical limit after which it will be impossible to cram more transistors onto a chip, the need to develop a quantum computer has become acute.

The numbers also signal that something is happening in the field. In 2020, the scope of the quantum computing market was less than half a billion dollars, but at the end of 2021, in a signal that the vision is beginning to be realized, the research company IDC published an estimate according to which in 2027 the scope of the market will reach $8.6 billion and investments in the field will amount to $16 billion (compared to $700 million in 2020 and $1.4 billion in 2021). IBM CEO Arvind Krishna also recently estimated that in 2027 quantum computing will become a real commercial industry.

Link:
Quantum Leap: "The big bang of quantum computing will come in this decade" - CTech

VW teams with Canadian quantum computing company Xanadu on batteries – Automotive News Canada

Quantum computing, Ardey added in a release, might trigger a revolution in material science that will feed into the company's in-house battery expertise.

Leaving the bits and bytes of classical computing behind, quantum computers rely on qubits, and are widely seen as having potential to solve complex problems that traditional computers could not work through on reasonable timelines.

The automaker and Toronto-based technology firm have already been collaborating on research into material science, computational chemistry, and quantum algorithms for about a year. That early work set the foundation for the formal partnership, Volkswagen said.

The goal of the research is to develop quantum algorithms that can simulate, more quickly than traditional computer models, how a blend of battery materials will interact. Computational chemistry, which is traditionally used for such work, Ardey said, is reaching its limits when it comes to battery research.

Juan Miguel Arrazola, head of algorithms at Xanadu, said the partnership is part of the Canadian company's drive to make quantum computers truly useful.

Focusing on batteries is a strategic choice given the demand from industry and the prospects for quantum computing to aid in understanding the complex chemistry inside a battery cell.

Using the quantum algorithms, Volkswagen said it aims to develop battery materials that are safer, lighter and cheaper.

Here is the original post:
VW teams with Canadian quantum computing company Xanadu on batteries - Automotive News Canada

Cleveland Clinic and IBM Begin Installation of IBM Quantum System One – Cleveland Clinic Newsroom

Cleveland Clinic and IBM have begun deployment of the first private-sector, onsite, IBM-managed quantum computer in the United States. The IBM Quantum System One is to be located on Cleveland Clinic's main campus in Cleveland.

The first quantum computer in healthcare, anticipated to be completed in early 2023, is a key part of the two organizations' 10-year partnership aimed at fundamentally advancing the pace of biomedical research through high-performance computing. Announced in 2021, the Cleveland Clinic-IBM Discovery Accelerator is a joint center that leverages Cleveland Clinic's medical expertise with the technology expertise of IBM, including its leadership in quantum computing.

"The current pace of scientific discovery is unacceptably slow, while our research needs are growing exponentially," said Lara Jehi, M.D., Cleveland Clinic's Chief Research Information Officer. "We cannot afford to continue to spend a decade or more going from a research idea in a lab to therapies on the market. Quantum offers a future to transform this pace, particularly in drug discovery and machine learning."

"A step change in the way we solve scientific problems is on the horizon," said Ruoyi Zhou, Ph.D., Director of the IBM Research-Cleveland Clinic Partnership. "At IBM, we're more motivated than ever to create with Cleveland Clinic and others lasting communities of discovery and harness the power of quantum computing, AI and hybrid cloud to usher in a new era of accelerated discovery in healthcare and life sciences."

The Discovery Accelerator at Cleveland Clinic draws upon a variety of IBM's latest advancements in high-performance computing, including:

Lara Jehi, M.D., and Ruoyi Zhou, Ph.D., at the site of the IBM Quantum System One on Cleveland Clinic's main campus. (Courtesy: Cleveland Clinic/IBM)

The Discovery Accelerator also serves as the technology foundation for Cleveland Clinic's Global Center for Pathogen Research & Human Health, part of the Cleveland Innovation District. The center, supported by a $500 million investment from the State of Ohio, Jobs Ohio and Cleveland Clinic, brings together a team focused on studying, preparing and protecting against emerging pathogens and virus-related diseases. Through the Discovery Accelerator, researchers are leveraging advanced computational technology to expedite critical research into treatments and vaccines.

Together, the teams have already begun several collaborative projects that benefit from the new computational power. The Discovery Accelerator projects include a research study developing a quantum computing method to screen and optimize drugs targeted to specific proteins; improving a prediction model for cardiovascular risk following non-cardiac surgery; and using artificial intelligence to search genome sequencing findings and large drug-target databases to find effective, existing drugs that could help patients with Alzheimer's and other diseases.

A significant part of the collaboration is a focus on educating the workforce of the future and creating jobs to grow the economy. An innovative educational curriculum has been designed for participants from high school to professional level, offering training and certification programs in data science, machine learning and quantum computing to build the skilled workforce needed for cutting-edge computational research of the future.

Read more here:
Cleveland Clinic and IBM Begin Installation of IBM Quantum System One - Cleveland Clinic Newsroom

The world, and today's employees, need quantum computing more than ever – VentureBeat


Quantum computing can soon address many of the worlds toughest, most urgent problems.

That's why the semiconductor legislation Congress just passed is part of a $280 billion package that will, among other things, direct federal research dollars toward quantum computing.

Quantum computing will soon be able to:

The economy and the environment are clearly two top federal government agenda items. Congress in July was poised to pass the most ambitious climate bill in U.S. history. The New York Times said that the bill would pump hundreds of billions of dollars into low-carbon energy technologies like wind turbines, solar panels and electric vehicles and would put the United States on track to slash its greenhouse gas emissions to roughly 40% below 2005 levels by 2030. This could help to further advance and accelerate the adoption of quantum computing.


Because quantum technology can solve many previously unsolvable problems, a long list of the world's leading businesses, including BMW and Volkswagen, FedEx, Mastercard and Wells Fargo, and Merck and Roche, are making significant quantum investments. These businesses understand that transformation via quantum computing, which is quickly advancing with breakthrough technologies, is coming soon. They want to be ready when that happens.

It's wise for businesses to invest in quantum computing because the risk is low and the payoff is going to be huge. As BCG notes: "No one can afford to sit on the sidelines as this transformative technology accelerates toward several critical milestones."

The reality is that quantum computing is coming, and it's likely not going to be a standalone technology. It will be tied to the rest of the IT infrastructure: supercomputers, CPUs and GPUs.

This is why companies like Hewlett Packard Enterprise are thinking about how to integrate quantum computing into the fabric of the IT infrastructure. It's also why Terra Quantum AG is building hybrid data centers that combine the power of quantum and classical computing.

Amid these changes, employees should start now to get prepared. There is going to be a tidal wave of need for both quantum Ph.D.s and for other talent such as skilled quantum software developers to contribute to quantum efforts.

Earning a doctorate in a field relevant to quantum computing requires a multi-year commitment. But obtaining valuable quantum computing skills doesn't require a developer to go back to college, take out a student loan or spend years studying.

With modern tools that abstract the complexity of quantum software and circuit creation, developers no longer require Ph.D.-level knowledge to contribute to the quantum revolution, enabling a more diverse workforce to help businesses achieve quantum advantage. Just look at the winners in the coding competition that my company staged. Some of these winners were recent high school graduates, and they delivered highly innovative solutions.

Leading the software stack, quantum algorithm design platforms allow developers to design sophisticated quantum circuits that could not be created otherwise. Rather than defining tedious low-level gate connections, this approach uses high-level functional models and automatically searches millions of circuit configurations to find an implementation that fits resource considerations, designer-supplied constraints and the target hardware platform. New tools like Nvidia's QODA also empower developers by making quantum programming similar to how classical programming is done.
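
For readers unfamiliar with what "low-level gate connections" look like, here is a minimal gate-level circuit written with the open-source Qiskit library. Qiskit is not mentioned in the article and is used here purely as a readily available illustration; higher-level design platforms of the kind described above aim to generate such circuits (and far larger ones) automatically from functional descriptions.

```python
from qiskit import QuantumCircuit

# Hand-wired two-qubit circuit: a Hadamard followed by a CNOT prepares a Bell state.
qc = QuantumCircuit(2, 2)
qc.h(0)                      # put qubit 0 into superposition
qc.cx(0, 1)                  # entangle qubit 1 with qubit 0
qc.measure([0, 1], [0, 1])   # read both qubits out into classical bits

print(qc.draw())             # ASCII diagram of the gate-by-gate construction
```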

Developers will want to familiarize themselves with quantum computing, which will be an integral arrow in their metaphorical quiver of engineering skills. People who add quantum skills to their classical programming and data center skills will position themselves to make more money and be more appealing to employers in the long term.

Many companies and countries are experimenting with and adopting quantum computing. They understand that quantum computing is evolving rapidly and is the way of the future.

Whether you are a business leader or a developer, it's important to understand that quantum computing is moving forward. The train is leaving the station: will you be on board?

Erik Garcell is technical marketing manager at Classiq.


See the original post:
The world, and today's employees, need quantum computing more than ever - VentureBeat

CEO Jack Hidary on SandboxAQ’s Ambitions and Near-term Milestones – HPCwire

Spun out from Google last March, SandboxAQ is a fascinating, well-funded start-up targeting the intersection of AI and quantum technology. "As the world enters the third quantum revolution, AI + Quantum software will address significant business and scientific challenges," is the company's broad self-described mission. Part software company, part investor, SandboxAQ foresees a blended classical computing-quantum computing landscape with AI infused throughout.

Its developing product portfolio comprises enterprise software for assessing and managing cryptography/data security in the so-called post-quantum era. NIST, of course, released its first official post-quantum algorithms in July, and SandboxAQ is one of 12 companies selected to participate in its new project, Migration to Post-Quantum Cryptography, to build and commercialize tools. SandboxAQ's AQ Analyzer product, says the company, is already available and being used by a few marquee customers.

Then there's SandboxAQ's Strategic Investment Program, announced in August, which acquires or invests in technology companies of interest. So far, it has acquired one company (Cryptosense) and invested in two others (evolutionQ and Qunnect).

Last week, HPCwire talked with SandboxAQ CEO Jack Hidary about the company's products and strategy. One has the sense that SandboxAQ's aspirations are broad, and with nine-figure funding, it has the wherewithal to pivot or expand. The A in the name stands for AI and the Q stands for quantum. One area not on the current agenda: building a quantum computer.

"We want to sit above that layer. All these [qubit] technologies (ion trap, NV or nitrogen-vacancy center, neutral atoms, superconducting, photonic) are very interesting and we encourage and mentor a lot of these companies who are quantum computing hardware companies. But we are not going to be building one because we really see our value as a layer on top of those computing [blocks]," said Hidary. Google, of course, has another group working on quantum hardware.

Hidary joined Google in 2016 as Sandbox group director. A self-described serial entrepreneur, Hidary's varied experience includes founding EarthWeb, being a trustee of the XPrize Foundation, and running for Mayor of New York City in 2013. While at Google Sandbox, he wrote a textbook, Quantum Computing: An Applied Approach.

"I was recruited in to start a new division to focus on the use of AI and ultimately also quantum in solving really hard problems in the world. We realized that we needed to be multi-platform and focus on all the clouds and to do [other] kinds of stuff, so we ended up spinning out earlier this year," said Hidary.

"Eric Schmidt joined us about three and a half years ago as he wrapped up his chairmanship at Alphabet (Google parent company). He got really into what we're doing, looking at the impact that scaled computation can have both on the AI side and the quantum side. He became chairman of SandboxAQ. I became CEO. We've other backers like Marc Benioff from Salesforce and T. Rowe Price and Guggenheim, who are very long-term investors. What you'll notice here that's interesting is we don't have short-term VCs. We have really long-term investors who are here for 10 to 15 years."

The immediate focus is on post-quantum cryptography tools delivered mostly via a SaaS model. By now we're all familiar with the threat that fault-tolerant quantum computers will be able to crack conventionally encrypted (RSA) data using Shor's algorithm. While fault-tolerant quantum computers are still many years away, the National Institute of Standards and Technology (NIST) and others, including SandboxAQ, have warned against Store Now/Decrypt Later attacks. (See the HPCwire article, The Race to Ensure Post Quantum Data Security.)

"What adversaries are doing now is siphoning off information over VPNs. They're not cracking into your network. They're just doing it over VPNs, siphoning that information. They can't read it today, because it's RSA-protected, but they'll store it and read it in a number of years when they can," he said. "The good news is you don't have to scrap your hardware. You could just upgrade the software. But that's still a monumental challenge. As you can imagine, for all the datacenters and the high-performance computing centers this is a non-trivial operation to do all that."

A big part of the problem is simply finding where encryption code is in existing infrastructure. That, in turn, has prompted calls for what is being called crypto-agility: a comprehensive yet modular approach that allows cryptography code to be easily swapped in and out.

"We want crypto-agility, and what we find is large corporations, large organizations, and large governments don't have crypto-agility. What we're hoping is to develop tools to implement this idea. For example, as a first step to crypto-agility, we're trying to see if people even have an MRI (discovery metaphor) machine for use on their own cybersecurity, and they really don't when it comes to encryption. There are no diagnostic tools that these companies are using to find where their [encryption] footprint is or if they are encrypting everything appropriately. Maybe some stuff is not even being encrypted," said Hidary, who favors the MRI metaphor for a discovery tool.
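
A toy version of the discovery step Hidary describes might look like the sketch below. It is purely hypothetical, bears no relation to how AQ Analyzer actually works, and simply scans a source tree for mentions of common (and, in some cases, quantum-vulnerable) cryptographic primitives.

```python
import os
import re

# Hypothetical, simplistic "cryptography inventory" pass over a source tree:
# flag files that mention primitives of interest so a team can begin to map
# where its encryption footprint actually is.
PATTERN = re.compile(r"\b(RSA|ECDSA|ECDH|DSA|MD5|SHA-?1|3DES)\b", re.IGNORECASE)
EXTENSIONS = (".py", ".java", ".go", ".c", ".cpp", ".cfg", ".yaml", ".yml")

def scan(root="."):
    findings = {}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            if not name.endswith(EXTENSIONS):
                continue
            path = os.path.join(dirpath, name)
            try:
                with open(path, encoding="utf-8", errors="ignore") as handle:
                    text = handle.read()
            except OSError:
                continue
            hits = sorted({match.group(0).upper() for match in PATTERN.finditer(text)})
            if hits:
                findings[path] = hits
    return findings

if __name__ == "__main__":
    for path, hits in scan().items():
        print(f"{path}: {', '.join(hits)}")
```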

No doubt, the need to modernize encryption/decryption methods and tools represents a huge problem and a huge market.

Without getting into technical details, Hidary said SandboxAQ is leveraging technology from its recent Cryptosense acquisition and internally developed technologies to develop a product portfolio planned to broadly encompass cryptography assessment, deployment and management. Its core current product is AQ Analyzer.

The idea, says Hidary, returning to the MRI metaphor, is to take an MRI scan of the inside of the organization (on-premise, cloud, private cloud, and so forth), and this feeds into compliance, vulnerabilities and post-quantum analysis. It's not just a quantum thing. It's about your general vulnerabilities on encryption. Overall, it happens to be that post-quantum is helped by this, but this is a bigger issue. Then that feeds into your general sysops, network ops, and management tools that you're using.

AQ Analyzer, he says, is enterprise software that starts the process for organizations to become crypto-agile. It's now being used at large banks and telcos, and also by Mount Sinai Hospital. Healthcare, replete with sensitive information, is another early target for SandboxAQ. Long-term, the idea is for Sandbox software tools to be able to automate much of the crypto management process, from assessment to deployment through ongoing monitoring and management.

That's the whole crypto-agility ballgame, says Hidary.

The business model, says Hidary, is a carbon copy of Salesforce.com's SaaS model. Broadly, SandboxAQ uses a three-pronged go-to-market: direct sales; global systems integrators (in May it began programs with Ernst & Young (EY) and Deloitte); and strategic partners/resellers. Vodafone and SoftBank are among the latter. Even though these are still early days for SandboxAQ as an independent entity, it's moving fast, having benefitted from years of development inside Google. AQ Analyzer, said Hidary, is in general availability.

We're doing extremely well in banks and financial institutions. They're typically early adopters of cybersecurity because of the regulatory and compliance environment, and the trust they have with their customers, said Hidary.

Looking at near-term milestones, he said, We'd like to see a more global footprint of banks. We'll be back in Europe soon now that we have Cryptosense (UK and Paris-based), and we have a strong local team in Europe. We've had a lot of traction in the U.S. and the Canadian markets. So that's one key milestone over the next 18 months or so. Second, we'd like to see [more adoption] in healthcare and telcos. We have Vodafone and SoftBank mobile on the telco side. We have Mount Sinai; we'd like to see if that can be extended into additional players in those two spaces. The fourth vertical we'll probably go into is the energy grid. These are all critical infrastructure pieces of our society: the financial structure of our society, energy, healthcare and the medical centers, the telecommunications grid.

While SandboxAQ's AQ Analyzer is the company's first offering, it's worth noting that the company is aggressively looking for niches it can serve. For example, the company is keeping close tabs on efforts to build a quantum internet.

There's going to be a parallel quantum coherent internet to connect for distributed quantum computing, said Hidary. So nothing to do with cyber at all.

Our vision of the future that we share with, I think, everyone in the industry is that quantum does not take over classical, said Hidary. It's a mesh, a hybridization of CPU, GPU and quantum processing units. And the program, the code, in Python for example: part of it runs on CPUs, part of it on GPUs, and then yes, part of it will run on a QPU. In that mesh, you'd want to have access both to the traditional Internet, TCP/IP today, but you also want to be able to connect over a quantum coherent intranet. So that's Qunnect.
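
What that mesh might look like from the programmer's side can be sketched in a few lines of Python. Everything here is a hypothetical placeholder: the routing is hard-coded, the GPU step is simulated on the CPU, and qpu_part() merely stands in for submitting a circuit to some quantum backend over a network:

```python
# A sketch of the hybrid "mesh" Hidary describes: one Python program whose
# parts target different processors. The routing and the qpu_part() stub
# are hypothetical placeholders; real code would use a vendor SDK to build
# and submit quantum circuits, and a GPU library (e.g. CuPy) for gpu_part.
import math
import random


def cpu_part(data):
    # Ordinary preprocessing stays on the CPU.
    return [math.log1p(x) for x in data]


def gpu_part(features):
    # In a real pipeline this dense numerical work would run on a GPU;
    # here it is simulated on the CPU so the sketch stays self-contained.
    return sum(f * f for f in features)


def qpu_part(parameter):
    # Placeholder for serializing a circuit and sending it, over today's
    # internet or a future quantum-coherent network, to a QPU.
    random.seed(int(parameter * 1e6) % 2**32)
    return random.random()  # stands in for a measured expectation value


if __name__ == "__main__":
    data = [1.0, 2.0, 3.0]
    features = cpu_part(data)      # CPU
    energy = gpu_part(features)    # GPU (simulated)
    result = qpu_part(energy)      # QPU (placeholder)
    print(f"hybrid pipeline result: {result:.3f}")
```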

Qunnect, of course, is one of the companies SandboxAQ has invested in, and it is working on hardware (quantum memory and repeaters) to enable a quantum internet. Like dealing with post-quantum cryptography, outfitting the quantum internet is likely to be a huge business. Looking at SandboxAQ just seven months after being spun out from Google, the scope of its ambitions is hard to pin down.

Stay tuned.

See more here:
CEO Jack Hidary on SandboxAQ's Ambitions and Near-term Milestones - HPCwire

Cancer to Be Treated as Easily as Common Cold When Humans Crack Quantum Computing – Business Wire

DUBAI, United Arab Emirates--(BUSINESS WIRE)--Breakthroughs in quantum computing will enable humans to cure diseases like cancer, Alzheimer's, and Parkinson's as easily as we treat the common cold.

That was one of the major insights to emerge from the Dubai Future Forum, with renowned theoretical physicist Dr. Michio Kaku telling the world's largest gathering of futurists that humanity should brace itself for major transformations in healthcare.

The forum concluded with a call for governments to institutionalize foresight and engrain it within decision making.

Speaking at the forum, held at the Museum of the Future in Dubai, UAE, Amy Webb, CEO of the Future Today Institute, criticized nations for being too preoccupied with the present and too focused on creating white papers, reports and policy recommendations instead of taking action.

Nowism is a virus. Corporations and governments are infected, she said.

One panel session heard how humans could be ready to test life on the Moon in just 15 years and be ready for life on Mars in another decade. Sharing his predictions for the future, Dr. Kaku also said there is a very good chance humans will pick up a signal from another intelligent life form this century.

Dr. Jamie Metzl, Founder and Chair, OneShared.World, urged people to eat more lab-grown meat to combat global warming and food insecurity.

If we are treating them like a means to an end of our nutrition, wouldn't it be better, instead of growing the animal, to grow the meat? he said.

Among the 70 speakers participating in sessions were several UAE ministers. HE Mohammad Al Gergawi, UAE Minister of Cabinet Affairs, Vice Chairman, Board of Trustees and Managing Director of the Dubai Future Foundation, said ministers around the world should think of themselves as designers of the future. Our stakeholders are 7.98 billion people around the world, he noted.

Dubai's approach to foresight was lauded by delegates, including HE Omar Sultan Al Olama, UAE Minister of State for Artificial Intelligence, Digital Economy, and Remote Work Applications, who said: What makes our city and nation successful is not natural resources, but a unique ability to embrace all ideas and individuals.

More than 30 sessions covered topics including immortality, AI sentience, climate change, terraforming, genome sequencing, legislation, and the energy transition.

*Source: AETOSWire

Originally posted here:
Cancer to Be Treated as Easily as Common Cold When Humans Crack Quantum Computing - Business Wire

New laboratory to explore the quantum mysteries of nuclear materials – EurekAlert

Replete with tunneling particles, electron wells, charmed quarks and zombie cats, quantum mechanics takes everything Sir Isaac Newton taught about physics and throws it out the window.

Every day, researchers discover new details about the laws that govern the tiniest building blocks of the universe. These details not only increase scientific understanding of quantum physics, but they also hold the potential to unlock a host of technologies, from quantum computers to lasers to next-generation solar cells.

But there's one area that remains a mystery even in this most mysterious of sciences: the quantum mechanics of nuclear fuels.

Until now, most fundamental scientific research of quantum mechanics has focused on elements such as silicon because these materials are relatively inexpensive, easy to obtain and easy to work with.

Now, Idaho National Laboratory researchers are planning to explore the frontiers of quantum mechanics with a new synthesis laboratory that can work with radioactive elements such as uranium and thorium.

An announcement about the new laboratory appears online in the journal Nature Communications.

Uranium and thorium, which are part of a larger group of elements called actinides, are used as fuels in nuclear power reactors because they can undergo nuclear fission under certain conditions.

However, the unique properties of these elements, especially the arrangement of their electrons, also mean they could exhibit interesting quantum mechanical properties.

In particular, the behavior of particles in special, extremely thin materials made from actinides could increase our understanding of phenomena such as quantum wells and quantum tunneling (see sidebar).

To study these properties, a team of researchers has built a laboratory around molecular beam epitaxy (MBE), a process that creates ultra-thin layers of materials with a high degree of purity and control.

The MBE technique itself is not new, said Krzysztof Gofryk, a scientist at INL. It's widely used. What's new is that we're applying this method to actinide materials: uranium and thorium. Right now, this capability doesn't exist anywhere else in the world that we know of.

The INL team is conducting fundamental research, science for the sake of knowledge, but the practical applications of these materials could make for some important technological breakthroughs.

At this point, we are not interested in building a new qubit [the basis of quantum computing], but we are thinking about which materials might be useful for that, Gofryk said. Some of these materials could be potentially interesting for new memory banks and spin-based transistors, for instance.

Memory banks and transistors are both important components of computers.

To understand how researchers make these very thin materials, imagine an empty ball pit at a fast-food restaurant. Blue and red balls are thrown in the pit one at a time until they make a single layer on the floor. But that layer isn't a random assortment of balls. Instead, they arrange themselves into a pattern.

During the MBE process, the empty ball pit is a vacuum chamber, and the balls are highly pure elements, such as nitrogen and uranium, that are heated until individual atoms can escape into the chamber.

The floor of our imaginary ball pit is, in reality, a charged substrate that attracts the individual atoms. On the substrate, atoms order themselves to create a wafer of very thin material, in this case uranium nitride.

Back in the ball pit, we've created a layer of blue and red balls arranged in a pattern. Now we make another layer of green and orange balls on top of the first layer.

To study the quantum properties of these materials, Gofryk and his team will join two dissimilar wafers of material into a sandwich called a heterostructure. For instance, the thin layer of uranium nitride might be joined to a thin layer of another material such as gallium arsenide, a semiconductor. At the junction between the two different materials, interesting quantum mechanical properties can be observed.

We can make sandwiches of these materials from a variety of elements, Gofryk said. We have lots of flexibility. We are trying to think about the novel structures we can create with maybe some predicted quantum properties.

We want to look at electronic properties, structural properties, thermal properties and how electrons are transported through the layers, he continued. What will happen if you lower the temperature and apply a magnetic field? Will it cause electrons to behave in a certain way?

INL is one of the few places where researchers can work with uranium and thorium for this type of science. The amounts of radioactive material involved, and the consequent safety concerns, will be comparable to those associated with the radioactivity found in an everyday smoke alarm.

INL is the perfect place for this research because were interested in this kind of physics and chemistry, Gofryk said.

In the end, Gofryk hopes the laboratory will result in breakthroughs that help attract attention from potential collaborators as well as recruit new employees to the laboratory.

These actinides have such special properties, he said. We're hoping we can discover some new phenomena or new physics that hasn't been found before.

In 1900, German physicist Max Planck first described how light emitted from heated objects, such as the filament in a light bulb, behaved like particles.

Since then, numerous scientists including Albert Einstein and Niels Bohr have explored and expanded upon Plancks discovery to develop the field of physics known as quantum mechanics. In short, quantum mechanics describes the behavior of atoms and subatomic particles.

Quantum mechanics differs from classical physics, in part, because subatomic particles simultaneously have characteristics of both particles and waves, and their energy and movement occur in discrete amounts called quanta.

More than 120 years later, quantum mechanics plays a key role in numerous practical applications, especially lasers and transistors a key component of modern electronic devices. Quantum mechanics also promises to serve as the basis for the next generation of computers, known as quantum computers, which will be much more powerful at solving certain types of calculations.

Uranium, thorium and the other actinides have something in common that makes them interesting for quantum mechanics: the arrangement of their electrons.

Electrons do not orbit around the nucleus the way the earth orbits the sun. Rather, they zip around somewhat randomly. But we can define areas where there is a high probability of finding electrons. These clouds of probability are called orbitals.

For the smallest atoms, these orbitals are simple spheres surrounding the nucleus. However, as the atoms get larger and contain more electrons, orbitals begin to take on strange and complex shapes.

In very large atoms like uranium and thorium (92 and 90 electrons respectively), the outermost orbitals are a complex assortment of party balloon, jelly bean, dumbbell and hula hoop shapes. The electrons in these orbitals are high energy. While scientists can guess at their quantum properties, nobody knows for sure how they will behave in the real world.

Quantum tunneling is a key part of any number of phenomena, including nuclear fusion in stars, mutations in DNA and diodes in electronic devices.

To understand quantum tunneling, imagine a toddler rolling a ball at a mountain. In this analogy, the ball is a particle. The mountain is a barrier, most likely a semiconductor material. In classical physics, there's no chance the ball has enough energy to pass over the mountain.

But in the quantum realm, subatomic particles have properties of both particles and waves. The wave's peak represents the highest probability of finding the particle. Thanks to a quirk of quantum mechanics, while most of the wave bounces off the barrier, a small part of that wave travels through if the barrier is thin enough.

For a single particle, the small amplitude of this wave means there is a very small chance of the particle making it to the other side of the barrier.

However, when large numbers of waves are travelling at a barrier, the probability increases, and sometimes a particle makes it through. This is quantum tunneling.
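
For the simplest textbook case, a rectangular barrier, the chance of getting through can be written down explicitly. This is the standard quantum mechanics result for an idealized barrier, not a claim about any particular INL material: for a particle of mass m with energy E below the barrier height V0, crossing a barrier of width L, the transmission probability falls off exponentially,

```latex
T \;\approx\; e^{-2\kappa L},
\qquad
\kappa \;=\; \frac{\sqrt{2m\,(V_0 - E)}}{\hbar}.
```

Because the exponent grows with both the particle's mass and the barrier's width, tunneling is only appreciable for very light particles, such as electrons, crossing barriers a few nanometers thick, which is one reason the ultra-thin layers grown by MBE matter.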

Quantum wells are also important, especially for devices such as light emitting diodes (LEDs) and lasers.

As with quantum tunneling devices, building quantum wells requires alternating layers of very thin (about 10 nanometers) material, where one layer acts as a barrier.

While electrons normally travel in three dimensions, quantum wells trap electrons in two dimensions within a barrier that is, for practical purposes, impossible to overcome. These electrons exist at specific energies, say the precise energies needed to generate specific wavelengths of light.
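
Those discrete energies come out of the textbook particle-in-a-box idealization of a well. For an electron of mass m confined to a layer of width L with effectively impenetrable walls, the allowed energies are

```latex
E_n \;=\; \frac{n^2 \pi^2 \hbar^2}{2 m L^2},
\qquad n = 1, 2, 3, \dots
```

Making the well narrower pushes the levels further apart, which is how layer thickness sets the color of the light emitted when electrons drop between levels in an LED or laser.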

About Idaho National Laboratory: Battelle Energy Alliance manages INL for the U.S. Department of Energy's Office of Nuclear Energy. INL is the nation's center for nuclear energy research and development, and also performs research in each of DOE's strategic goal areas: energy, national security, science and the environment. For more information, visit www.inl.gov. Follow us on social media: Twitter, Facebook, Instagram and LinkedIn.

Here is the original post:
New laboratory to explore the quantum mysteries of nuclear materials - EurekAlert

What is Artificial Intelligence (AI)? | IBM

Artificial intelligence leverages computers and machines to mimic the problem-solving and decision-making capabilities of the human mind.

A number of definitions of artificial intelligence (AI) have surfaced over the last few decades. John McCarthy offers the following definition in this 2004 paper (PDF, 106 KB) (link resides outside IBM): "It is the science and engineering of making intelligent machines, especially intelligent computer programs. It is related to the similar task of using computers to understand human intelligence, but AI does not have to confine itself to methods that are biologically observable."

However, decades before this definition, the artificial intelligence conversation began with Alan Turing's 1950 work "Computing Machinery and Intelligence" (PDF, 89.8 KB) (link resides outside of IBM). In this paper, Turing, often referred to as the "father of computer science", asks the following question: "Can machines think?" From there, he offers a test, now famously known as the "Turing Test", where a human interrogator would try to distinguish between a computer-generated and a human-written text response. While this test has undergone much scrutiny since its publication, it remains an important part of the history of AI.

One of the leading AI textbooks is Artificial Intelligence: A Modern Approach (link resides outside IBM) (PDF, 20.9 MB), by Stuart Russell and Peter Norvig. In the book, they delve into four potential goals or definitions of AI, which differentiate computer systems as follows:

Human approach:
Systems that think like humans
Systems that act like humans

Ideal approach:
Systems that think rationally
Systems that act rationally

Alan Turing's definition would have fallen under the category of systems that act like humans.

In its simplest form, artificial intelligence is a field that combines computer science and robust datasets to enable problem-solving. Expert systems, an early successful application of AI, aimed to copy a human's decision-making process. In the early days, extracting and codifying that human knowledge was time-consuming.

AI today includes the sub-fields of machine learning and deep learning, which are frequently mentioned in conjunction with artificial intelligence. These disciplines comprise AI algorithms that typically make predictions or classifications based on input data. Machine learning has improved the quality of some expert systems and made it easier to create them.

Today, AI plays an often invisible role in everyday life, powering search engines, product recommendations, and speech recognition systems.

There is a lot of hype about AI development, which is to be expected of any emerging technology. As noted in Gartner's hype cycle (link resides outside IBM), product innovations like self-driving cars and personal assistants follow a typical progression of innovation, from overenthusiasm through a period of disillusionment to an eventual understanding of the innovation's relevance and role in a market or domain. As Lex Fridman notes (01:08:15) (link resides outside IBM) in his 2019 MIT lecture, we are at the peak of inflated expectations, approaching the trough of disillusionment.

As conversations continue around AI ethics, we can see the initial glimpses of the trough of disillusionment. Read more about where IBM stands on AI ethics here.

Weak AI, also called Narrow AI or Artificial Narrow Intelligence (ANI), is AI trained to perform specific tasks. Weak AI drives most of the AI that surrounds us today. Narrow might be a more accurate descriptor for this type of AI, as it is anything but weak; it enables some powerful applications, such as Apple's Siri, Amazon's Alexa, IBM Watson, and autonomous vehicles.

Strong AI is made up of Artificial General Intelligence (AGI) and Artificial Super Intelligence (ASI). Artificial General Intelligence (AGI), or general AI, is a theoretical form of AI where a machine would have an intelligence equal to humans; it would have a self-aware consciousness with the ability to solve problems, learn, and plan for the future. Artificial Super Intelligence (ASI), also known as superintelligence, would surpass the intelligence and ability of the human brain. While strong AI is still entirely theoretical, with no practical examples in use today, AI researchers are exploring its development. In the meantime, the best examples of ASI might be from science fiction, such as HAL, the rogue computer assistant in 2001: A Space Odyssey.

Since deep learning and machine learning tend to be used interchangeably, it's worth noting the nuances between the two. As mentioned above, both deep learning and machine learning are sub-fields of artificial intelligence, and deep learning is actually a sub-field of machine learning.

The way in which deep learning and machine learning differ is in how each algorithm learns. "Deep" machine learning can use labeled datasets, also known as supervised learning, to inform its algorithm, but it doesn't necessarily require a labeled dataset. Deep learning can ingest unstructured data in its raw form (e.g. text, images), and it can automatically determine the set of features that distinguish different categories of data from one another. This eliminates some of the human intervention required and enables the use of larger data sets. You can think of deep learning as "scalable machine learning", as Lex Fridman notes in the same MIT lecture cited above. Classical, or "non-deep", machine learning is more dependent on human intervention to learn. Human experts determine the set of features needed to understand the differences between data inputs, usually requiring more structured data to learn.

Deep learning (like some machine learning) uses neural networks. The "deep" in a deep learning algorithm refers to a neural network with more than three layers, including the input and output layers.
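
Since the defining feature is simply the layer count, a minimal sketch of a "deep" network, four layers including input and output, can be written in a few lines of Python. The layer widths, ReLU activation and random weights below are arbitrary illustrative choices; a real model would learn its weights from data.

```python
# A minimal numpy sketch of a "deep" network in the sense used above:
# more than three layers counting input and output.
import numpy as np

rng = np.random.default_rng(0)
layer_sizes = [8, 16, 16, 4]          # input, two hidden layers, output
weights = [rng.standard_normal((a, b)) * 0.1
           for a, b in zip(layer_sizes[:-1], layer_sizes[1:])]
biases = [np.zeros(b) for b in layer_sizes[1:]]


def forward(x):
    """Propagate one input vector through every layer."""
    for w, b in zip(weights, biases):
        x = np.maximum(0.0, x @ w + b)   # ReLU non-linearity
    return x


if __name__ == "__main__":
    sample = rng.standard_normal(8)      # one fake 8-feature input
    print(forward(sample))               # 4 output activations
```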

The rise of deep learning has been one of the most significant breakthroughs in AI in recent years, because it has reduced the manual effort involved in building AI systems. Deep learning was in part enabled by big data and cloud architectures, making it possible to access huge amounts of data and processing power for training AI solutions.

There are numerous, real-world applications of AI systems today. Below are some of the most common examples:

Computer vision: This AI technology enables computers to derive meaningful information from digital images, videos, and other visual inputs, and then take the appropriate action. Powered by convolutional neural networks, computer vision has applications in photo tagging on social media, radiology imaging in healthcare, and self-driving cars within the automotive industry.

Recommendation engines: Using past consumption behavior data, AI algorithms can help to discover data trends that can be used to develop more effective cross-selling strategies. This approach is used by online retailers to make relevant product recommendations to customers during the checkout process.
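
As a toy illustration of that idea (the matrix, item names and similarity measure below are made up for the example and are not any retailer's actual system), an item-based recommender can be built from nothing more than a purchase matrix and cosine similarity:

```python
# Illustrative item-based recommendation sketch: given a (users x items)
# matrix of past purchases, recommend the item most similar, by cosine
# similarity, to something the shopper already buys.
import numpy as np

# rows = users, columns = items A..D; 1 = previously bought
purchases = np.array([
    [1, 1, 0, 0],
    [1, 1, 1, 0],
    [0, 1, 1, 1],
    [1, 0, 0, 1],
], dtype=float)
items = ["A", "B", "C", "D"]

# cosine similarity between item columns
norms = np.linalg.norm(purchases, axis=0)
similarity = (purchases.T @ purchases) / np.outer(norms, norms)
np.fill_diagonal(similarity, 0.0)       # never recommend an item to itself


def cross_sell(basket_item: str) -> str:
    idx = items.index(basket_item)
    return items[int(np.argmax(similarity[idx]))]


if __name__ == "__main__":
    print("customers buying A are also offered:", cross_sell("A"))
```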

Automated stock trading: Designed to optimize stock portfolios, AI-driven high-frequency trading platforms make thousands or even millions of trades per day without human intervention.

Fraud detection: Banks and other financial institutions can use machine learning to spot suspicious transactions. Supervised learning can train a model using information about known fraudulent transactions. Anomaly detection can identify transactions that look atypical and deserve further investigation.
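
A minimal sketch of the anomaly-detection half of that workflow might look like the following, using scikit-learn's IsolationForest; the synthetic transactions (amount and hour of day) and the contamination rate are illustrative assumptions rather than anything a bank actually uses:

```python
# Flagging atypical transactions with an Isolation Forest.
# A production system would use many more engineered features.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)
# Mostly routine transactions: modest amounts, daytime hours.
normal = np.column_stack([rng.normal(60, 20, 500), rng.normal(14, 3, 500)])
# A few oddities: very large amounts in the middle of the night.
odd = np.array([[5000.0, 3.0], [7200.0, 2.5], [4100.0, 4.0]])
transactions = np.vstack([normal, odd])

model = IsolationForest(contamination=0.01, random_state=0).fit(transactions)
labels = model.predict(transactions)    # +1 = looks normal, -1 = flag for review

flagged = transactions[labels == -1]
print(f"{len(flagged)} transactions flagged for review")
```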

Since the advent of electronic computing, the field has passed through a series of important events and milestones, from Turing's 1950 paper to the deep learning breakthroughs of the past decade.

While Artificial General Intelligence remains a long way off, more and more businesses will adopt AI in the short term to solve specific challenges. Gartner predicts (link resides outside IBM) that 50% of enterprises will have platforms to operationalize AI by 2025 (a sharp increase from 10% in 2020).

Knowledge graphs are an emerging technology within AI. They can encapsulate associations between pieces of information and drive upsell strategies, recommendation engines, and personalized medicine. Natural language processing (NLP) applications are also expected to increase in sophistication, enabling more intuitive interactions between humans and machines.

IBM has been a leader in advancing AI-driven technologies for enterprises and has pioneered the future of machine learning systems for multiple industries. Based on decades of AI research, years of experience working with organizations of all sizes, and learnings from over 30,000 IBM Watson engagements, IBM has developed the AI Ladder, a prescriptive approach to successful artificial intelligence deployments that moves from collecting data, to organizing it, to analyzing it, to infusing AI throughout the business.

IBM Watson gives enterprises the AI tools they need to transform their business systems and workflows, while significantly improving automation and efficiency. For more information on how IBM can help you complete your AI journey, explore the IBM portfolio of managed services and solutions.

Sign up for an IBMid and create your IBM Cloud account.

Go here to read the rest:

What is Artificial Intelligence (AI)? | IBM