Cybersecurity in the quantum era – ETCIO.com

By Tirthankar Dutta

On October 23rd, 2019, Google claimed to have achieved quantum supremacy by solving a particularly difficult problem in 200 seconds using its quantum computer, known as "Sycamore." This performance was compared with the supercomputer known as "Summit," built by IBM. According to Google, this classical computer would have taken 10,000 years to solve the same problem.

The advancement of large quantum computers, along with the additional computational power they will bring, could have dire consequences for cybersecurity. It is well known that important problems such as factoring, whose presumed hardness ensures the security of many widely used protocols (RSA, DSA, ECDSA), can be solved efficiently if a sufficiently large, fault-tolerant, universal quantum computer is developed. However, addressing the imminent risk that adversaries equipped with quantum technologies pose is not the only issue in cybersecurity where quantum technologies are bound to play a role.

Because quantum computing speeds up prime factorization, computers enabled with that technology can easily break cryptographic keys by quickly calculating or exhaustively searching secret keys. A task considered computationally infeasible for a conventional computer becomes painfully easy, compromising existing cryptographic algorithms used across the board. In the future, even robust cryptographic algorithms will be substantially weakened by quantum computing, while others will no longer be secure at all.
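To put rough numbers on that asymmetry (an illustrative aside, not a claim from the original article): the best known classical factoring algorithm is sub-exponential in the size of the modulus, Shor's quantum algorithm is polynomial, and Grover's algorithm halves the effective length of symmetric keys.

```latex
% Best known classical factoring of a modulus N (general number field sieve):
T_{\mathrm{GNFS}}(N) = \exp\!\left(\left((64/9)^{1/3} + o(1)\right)(\ln N)^{1/3}(\ln \ln N)^{2/3}\right)

% Shor's quantum algorithm: polynomial in the bit length of N
T_{\mathrm{Shor}}(N) = O\!\left((\log N)^{3}\right)

% Grover's algorithm: exhaustively searching a k-bit symmetric key takes
% O(2^{k/2}) quantum evaluations instead of O(2^{k}), so a 128-bit key
% offers only about 2^{64} effective work against a quantum adversary.
```

This is why factoring-based protocols such as RSA are broken outright by a large quantum computer, while symmetric ciphers are merely weakened and can be shored up by doubling key lengths.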

There is much disagreement about the necessity of changing current cryptographic protocols and infrastructure to counter quantum technologies, but we can't deny the fact that future adversaries might use this kind of technology to their benefit, as it allows them to work on millions of computations in parallel, drastically reducing the time it takes to process a task.

According to notes from a National Academies study, "the current quantum computers have very little processing power and are too error-prone to crack today's strong codes. The future code-breaking quantum computers would need 100,000 times more processing power and an error rate 100 times better than today's best quantum computers have achieved. The study does not predict how long these advances might take, but it did not expect them to happen within a decade."

But does this mean that we should wait and watch the evolution of quantum computing, or should we go back to the drawing board to create quantum-resistant cryptography? Thankfully, researchers have been working on public-key cryptography algorithms that can counter code-breaking efforts by quantum computers. The US National Institute of Standards and Technology (NIST) is evaluating 69 potential new methods for what it calls "post-quantum cryptography." The institution expects to have a draft standard by 2024, which would then be added to web browsers and other internet applications and systems.

No matter when dominant quantum computing arrives, it poses a large security threat. Because the process of adopting new standards can take years, it is wise to begin planning for quantum-resistant cryptography now.

The author is SVP and Head of Information Security at Infoedge.

DISCLAIMER: The views expressed are solely of the author and ETCIO.com does not necessarily subscribe to it. ETCIO.com shall not be responsible for any damage caused to any person/organisation directly or indirectly.


Topological Quantum Computing Market Outlook From 2020 to 2028 With Impact of COVID-19, Technology, Top Companies, Demand Forecast, Revenue Analysis …

Total exports of information and communication technology goods (including computers, peripheral devices, communication and consumer electronic components, among other IT goods) registered a growth rate of 11.5% in 2017, as against 10.5% in 2012.

CRIFAX added a report on the Global Topological Quantum Computing Market, 2020-2028, to its database of market research collaterals, consisting of the overall market scenario with prevalent and future growth prospects, among other growth strategies used by key players to stay ahead of the game. Additionally, recent trends, mergers and acquisitions, and region-wise growth analysis, along with the challenges affecting the growth of the market, are also stated in the report.

Get a sample of this strategic report, which covers the impact analysis of COVID-19 on this industry (global and regional market): https://www.crifax.com/sample-request-1010075

The emergence of new technological innovations, including recent technologies such as artificial intelligence (AI) and the Internet of Things (IoT) finding usage across both industrial and residential applications, and the rapid adoption of field service management software across various industries, is estimated to drive the growth of the global Field Service Management Software market over the forecast period (2019-2027). Manufacturing industries are predicted to invest about USD 340 billion in such software in 2019, while investments in robotics and autonomous and freight operations are estimated to generate revenues of USD 128 billion in the same year. With the transformation of business models on account of the emergence of AI, IoT and robotics, the market is estimated to observe significant growth over the next 6-7 years.

The introduction of the 5G network is anticipated to provide various business opportunities and to tap additional sources of revenue for the telecom industry, on account of the increase in speed and responsiveness of wireless networks.


With growing demand for mobile data along with increased video streaming services, the adoption of 5G services in North America is estimated to cross 45% by 2023. The rollout of the 5G network, combined with IoT connectivity (which includes topological quantum computing, connected homes and connected cities), is predicted to change the way telecom operators perform their tasks.

The United Nations Conference on Trade & Development (UNCTAD) stated in its report that the share of information and communication technology goods (including computers, peripheral devices, communication and consumer electronic components, among other IT goods) in total exports had grown from 10.5% in 2012 to 11.5% in 2017. As of 2017, Hong Kong held the largest share of ICT goods exports at 51.7%, followed by the Philippines (35.9%), Singapore (32%) and Malaysia (31%). Moreover, the growth of the global economy, along with several efforts taken by countries such as China, Japan, the United States of America, Germany, the Netherlands, Korea and other ICT-goods-exporting nations, is anticipated to aid the growth of the IT and telecom sector.

To provide a better understanding of internal and external marketing factors, multi-dimensional analytical tools such as SWOT and PESTEL analysis have been implemented in the global Topological Quantum Computing Market report. Moreover, the report consists of market segmentation, CAGR (compound annual growth rate), BPS analysis, Y-o-Y growth (%), Porter's five forces model, absolute $ opportunity and the anticipated cost structure of the market.

About CRIFAX

CRIFAX is driven by integrity and commitment to its clients, and provides cutting-edge market research and consulting solutions with a step-by-step guide to help clients accomplish their business prospects. With the help of our industry experts, who have hands-on experience in their respective domains, we make sure that our industry enthusiasts understand all the business aspects relating to their projects, which further improves their consumer base and the size of their organization. We offer a wide range of unique market research solutions, ranging from customized and syndicated research reports to consulting services, and we update our syndicated research reports annually to make sure they reflect the latest, ever-changing technology and industry insights. This has helped us carve a niche in delivering distinctive business services, enhanced our global clients' trust in our insights and helped us outpace our competitors.

For more updates, follow: LinkedIn | Twitter

Contact Us:

CRIFAX

Email: [emailprotected]

U.K. Phone: +44 161 394 2021

U.S. Phone: +1 917 924 8284

More Related Reports:

Digital Risk Protection (DRP) Tools Market | Drone Surveillance Platform Market | Warehousing Automation Market | Direct Store Delivery Software Market | Advanced Threat Protection (ATP) Market | Transportation Management System Integrator Market | Network Management Solution Market | Medical Image Processing Software Market | Automotive Wiper Component After Market | Membrane Water and Wastewater Treatment Market | IoT Connectivity Market | Environmental Hazard Monitoring Software Market | Microscope Software Market | Managed Infrastructure Service Market | Wireless Asset Management Market | Semiconductor (Silicon) Intellectual Property Market | Mobile Content Delivery Network Market | Bitcoin Technology Market | Data Integration and Integrity Software Market | Healthcare Architecture Service Market | Scalable Software Defined Networking Market | Carbon Management System Market


Regional Analysis and Strategies of Quantum Computing Technology Market during the Forecasted Period 2020-2030 – 3rd Watch News

The Quantum Computing Technology Market Research Report 2020, published by Prophecy Market Insights, is an all-inclusive business research study on the current state of the industry. It analyzes innovative strategies for business growth and describes significant factors such as top developers/manufacturers, production value, key regions, and growth rate. The impact of the Covid-19 pandemic on the market is analyzed and quantified throughout the report.

The research study encompasses an evaluation of the market, including growth rate, current scenario, and volume inflation prospects, based on DROT and Porter's Five Forces analyses. The market study sheds light on the various factors that are projected to impact the overall market dynamics of the Quantum Computing Technology market over the forecast period (2019-2029).

Regional Overview:

The report includes a broad investigation of the geographical landscape of the Quantum Computing Technology market, which is clearly organized by region. It provides an analysis of regional market players operating in the specific market and outcomes related to the target market for more than 20 countries.

Australia, New Zealand, Rest of Asia-Pacific

The facts and data in the Quantum Computing Technology report are presented using graphs, pie charts, tables, figures and other graphical representations, helping readers analyze worldwide key trends and statistics on the state of the industry. The report is a valuable source of guidance and direction for companies and individuals interested in the market.

Get Sample Copy of This Report @ https://www.prophecymarketinsights.com/market_insight/Insight/request-sample/3845

The research report also profiles the major leading industry players of the Quantum Computing Technology market, providing information such as company profiles, product pictures and specifications, R&D developments, distribution and production capacity, distribution channels, price, cost, revenue and contact information. It also examines legal policies and offers a competitive analysis of leading, emerging and upcoming market trends.

Quantum Computing Technology Market Key Companies:

International Business Machines (IBM) Corporation, Google Inc., Microsoft Corporation, Qxbranch, LLC, Cambridge Quantum Computing Ltd., 1QB Information Technologies Inc., QC Ware Corp., Magiq Technologies Inc., D-Wave Systems Inc., and Rigetti & Co, Inc.

The predictions mentioned in the Quantum Computing Technology market report have been derived using proven research techniques, assumptions and methodologies. The report states the overview and historical data of the global industry, along with its size, share, growth, demand, and revenue.

Segmentation Overview:

The report provides an in-depth analysis of the Quantum Computing Technology market segments, highlights the latest trending segment and major innovations in the market, and states the impact of these segments on its growth. Apart from the key-player analysis that informs business decisions backed by prevalent market conditions, we also provide substantial analysis of the market based on COVID-19 impact, with detailed analysis of economic, health and financial structures.

Request Discount @ https://www.prophecymarketinsights.com/market_insight/Insight/request-discount/3845

Key Questions Answered in Report:

Stakeholders Benefit:

About us:

Prophecy Market Insights is a specialized market research, analytics, marketing/business strategy and solutions firm that offers strategic and tactical support to clients for making well-informed business decisions and for identifying and achieving high-value opportunities in the target business area. We also help our clients address business challenges and provide the best possible solutions to overcome them and transform their business.

Contact Us:

Mr Alex (Sales Manager)

Prophecy Market Insights

Phone: +1 860 531 2701

Email: [emailprotected]


Healthcare Shopping: The new age of consumerism – The Financial Express

By Lalit Dash

Srishti, a 35-year-old HR professional, recently started experiencing palpitations and shortness of breath. While looking up information on the probable causes of her condition on the internet, she found an online health services platform where she could review portfolios of doctors and treatment options, allowing her to shop for the best care provider and a treatment plan at a cost she could afford. Booking and paying for the appointment through the hospital's web interface made it easy for her to schedule the visit at her convenience.

Post consultation, she explored online pharmacies and got her medicine at the best rate, earning some loyalty points in the process. Srishti's situation could be ours. With the onset of digital transformation, the healthcare sector is witnessing a major overhaul. Today, an individual is not just a prospective patient but a customer armed with a shopping list to select the best doctors, facilities and treatment at an affordable cost, and at a time and location of her choice. The flow of information is no longer unidirectional (caregiver to care receiver) but bidirectional, and consumer choices are made within and outside the clinical environment. This has led the healthcare system to leapfrog from a legacy PDS (public distribution system) model to a supermarket model.

With an increased focus on the quality of consumer experience, healthcare companies are deploying technologies to make care delivery more accessible and personalised. Medical diagnostics, Internet of Medical Things (IoMT), Blockchain, Artificial Intelligence (AI) and data analytics are triggering disruptive innovations that are, in turn, redefining care paradigms.

Technology, as is evident, is a crucial cog in the evolution of consumerism in healthcare. Innovations in cloud computing, mobility solutions, telemedicine, and quantum computing are making their way into mainstream health operations. For instance, AI and ML are pushing this change through algorithms built for diagnostics of chronic diseases. Augmented reality/virtual reality (AR/VR)-led technology is already being put to use to set up virtual care systems that enable doctors to conduct surgeries in remote areas or during times of a public health emergency.

Natural language processing (NLP) technology, a form of AI that enables computer programs to process and analyse unstructured data from different sources, is extensively being used in technical documentation, leading to faster diagnosis. Additionally, the gamification of healthcare, particularly in patient wellness, is enhancing the customer (vs. patient) mindset and reciprocal engagement. Take, for instance, mobile apps that run a rewards program for people who accomplish a health-related task every day, or those that encourage participation of friends and family in fitness contests.

With the care provider's focus shifting more towards value across the customer lifecycle, there will be stronger collaboration between healthcare providers and customers for pre-, during- and post-care medical services. As healthcare consumerism continues to grow, healthcare providers will have to learn to adapt to this changing environment to guide and engage consumers as well as secure their loyalty. This will eventually lead to easier access to care, reduced cost of care and enhanced quality of care, benefitting many consumers such as Srishti.

The writer is senior director, Technology, Optum Global Solutions.


Deep learning’s role in the evolution of machine learning – TechTarget

Machine learning had a rich history long before deep learning reached fever pitch. Researchers and vendors were using machine learning algorithms to develop a variety of models for improving statistics, recognizing speech, predicting risk and other applications.

While many of the machine learning algorithms developed over the decades are still in use today, deep learning -- a form of machine learning based on multilayered neural networks -- catalyzed a renewed interest in AI and inspired the development of better tools, processes and infrastructure for all types of machine learning.

Here, we trace the significance of deep learning in the evolution of machine learning, as interpreted by people active in the field today.

The story of machine learning starts in 1943 when neurophysiologist Warren McCulloch and mathematician Walter Pitts introduced a mathematical model of a neural network. The field gathered steam in 1956 at a summer conference on the campus of Dartmouth College. There, 10 researchers came together for six weeks to lay the ground for a new field that involved neural networks, automata theory and symbolic reasoning.

The distinguished group, many of whom would go on to make seminal contributions to this new field, gave it the name artificial intelligence to distinguish it from cybernetics, a competing area of research focused on control systems. In some ways these two fields are now starting to converge with the growth of IoT, but that is a topic for another day.

Early neural networks were not particularly useful -- nor deep. Perceptrons, the single-layered neural networks in use then, could only learn linearly separable patterns. Interest in them waned after Marvin Minsky and Seymour Papert published the book Perceptrons in 1969, highlighting the limitations of existing neural network algorithms and causing the emphasis in AI research to shift.
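To make that limitation concrete, here is a minimal sketch (my illustration, not from the article) of a Rosenblatt-style perceptron: trained on AND, which is linearly separable, it reaches perfect accuracy; on XOR, which is not, it never can.

```python
import numpy as np

def train_perceptron(X, y, epochs=100, lr=0.1):
    """Single-layer perceptron: can only learn linearly separable patterns."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for xi, target in zip(X, y):
            pred = 1 if xi @ w + b > 0 else 0
            w += lr * (target - pred) * xi  # update weights only on mistakes
            b += lr * (target - pred)
    return w, b

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
for name, y in [("AND", np.array([0, 0, 0, 1])),   # linearly separable
                ("XOR", np.array([0, 1, 1, 0]))]:  # not linearly separable
    w, b = train_perceptron(X, y)
    preds = (X @ w + b > 0).astype(int)
    print(name, "accuracy:", (preds == y).mean())  # AND: 1.0; XOR: at most 0.75
```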

"There was a massive focus on symbolic systems through the '70s, perhaps because of the idea that perceptrons were limited in what they could learn," said Sanmay Das, associate professor of computer science and engineering at Washington University in St. Louis and chair of the Association for Computing Machinery's special interest group on AI.

The 1973 publication of Pattern Classification and Scene Analysis by Richard Duda and Peter Hart introduced other types of machine learning algorithms, reinforcing the shift away from neural nets. A decade later, Machine Learning: An Artificial Intelligence Approach by Ryszard S. Michalski, Jaime G. Carbonell and Tom M. Mitchell further defined machine learning as a domain driven largely by the symbolic approach.

"That catalyzed a whole field of more symbolic approaches to [machine learning] that helped frame the field. This led to many Ph.D. theses, new journals in machine learning, a new academic conference, and even helped to create new laboratories like the NASA Ames AI Research branch, where I was deputy chief in the 1990s," said Monte Zweben, CEO of Splice Machine, a scale-out SQL platform.

In the 1990s, the evolution of machine learning made a turn. Driven by the rise of the internet and increase in the availability of usable data, the field began to shift from a knowledge-driven approach to a data-driven approach, paving the way for the machine learning models that we see today.

The turn toward data-driven machine learning in the 1990s was built on research done by Geoffrey Hinton at the University of Toronto in the mid-1980s. Hinton and his team demonstrated the ability to use backpropagation to build deeper neural networks.
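As a minimal illustration of what backpropagation buys (my sketch, not the article's), adding one hidden layer and propagating errors back through it solves XOR, the very pattern a single-layer perceptron cannot learn:

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR targets

# Two-layer network: 2 inputs -> 4 hidden units -> 1 output
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
lr = 0.5

for _ in range(5000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: propagate the error back through both layers
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

print(out.round(2))  # typically converges toward [[0], [1], [1], [0]]
```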

"This was a major breakthrough enabling new kinds of pattern recognition that were previously not feasible with neural nets," Zweben said. This added new layers to the networks and a way to strengthen or weaken connections back across many layers in the network, leading to the term deep learning.

Although possible in a lab setting, deep learning did not immediately find its way into practical applications, and progress stalled.

"Through the '90s and '00s, a joke used to be that 'neural networks are the second-best learning algorithm for any problem,'" Washington University's Das said.

Meanwhile, commercial interest in AI was starting to wane because the hype around developing an AI on par with human intelligence had gotten ahead of results, leading to an AI winter, which lasted through the 1980s. What did gain momentum was a type of machine learning using kernel methods and decision trees that enabled practical commercial applications.

Still, the field of deep learning was not completely in retreat. In addition to the ascendancy of the internet and increase in available data, another factor proved to be an accelerant for neural nets, according to Zweben: namely, distributed computing.

Machine learning requires a lot of compute. In the early days, researchers had to keep their problems small or gain access to expensive supercomputers, Zweben said. The democratization of distributed computing in the early 2000s enabled researchers to run calculations across clusters of relatively low-cost commodity computers.

"Now, it is relatively cheap and easy to experiment with hundreds of models to find the best combination of data features, parameters and algorithms," Zweben said. The industry is pushing this democratization even further with practices and associated tools for machine learning operations that bring DevOps principles to machine learning deployment, he added.

Machine learning is also only as good as the data it is trained on, and if data sets are small, it is harder for the models to infer patterns. As the data created by mobile, social media, IoT and digital customer interactions grew, it provided the training material deep learning techniques needed to mature.

By 2012, deep learning attained star status after Hinton's team won ImageNet, a popular data science challenge, for their work on classifying images using neural networks. Things really accelerated after Google subsequently demonstrated an approach to scaling up deep learning across clusters of distributed computers.

"The last decade has been the decade of neural networks, largely because of the confluence of the data and computational power necessary for good training and the adaptation of algorithms and architectures necessary to make things work," Das said.

Even when deep neural networks are not used directly, they indirectly drove -- and continue to drive -- fundamental changes in the field of machine learning, including the following:

Deep learning's predictive power has inspired data scientists to think about different ways of framing problems that come up in other types of machine learning.

"There are many problems that we didn't think of as prediction problems that people have reformulated as prediction problems -- language, vision, etc. -- and many of the gains in those tasks have been possible because of this reformulation," said Nicholas Mattei, assistant professor of computer science at Tulane University and vice chair of the Association for Computing Machinery's special interest group on AI.

In language processing, for example, a lot of the focus has moved toward predicting what comes next in the text. In computer vision as well, many problems have been reformulated so that, instead of trying to understand geometry, the algorithms are predicting labels of different parts of an image.

The power of big data and deep learning is changing how models are built. Human analysis and insights are being replaced by raw compute power.

"Now, it seems that a lot of the time we have substituted big databases, lots of GPUs, and lots and lots of machine time to replace the deep problem introspection needed to craft features for more classic machine learning methods, such as SVM [support vector machine] and Bayes," Mattei said, referring to the Bayesian networks used for modeling the probabilities between observations and outcomes.

The art of crafting a machine learning problem has been taken over by advanced algorithms and the millions of hours of CPU time baked into pretrained models so data scientists can focus on other projects or spend more time on customizing models.

Deep learning is also helping data scientists solve problems with smaller data sets and problems where the data has not been labeled.

"One of the most relevant developments in recent times has been the improved use of data, whether in the form of self-supervised learning, improved data augmentation, generalization of pretraining tasks or contrastive learning," said Juan José López Murphy, AI and big data tech director lead at Globant, an IT consultancy.

These techniques reduce the need for manually tagged and processed data. They enable researchers to build large models that can capture complex relationships representing the nature of the data, not just the relationships representing the task at hand. López Murphy is starting to see transfer learning adopted as a baseline approach, where researchers start with a pretrained model that requires only a small amount of customization to deliver good performance on many common tasks.
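A minimal sketch of that transfer-learning baseline, using PyTorch and torchvision (the model choice and the 10-class head are my illustrative assumptions, not details from the article):

```python
import torch.nn as nn
from torchvision import models

# Start from a network pretrained on ImageNet
model = models.resnet18(pretrained=True)

# Freeze the pretrained feature extractor
for param in model.parameters():
    param.requires_grad = False

# Replace only the final layer for the new task (here, 10 classes);
# this small head is the only part that gets trained
model.fc = nn.Linear(model.fc.in_features, 10)
```

Only the new head's parameters require gradients, so the customization López Murphy describes amounts to training a single layer on the new task's data.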

There are specific fields where deep learning provides a lot of value, in image, speech and natural language processing, for example, as well as time series forecasting.

"The broader field of machine learning is enhanced by deep learning and its ability to bring context to intelligence. Deep learning also improves [machine learning's] ability to learn nonlinear relationships and manage dimensionality with systems like autoencoders," said Luke Taylor, founder and COO at TrafficGuard, an ad fraud protection service.

For example, deep learning can find more efficient ways to auto encode the raw text of characters and words into vectors representing the similarity and differences of words, which can improve the efficiency of the machine learning algorithms used to process it. Deep learning algorithms that can recognize people in pictures make it easier to use other algorithms that find associations between people.
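As a toy illustration of words-as-vectors (the numbers below are invented for the example; real embeddings have hundreds of learned dimensions):

```python
import numpy as np

# Invented 4-dimensional "word vectors"
vecs = {
    "king":  np.array([0.9, 0.8, 0.1, 0.2]),
    "queen": np.array([0.9, 0.7, 0.2, 0.8]),
    "apple": np.array([0.1, 0.2, 0.9, 0.3]),
}

def cosine(a, b):
    """Cosine similarity: near 1.0 for similar directions, near 0 for unrelated."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

print(cosine(vecs["king"], vecs["queen"]))  # high: related words
print(cosine(vecs["king"], vecs["apple"]))  # low: unrelated words
```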

More recently, there have been significant jumps using deep learning to improve the use of image, text and speech processing through common interfaces. People are accustomed to speaking to virtual assistants on their smartphones and using facial recognition to unlock devices and identify friends in social media.

"This broader adoption creates more data, enables more machine learning refinement and increases the utility of machine learning even further, pushing even further adoption of this tech into people's lives," Taylor said.

Early machine learning research required expensive software licenses. But deep learning pioneers began open sourcing some of the most powerful tools, which has set a precedent for all types of machine learning.

"Earlier, machine learning algorithms were bundled and sold under a licensed tool. But, nowadays, open source libraries are available for any type of AI applications, which makes the learning curve easy," said Sachin Vyas, vice president of data, AI and automation products at LTI, an IT consultancy.

Another factor in democratizing access to machine learning tools has been the rise of Python.

"The wave of open source frameworks for deep learning cemented the prevalence of Python and its data ecosystem for research, development and even production," Globant's López Murphy said.

Many of the different commercial and free options got replaced, integrated or connected to a Python layer for widespread use. As a result, Python has become the de facto lingua franca for machine learning development.

Deep learning has also inspired the open source community to automate and simplify other aspects of the machine learning development lifecycle. "Thanks to things like graphical user interfaces and [automated machine learning], creating working machine learning models is no longer limited to Ph.D. data scientists," Carmen Fontana, IEEE member and cloud and emerging tech practice lead at Centric Consulting, said.

For machine learning to keep evolving, enterprises will need to find a balance between developing better applications and respecting privacy.

Data scientists will need to be more proactive in understanding where their data comes from and the biases that may inadvertently be baked into it, as well as develop algorithms that are transparent and interpretable. They also need to keep pace with new machine learning protocols and the different ways these can be woven together with various data sources to improve applications and decisions.

"Machine learning provides more innovative applications for end users, but unless we're choosing the right data sets and advancing deep learning protocols, machine learning will never make the transition from computing a few results to providing actual intelligence," said Justin Richie, director of data science at Nerdery, an IT consultancy.

"It will be interesting to see how this plays out in different industries and if this progress will continue even as data privacy becomes more stringent," Richie said.


My Invisalign app uses machine learning and facial recognition to sell the benefits of dental work – TechRepublic

Align Technology uses DevSecOps tactics to keep complex projects on track and align business and IT goals.

Image: AndreyPopov/Getty Images/iStockphoto

Align Technology's Chief Digital Officer Sreelakshmi Kolli is using machine learning and DevOps tactics to power the company's digital transformation.

Kolli led the cross-functional team that developed the latest version of the company's My Invisalign app. The app combines several technologies into one product, including virtual reality, facial recognition, and machine learning. Kolli said that using a DevOps approach helped keep this complex work on track.

"The feasibility and proof of concept phase gives us an understanding of how the technology drives revenue and/or customer experience," she said. "Modular architecture and microservices allows incremental feature delivery that reduces risk and allows for continuous delivery of innovation."


The customer-facing app accomplishes several goals at once, the company said.

More than 7.5 million people have used the clear plastic molds to straighten their teeth, the company said. Align Technology has used data from these patients to train a machine learning algorithm that powers the visualization feature in the mobile app. The SmileView feature uses machine learning to predict what a person's smile will look like when the braces come off.

Kolli started with Align Technology as a software engineer in 2003. Now she leads an integrated software engineering group focused on product technology strategy and development of global consumer, customer and enterprise applications and infrastructure. This includes end user and cloud computing, voice and data networks and storage. She also led the company's global business transformation initiative to deliver platforms to support customer experience and to simplify business processes.

Kolli used the development process of the My Invisalign app as an opportunity to move the dev team to DevSecOps practices. Kolli said that this shift represents a cultural change, and making the transition requires a common understanding among all teams on what the approach means to the engineering lifecycle.

"Teams can make small incremental changes to get on the DevSecOps journey (instead of a large transformation initiative)," she said. "Investing in automation is also a must for continuous integration, continuous testing, continuous code analysis and vulnerability scans." To build the machine learning expertise required to improve and support the My Invisalign app, she has hired team members with that skill set and built up expertise internally.

"We continue to integrate data science to all applications to deliver great visualization experiences and quality outcomes," she said.

Align Technology uses AWS to run its workloads.

In addition to keeping patients connected with orthodontists, the My Invisalign app is a marketing tool to convince families to opt for the transparent but expensive alternative to metal braces.

Kolli said that IT leaders should work closely with business leaders to make sure initiatives support business goals such as revenue growth, improved customer experience, or operational efficiencies, and modernize the IT operation as well.

"Making the line of connection between the technology tasks and agility to go to market helps build shared accountability to keep technical debt in control," she said.

Align Technology released the revamped app in late 2019. In May of this year, the company released a digital version tool for doctors that combines a photo of the patient's face with their 3D Invisalign treatment plan.

This ClinCheck "In-Face" Visualization is designed to help doctors manage patient treatment plans.

The visualization workflow combines three components of Align's digital treatment platform: Invisalign Photo Uploader for patient photos, the iTero intraoral scanner to capture data needed for the 3D model of the patient's teeth, and ClinCheck Pro 6.0. ClinCheck Pro 6.0 allows doctors to modify treatment plans through 3D controls.

These new product releases are the first in a series of innovations to reimagine the digital treatment planning process for doctors, Raj Pudipeddi, Align's chief innovation, product, and marketing officer and senior vice president, said in a press release about the product.



2 books to strengthen your command of python machine learning – TechTalks

Image credit: Depositphotos

This post is part of AI education, a series of posts that review and explore educational content on data science and machine learning. (In partnership with Paperspace)

Mastering machine learning is not easy, even if you're a crack programmer. I've seen many people come from a solid background of writing software in different domains (gaming, web, multimedia, etc.) thinking that adding machine learning to their roster of skills is another walk in the park. It's not. And every single one of them has been dismayed.

I see two reasons why the challenges of machine learning are misunderstood. First, as the name suggests, machine learning is software that learns by itself, as opposed to being instructed on every single rule by a developer. This is an oversimplification that many media outlets with little or no knowledge of the actual challenges of writing machine learning algorithms often use when speaking of the ML trade.

The second reason, in my opinion, is the many books and courses that promise to teach you the ins and outs of machine learning in a few hundred pages (and the ads on YouTube that promise to net you a machine learning job if you pass an online course). Now, I don't want to vilify any of those books and courses. I've reviewed several of them (and will review some more in the coming weeks), and I think they're invaluable sources for becoming a good machine learning developer.

But they're not enough. Machine learning requires both good coding and math skills and a deep understanding of various types of algorithms. If you're doing Python machine learning, you have to have in-depth knowledge of many libraries and also master the many programming and memory-management techniques of the language. And, contrary to what some people say, you can't escape the math.

And all of that can't be summed up in a few hundred pages. Rather than a single volume, the complete guide to machine learning would probably look like Donald Knuth's famous The Art of Computer Programming series.

So, what is all this tirade for? In my exploration of data science and machine learning, I'm always on the lookout for books that take a deep dive into topics that are skimmed over by the more general, all-encompassing books.

In this post, I'll look at Python for Data Analysis and Practical Statistics for Data Scientists, two books that will help deepen your command of the coding and math skills required to master Python machine learning and data science.

Python for Data Analysis, 2nd Edition, is written by Wes McKinney, the creator of pandas, one of the key libraries used in Python machine learning. Doing machine learning in Python involves loading and preprocessing data in pandas before feeding it to your models.

Most books and courses on machine learning provide an introduction to the main pandas components, such as DataFrames and Series, and some of the key functions, such as loading data from CSV files and cleaning rows with missing data. But the power of pandas is much broader and deeper than what you see in a chapter's worth of code samples in most books.

In Python for Data Analysis, McKinney takes you through the entire functionality of pandas and manages to do so without making it read like a reference manual. There are lots of interesting examples that build on top of each other and help you understand how the different functions of pandas tie in with each other. You'll go in-depth on things such as cleaning, joining, and visualizing data sets, topics that are usually only discussed briefly in most machine learning books.
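For a flavor of that workflow, here is a small self-contained pandas sketch (my example, not the book's; the data is invented):

```python
import pandas as pd

# Invented sample data standing in for a CSV load (e.g., pd.read_csv)
sales = pd.DataFrame({
    "customer_id": [1, 2, 2, 3],
    "amount": [100.0, None, 250.0, 80.0],  # one missing value
})
customers = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "region": ["north", "south", "north"],
})

# Cleaning: fill the missing amount with the column median
sales["amount"] = sales["amount"].fillna(sales["amount"].median())

# Joining: attach each customer's region to their sales
merged = sales.merge(customers, on="customer_id", how="left")

# Aggregating: total sales per region
print(merged.groupby("region")["amount"].sum())
```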

You'll also get to explore some very important challenges, such as memory management and code optimization, which can become a big deal when you're handling very large data sets in machine learning (which you often do).

What I also like about the book is the finesse that has gone into choosing subjects to fit in the 500 pages. While most of the book is about pandas, McKinney has taken great care to complement it with material about other important Python libraries and topics. You'll get a good overview of array-oriented programming with numpy, another important Python library often used in machine learning in concert with pandas, and some important techniques in using Jupyter Notebooks, the tool of choice for many data scientists.
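If the phrase is unfamiliar, "array-oriented programming" means replacing explicit Python loops with whole-array operations that numpy executes in compiled code; a quick illustration (mine, not McKinney's):

```python
import numpy as np

prices = np.array([19.99, 5.50, 3.25, 12.00])
quantities = np.array([2, 10, 4, 1])

# Loop style: one multiplication per iteration in the Python interpreter
total_loop = sum(p * q for p, q in zip(prices, quantities))

# Array-oriented style: a single vectorized expression
total_vec = (prices * quantities).sum()

assert np.isclose(total_loop, total_vec)
```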

All this said, don't expect Python for Data Analysis to be a very fun book. It can get boring because it just discusses working with data (which happens to be the most boring part of machine learning). There won't be any end-to-end examples where you'll get to see the result of training and using a machine learning algorithm or integrating your models in real applications.

My recommendation: You should probably pick up Python for Data Analysis after going through one of the introductory or advanced books on data science or machine learning. Having that introductory background on working with Python machine learning libraries will help you better grasp the techniques introduced in the book.

While Python for Data Analysis improves your data-processing and -manipulation coding skills, the second book we'll look at, Practical Statistics for Data Scientists, 2nd Edition, will be the perfect resource to deepen your understanding of the core mathematical logic behind many key algorithms and concepts that you often deal with when doing data science and machine learning.

The book starts with simple concepts such as different types of data, means and medians, standard deviations, and percentiles. Then it gradually takes you through more advanced concepts such as different types of distributions, sampling strategies, and significance testing. These are all concepts you have probably learned in math class or read about in data science and machine learning books.

But again, the key here is specialization.

On the one hand, the depth that Practical Statistics for Data Scientists brings to each of these topics is greater than you'll find in machine learning books. On the other hand, every topic is introduced along with coding examples in Python and R, which makes it more suitable than classic statistics textbooks. Moreover, the authors have done a great job of disambiguating the way different terms are used in data science and other fields. Each topic is accompanied by a box that provides all the different synonyms for popular terms.

As you go deeper into the book, you'll dive into the mathematics of machine learning algorithms such as linear and logistic regression, K-nearest neighbors, trees and forests, and K-means clustering. In each case, like the rest of the book, there's more focus on what's happening under the algorithm's hood than on using it for applications. But the authors have again made sure the chapters don't read like classic math textbooks, and the formulas and equations are accompanied by nice coding examples.
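In that spirit, here is a small example of the algorithm-plus-code pairing the book favors (my sketch, with invented data): K-nearest neighbors classifies a point by majority vote among its k closest training points.

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Classify x by majority vote among its k nearest training points."""
    dists = np.linalg.norm(X_train - x, axis=1)    # Euclidean distances
    nearest = np.argsort(dists)[:k]                # indices of the k closest
    return np.bincount(y_train[nearest]).argmax()  # majority class

# Invented 2D data: class 0 clusters near (0, 0), class 1 near (5, 5)
X_train = np.array([[0, 0], [1, 0], [0, 1], [5, 5], [6, 5], [5, 6]])
y_train = np.array([0, 0, 0, 1, 1, 1])

print(knn_predict(X_train, y_train, np.array([0.5, 0.5])))  # -> 0
print(knn_predict(X_train, y_train, np.array([5.5, 5.5])))  # -> 1
```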

Like Python for Data Analysis, Practical Statistics for Data Scientists can get a bit boring if you read it end to end. There are no exciting applications or a continuous process where you build your code through the chapters. But on the other hand, the book has been structured in a way that you can read any of the sections independently without the need to go through previous chapters.

My recommendation: Read Practical Statistics for Data Scientists after going through an introductory book on data science and machine learning. I definitely recommend reading the entire book once, though to make it more enjoyable, go topic by topic in between your exploration of other machine learning courses. Also keep it handy; you'll probably revisit some of the chapters from time to time.

I would definitely count Python for Data Analysis and Practical Statistics for Data Scientists as two must-reads for anyone who is on the path of learning data science and machine learning. Although they might not be as exciting as some of the more practical books, you'll appreciate the depth they add to your coding and math skills.


What I Learned From Looking at 200 Machine Learning Tools – Machine Learning Times – machine learning & data science news – The Predictive…

Originally published in Chip Huyen Blog, June 22, 2020

To better understand the landscape of available tools for machine learning production, I decided to look up every AI/ML tool I could find. The resources I used include:

After filtering out applications companies (e.g. companies that use ML to provide business analytics), tools that aren't being actively developed, and tools that nobody uses, I got 202 tools. See the full list. Please let me know if there are tools you think I should include but aren't on the list yet!

Disclaimer

This post consists of 6 parts:

I. Overview
II. The landscape over time
III. The landscape is under-developed
IV. Problems facing MLOps
V. Open source and open-core
VI. Conclusion

I. OVERVIEW

One way to generalize the ML production flow, which I agree with, breaks it into four steps: project setup, data pipeline, modeling and training, and serving.

I categorize the tools based on which step of the workflow they support. I don't include project setup, since it requires project management tools, not ML tools. This isn't always straightforward, since one tool might help with more than one step. Their ambiguous descriptions don't make it any easier: "we push the limits of data science," "transforming AI projects into real-world business outcomes," "allows data to move freely, like the air you breathe," and my personal favorite: "we lived and breathed data science."

I put the tools that cover more than one step of the pipeline into the category they are best known for. If they're known for multiple categories, I put them in the All-in-one category. I also include an Infrastructure category for companies that provide infrastructure for training and storage. Most of these are cloud providers.



Letters to the editor – The Economist

Jul 4th 2020

Artificial intelligence is an oxymoron (Technology quarterly, June 13th). Intelligence is an attribute of living things, and can best be defined as the use of information to further survival and reproduction. When a computer resists being switched off, or a robot worries about the future for its children, then, and only then, may intelligence flow.

I acknowledge Richard Sutton's bitter lesson, that attempts to build human understanding into computers rarely work, although there is nothing new here. I was aware of the folly of anthropomorphism as an AI researcher in the mid-1980s. We learned to fly when we stopped emulating birds and studied lift. Meaning and knowledge don't result from symbolic representation; they relate directly to the visceral motives of survival and reproduction.

Great strides have been made in widening the applicability of algorithms, but as Mr Sutton says, this progress has been fuelled by Moore's law. What we call AI is simply pattern discovery. Brilliant, transformative and powerful, but just pattern discovery. Further progress is dependent on recognising this simple fact, and abandoning the fancy that intelligence can be disembodied from a living host.

ROB MACDONALD
Richmond, North Yorkshire

I agree that machine learning is overhyped. Indeed, your claim that such techniques are loosely based on the structure of neurons in the brain is true of neural networks, but these are just one type among a wide array of different machine-learning methods. In fact, machine learning in some cases is no more than a rebranding of existing processes. If by machine learning we simply mean building a model using large amounts of data, then good old ordinary least squares (line of best fit) is a form of machine learning.
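(The letter's point is easy to demonstrate; an illustrative sketch, not part of the original letter: a few lines of numpy "learn" a line of best fit from data and then predict an unseen point.)

```python
import numpy as np

# Ordinary least squares as "machine learning": fit a line to observed data
hours_studied = np.array([1, 2, 3, 4, 5], dtype=float)
exam_scores = np.array([52, 58, 65, 70, 77], dtype=float)

slope, intercept = np.polyfit(hours_studied, exam_scores, deg=1)
predicted = slope * 6 + intercept  # "predict" the score for 6 hours
print(round(slope, 2), round(intercept, 2), round(predicted, 1))
```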

TOM ARMSTRONG
Toronto

The scope of your research into green investing was too narrow to condemn all financial services for their woolly thinking ("Hotting up", June 20th). You restricted your analysis to microeconomic factors and to the ability of investors to engage with companies. It overlooked the bigger picture: investors can also shape the macro environment by structured engagement with the system itself.

For example, the data you used largely originated from the investor-led Carbon Disclosure Project (for which we hosted the first ever meeting, nearly two decades ago). In addition, investors have also helped shape sustainable-finance plans in Britain, the EU and UN. Investors also sit on the industry-led Taskforce on Climate-related Financial Disclosure, convened by the Financial Stability Board, which has proved effective.

It is critical that governments apply a meaningful carbon price. But if we are to move money at the pace and scale required to deal with climate risk, governments need to reconsider the entire architecture of markets. This means focusing a wide-angled climate lens on prudential regulation, listing rules, accounting standards, investor disclosure standards, valuation conventions and stewardship codes, as well as building on new interpretations of legal fiduciary duty. This work is done most effectively in partnership with market participants. Green-thinking investors can help.

STEVE WAYGOOD
Chief responsible investment officer
Aviva Investors
London

Estimating indirectly observable GDP in real time is indeed a hard job for macro-econometricians, or wonks, as you call us ("Crisis measures", May 30th). Most of the components are either highly lagged, as your article mentioned, or altogether unobservable. But the textbook definition of GDP and its components won't be changing any time soon, contrary to what the reader is led to believe. Instead, what has always changed and will continue to change are the proxy indicators used to estimate the estimate of GDP.

MICHAEL BOERMAN
Washington, DC

Reading Lexington's account of his garden adventures (June 20th) brought back memories of my own experience with neighbours in Twinsburg, Ohio, in the late 1970s. They also objected to vegetables growing in our front yard (the only available space). We were doing it for the same reasons as Lexington: pleasure, fresh food to eat, and a learning experience for our young children. The neighbours, recently arrived into the suburban middle class, saw it as an affront. They no longer had to grow food for their table. They could buy it at the store and keep it in the deep freeze. Our garden, in their face every day, reminded them of their roots in Appalachian poverty. They called us hillbillies.

Arthur C. Clarke once wrote: "Any sufficiently advanced technology is indistinguishable from magic." Our version read, "Any sufficiently advanced lifestyle is indistinguishable from hillbillies."

PHILIP RAKITA
Philadelphia

Bartleby (May 30th) thinks the benefits of working from home will mean that employees will not want to return to the office. I am not sure that is the case for many people. My husband is lucky. He works for a company that already expected its staff to work remotely, so had the systems and habits in place. He has a spacious room to work in, with an adjustable chair, large monitor and a nice view. I do not work so he is not responsible for child care or home schooling.

Many people are working at makeshift workspaces which would make an occupational therapist cringe. Few will have a dedicated room for their home office, so their work invades their mental and physical space.

My husband has noticed that meetings are being set up both earlier and later in the day because there is an assumption that, as people are not commuting, it is fine to extend their work day. Colleagues book a half-hour meeting instead of dropping by someone's desk to ask a quick question. Any benefit of not commuting is lost. My husband still struggles to finish in time to have dinner with our children. People with especially long commutes now have more time, but even the commute was a change of scenery and offered some incidental exercise.

JENNIFER ALLEN
London

As Bartleby pointed out, the impact of pandemic working conditions won't be limited to the current generation. By exacerbating these divides, will covid-19 completely guarantee a future dominated by the baby-Zoomers?

MALCOLM BEGG
Tokyo

The transition away from the physical office engenders a lackadaisical approach to the work day for many workers. It brings to mind Ignatius Reilly's reasoning for his late start at the office in A Confederacy of Dunces:

I avoid that bleak first hour of the working day during which my still sluggish senses and body make every chore a penance. I find that in arriving later, the work which I do perform is of a much higher quality.

ROBERT MOGIELNICKI
Arlington, Virginia

This article appeared in the Letters section of the print edition under the headline "On artificial intelligence, green investing, GDP, gardens, working from home"


Machine learning finds use in creating sharper maps of ‘ecosystem’ lines in the ocean – Firstpost

EOS | Jul 01, 2020 14:54:08 IST

On land, it's easy for us to see divisions between ecosystems: A rain forest's fan palms and vines stand in stark relief to the cacti of a high desert. Without detailed data or scientific measurements, we can tell a distinct difference in the ecosystems' flora and fauna.

But how do scientists draw those divisions in the ocean? A new paper proposes a tool to redraw the lines that define an ocean's ecosystems, lines originally penned by the seagoing oceanographer Alan Longhurst in the 1990s. The paper uses unsupervised learning, a machine learning method, to analyze the complex interplay between plankton species and nutrient fluxes. As a result, the tool could give researchers a more flexible definition of ecosystem regions.

Using the tool on global modeling output suggests that the ocean's surface has more than 100 different regions, or as few as 12 if aggregated, simplifying the 56 Longhurst regions. The research could complement ongoing efforts to improve fisheries management and satellite detection of shifting plankton under climate change. It could also direct researchers to more precise locations for field sampling.

A sea turtle in the aqua blue waters of Hawaii. Image: Rohit Tandon/Unsplash

Coccolithophores, diatoms, zooplankton, and other planktonic life-forms float on much of the ocean's sunlit surface. Scientists monitor plankton with long-term sampling stations and peer at their colors by satellite from above, but they don't have detailed maps of where plankton lives worldwide.

Models help fill the gaps in scientists' knowledge, and the latest research relies on an ocean model to simulate where 51 types of plankton amass at the ocean's surface worldwide. The research then applies the new classification tool, called the systematic aggregated ecoprovince (SAGE) method, to discern where neighborhoods of like-minded plankton and nutrients appear.

SAGE relies, in part, on a type of machine learning algorithm called unsupervised learning. The algorithm's strength is that it searches for patterns unprompted by researchers.

To compare the tool to a simple example, if scientists told an algorithm to identify shapes in photographs like circles and squares, the researchers could supervise the process by telling the computer what a square and circle looked like before it began. But in unsupervised learning, the algorithm has no prior knowledge of shapes and will sift through many images to identify patterns of similar shapes itself.
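The circles-and-squares analogy maps directly onto a few lines of code. Here is an illustrative unsupervised-clustering sketch (mine, far simpler than SAGE, with invented plankton-abundance data):

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

# Synthetic data: 300 ocean grid cells, each described by the relative
# abundance of 5 plankton types (values invented for illustration)
gyre    = rng.normal([0.8, 0.1, 0.05, 0.03, 0.02], 0.05, size=(150, 5))
coastal = rng.normal([0.1, 0.3, 0.20, 0.25, 0.15], 0.05, size=(150, 5))
cells = np.vstack([gyre, coastal])

# Unsupervised learning: no labels are given; the algorithm recovers the
# two "provinces" purely from structure in the data
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(cells)
print(np.bincount(labels))  # roughly 150 cells per province
```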

Using an unsupervised approach gives SAGE the freedom to let patterns emerge that the scientists might not otherwise see.

"While my human eyes can't see these different regions that stand out, the machine can," said first author and physical oceanographer Maike Sonnewald of Princeton University. "And that's where the power of this method comes in." The method could be used more broadly by geoscientists in other fields to make sense of nonlinear data, Sonnewald said.

A machine-learning technique developed at MIT combs through global ocean data to find commonalities between marine locations, based on how phytoplankton species interact with each other. Using this approach, researchers have determined that the ocean can be split into over 100 types of provinces, and 12 megaprovinces, that are distinct in their ecological makeup.

Applying SAGE to model data, the tool noted 115 distinct ecological provinces, which can then be boiled down into 12 overarching regions.

One region appears in the center of nutrient-poor ocean gyres, whereas other regions show productive ecosystems along the coast and equator.

"You have regions that are kind of like the regions you'd see on land," Sonnewald said. "One area in the heart of a desert-like region of the ocean is characterized by very small cells. There's just not a lot of plankton biomass. The region that includes Peru's fertile coast, however, has a huge amount of stuff."

If scientists want more distinctions between communities, they can adjust the tool to see the full 115 regions. But having only 12 regions can be powerful too, said Sonnewald, because it "demonstrates the similarities between the different [ocean] basins." The tool was published in a recent paper in the journal Science Advances.

Oceanographer Francois Ribalet at the University of Washington, who was not involved in the study, hopes to apply the tool to field data when he takes measurements on research cruises. He said identifying unique provinces gives scientists a hint of how ecosystems could react to changing ocean conditions.

"If we identify that an organism is very sensitive to temperature, so then we can start to actually make some predictions," Ribalet said. Using the tool will help him tease out an ecosystem's key drivers and how it may react to future ocean warming.

Jenessa Duncombe. Text © 2020 AGU.

This story has been republished from Eos under the Creative Commons 3.0 license. Read the original story.


Read more:
Machine learning finds use in creating sharper maps of 'ecosystem' lines in the ocean - Firstpost

Solving problems by working together: Could quantum computing hold the key to Covid-19? – ITProPortal

Given the enormous potential for quantum computing to change the way we forecast, model and understand the world, many are beginning to question whether it could have helped to better prepare us all for a global pandemic such as the Covid-19 crisis. Governments, organisations and the public are continuing the quest for answers about when this crisis will end and how we can find a way out of the current state of lockdown, and we are all continuing to learn through incremental and experimental steps. It certainly seems plausible that the high compute simulation capabilities of our most revolutionary technology could hold some of the answers and enable us to respond in a more coherent and impactful way.

Big investments have already been made in quantum computing, as countries and companies battle to create the first quantum supercomputer, so they can harness the power of this awesome technology. The World Economic Forum has also recognised the important role that this technology will play in our future, and has a dedicated Global Future Council to drive collaboration between public and private sector organisations engaged in the development of quantum computing. Although it's unlikely to result in any overnight miracles, it's understandable that many are thinking about whether these huge efforts and investments can be turned towards the mutual challenge we face in finding a solution to the Covid-19 pandemic.

There are already some ground-breaking use-cases for quantum computing within the healthcare industry. Where in the past some scientific breakthroughs, such as the discovery of penicillin, came completely by accident, quantum computing puts scientists in a much stronger position to find what they are looking for, faster. Quantum raises capacity to such a high degree that it would be possible to model penicillin using just a third of the processing power a classical computer would require to do the job, meaning it can do more with less, at greater speed.

In the battle against Covid-19, the US Department of Energy's Oak Ridge National Laboratory (ORNL) is already using quantum supercomputers in its search for drug compounds that can treat the disease. IBM has also been using quantum supercomputers to run simulations on thousands of compounds to try and identify which of them is most likely to attach to the spike that Covid-19 uses to inject genetic material into healthy cells, and thereby prevent it. It has already emerged with 77 promising drugs that are worth further investigation and development, progress that would have taken years if traditional computing power had been used.

Other businesses are likely to be keen to follow in the footsteps of these examples, and play their own part in dealing with the crisis, but to date it's only been the world's largest organisations that have been using quantum power. At present, many businesses simply don't have the skills and resources needed to fabricate, verify, architect and launch a large-scale quantum computer on their own.

It will be easier to overcome these barriers, and enable more organisations to start getting to work with quantum computing, if they open themselves up to collaboration with partners, rather than trying to go it alone. Instead of locking away their secrets, businesses must be willing to work within an open ecosystem; finding mutually beneficial partnerships will make it much more realistic to drive things forward.

The tech giants have made a lot of early progress with quantum, and partnering with them could prove extremely valuable. Google, for example, claims to have developed a machine that can solve a problem in 200 seconds that would take the world's fastest supercomputer 10,000 years; imagine adding that kind of firepower to your computing arsenal. Google, IBM and Microsoft have already got the ball rolling by creating their own quantum partner networks. IBM Q and Microsoft Quantum Network bring together start-ups, universities, research labs, and Fortune 500 companies, enabling them to enjoy the benefits of exploring and learning together. The Google AI quantum initiative brings together strong academic support along with start-up collaboration on open source frameworks and tools in their lab. Collaborating in this manner, businesses can potentially play their own part in solving the Covid-19 crisis, or preventing future pandemics from doing as much damage.

Those that are leading the way in quantum computing are taking a collaborative approach, acknowledging that no one organisation holds all the answers or all the best ideas. This approach will prove particularly beneficial as we search for a solution to the Covid-19 crisis: it's in everyone's interests to find an exit to the global shutdown and build knowledge that means we are better prepared for future outbreaks.

Looking at the bigger picture, despite all the progress that is being made with quantum, traditional computing will still have an important role to play in the short to medium term. Strategically, it makes sense to have quantum as the exploratory left side of the brain, while traditional systems remain in place for key business-as-usual functions. If they can think about quantum-related work in this manner, businesses should begin to feel more comfortable making discoveries and breakthroughs together. This will allow them to speed up the time to market so that ideas can be explored, and new ground broken, much faster than ever before, and that's exactly what the world needs right now.

Kalyan Kumar, CVP & CTO, IT Services, HCL Technologies

Go here to read the rest:
Solving problems by working together: Could quantum computing hold the key to Covid-19? - ITProPortal

Menten AI's combination of buzzword bingo brings AI and quantum computing to drug discovery – TechCrunch

Menten AI has an impressive founding team and a pitch that combines some of the hottest trends in tech to pursue one of the biggest problems in healthcare: new drug discovery. The company is also $4 million richer with a seed investment from firms including Uncork Capital and Khosla Ventures to build out its business.

Menten AI's pitch to investors was the combination of quantum computing and machine learning to discover new drugs that sit between small molecules and large biologics, according to the company's co-founder Hans Melo.

A graduate of the Y Combinator accelerator, which also participated in the round alongside Social Impact Capital*, Menten AI looks to design proteins from scratch. It's a heavier lift than some might expect, because, as Melo said in an interview, it takes a lot of work to make an actual drug.

Menten AI is working with peptides, which are strings of amino acid chains similar to proteins that have the potential to slow aging, reduce inflammation and get rid of pathogens in the body.

"As a drug modality [peptides] are quite new," says Melo. "Until recently it was really hard to design them computationally and people tried to focus on genetically modifying them."

Peptides have the benefit of getting through membranes and into cells where they can combine with targets that are too large for small molecules, according to Melo.

Most drug targets are not addressable with either small molecules or biologics, according to Melo, which means there's a huge untapped potential market for peptide therapies.

Menten AI is already working on a COVID-19 therapeutic, although the company's young chief executive declined to disclose too many details about it. Another area of interest is in neurological disorders, where the founding team members have some expertise.

Image of peptide molecules. Image Courtesy: D-Wave

While Menten AI's targets are interesting, the approach that the company is taking, using quantum computing to potentially drive down the cost and accelerate the time to market, is equally compelling for investors.

It's also unproven. Right now, there isn't a quantum advantage to using the novel computing technology versus traditional computing, something that Melo freely admits.

"We're not claiming a quantum advantage, but we're not claiming a quantum disadvantage," is the way the young entrepreneur puts it. "We have come up with a different way of solving the problem that may scale better. We haven't proven an advantage."

Still, the company is an early indicator of the kinds of services quantum computing could offer, and it's with that in mind that Menten AI partnered with some of the leading independent quantum computing companies, D-Wave and Rigetti Computing, to work on applications of their technology.

The emphasis on quantum computing also differentiates it from larger publicly traded competitors like Schrödinger and Codexis.

So does the pedigree of its founding team, according to Uncork Capital investor Jeff Clavier. "It's really the unique team that they formed," Clavier said of his decision to invest in the early-stage company. "There's Hans the CEO who is more on the quantum side; there's Tamas [Gorbe] on the bio side and there's Vikram [Mulligan] who developed the research. It's kind of a unique fantastic team that came together to work on the opportunity."

Clavier has also acknowledged the possibility that it might not work.

"Can they really produce anything interesting at the end?" he asked. "It's still an early-stage company and we may fall flat on our face or they may come up with really new ways to make new peptides."

It's probably not a bad idea to take a bet on Melo, who worked with Mulligan, a researcher from the Flatiron Institute focused on computational biology, to produce some of the early research into the creation of new peptides using D-Wave's quantum computing.

Novel peptide structures created using D-Wave's quantum computers. Image Courtesy: D-Wave

While Melo and Mulligan were the initial researchers working on the technology that would become Menten AI, Gorbe was added to the founding team to give the company some exposure to the world of chemistry and enzymatic applications for its new virtual protein manufacturing technology.

The gamble paid off in the form of pilot projects (also undisclosed) that focus on the development of enzymes for agricultural applications and pharmaceuticals.

"At the end of the day what they're doing is they're using advanced computing to figure out what is the optimal placement of those clinical compounds in a way that is less based on those sensitive tests and more bound on those theories," said Clavier.

*This post was updated to add that Social Impact Capital invested in the round. Khosla, Social Impact, and Uncork each invested $1 million into Menten AI.

Visit link:
Menten AIs combination of buzzword bingo brings AI and quantum computing to drug discovery - TechCrunch

Better encryption for wireless privacy at the dawn of quantum computing – UC Riverside

For the widest possible mobile Internet coverage, wireless communications are essential. But because wireless transmissions are open by nature, information security poses a unique challenge. The widely deployed methods for information security are based on digital encryption, which in turn requires two or more legitimate parties to share a secret key.

The distribution of a secret key via zero-distance physical contact is inconvenient in general and impossible in situations where too little time is available. The conventional solution to this challenge is to use the public-key infrastructure, or PKI, for secret key distribution. Yet PKI is based on the computational hardness of problems such as factoring, which is known to be increasingly threatened by quantum computing. Some predictions suggest that such a threat could become a reality within 15 years.

In order to provide Internet coverage for every possible spot on the planet, such as remote islands and mountains, a low-orbiting satellite communication network is rapidly being developed. A satellite can transmit or receive streams of digital information to or from terrestrial stations. But the geographical exposure of these streams is large, making them easily prone to eavesdropping. For applications such as satellite communications, how can we guarantee information security even if quantum computers become readily available in the near future?

Yingbo Hua's Lab of Signals, Systems and Networks in the Department of Electrical and Computer Engineering, which has been supported in part by the Army, aims to develop reliable and secure transmission, or RESET, schemes for future wireless networks. RESET guarantees that the secret information is not only received reliably by the legitimate receiver but also secure from an eavesdropper with any channel superiority.

In particular, Hua's lab has developed a physical layer encryption method that could be immune to the threat of quantum computing. They are actively engaged in further research of this and other related methods.

In the physical layer encryption proposed by Hua's lab, only partial information is extracted from randomized matrices, such as the principal singular vector of each matrix, modulated by a secret physical feature approximately shared by the legitimate parties. The principal singular vector of a matrix is not a reversible function of the matrix. This seems to suggest that a quantum computer is unable to perform a task that is rather easy on a classical computer. If this is true, then the physical layer encryption should be immune from attacks via quantum computing. Unlike number-theory-based encryption methods, which are vulnerable to quantum attacks, Hua's physical layer encryption is based on continuous encryption functions that are still to be developed.
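
The one-way property described above can be illustrated with a toy numpy sketch (my own illustration, not Hua's actual scheme): the principal singular vector is easy to compute from a matrix, but many different matrices share the same principal singular vector, so the matrix cannot be recovered from it.

# Toy illustration of the irreversibility claim: computing the principal
# singular vector is easy, but it does not determine the matrix it came from.
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(4, 4))
u1 = np.linalg.svd(A)[0][:, 0]      # principal left singular vector of A

B = 3.0 * A                         # a different matrix...
u1_b = np.linalg.svd(B)[0][:, 0]    # ...with the same principal direction
same = np.allclose(u1, u1_b) or np.allclose(u1, -u1_b)
print(same)                         # True: infinitely many matrices map to this vector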

Read the original here:
Better encryption for wireless privacy at the dawn of quantum computing - UC Riverside

Quantum Computing gains as it kicks off commercialization phase with its Mukai quantum computing software – Proactive Investors USA & Canada

Set up at the beginning of 2018, Quantum bills itself as the first publicly traded pure-play quantum computing company

Quantum Computing Inc, an advanced technology company developing quantum-ready applications and tools, said Wednesday that it is set to gain as it has entered the key commercialization phase as the only public pure-play in the quantum computing space.

The Leesburg, Virginia-based company has kicked off the official commercial launch of its Mukai quantum computing software execution platform. Last week, the company introduced a new trial access program that demonstrates Mukai's power to solve real-world problems.

Quantum's stock recently traded 1.3% higher at $3.91 a share in New York.

READ: Quantum Computing launches free trial of Mukai quantum computing application platform

According to the company, the trial will enable developers to discover how they can migrate their existing applications to quantum-ready solutions and realize superior performance even when running their solutions on classical Intel or AMD processor-based computers.

"The trial is designed to encourage and facilitate quantum application development to solve real-world problems at breakthrough speed, not tomorrow, but today," the company said in a statement.

"There are only a handful of quantum software experts in the world, and fortunately for us, this includes Mike and Steve," commented Quantum CEO Robert Liscouski. "They have been doing an outstanding job building out our software engineering teams, developing our first quantum-ready products, and preparing QCI for commercial success."

Quantum kicked off 2020 with the public release of its first quantum-ready software product, the QCI Quantum Asset Allocator (QAA). This solution is designed to help portfolio managers maximize returns by calculating their optimal asset allocations, said the company.

QAA is the first of a series of Quantum products that will leverage quantum techniques to provide differentiated performance on both classical computers and on a variety of early-stage quantum computers, added the company. Naturally, Quantum is looking to convert its QAA beta users into long-term customers.

"The core of our strategy has been to anticipate the direction of the market and be ahead of it by offering unique solutions that establish QCI as a market leader," said Liscouski. "We will be driven by the market, but in turn will drive the market by helping our customers realize their quantum-enabled future."

The company said that while quantum computing is typically a high-dollar investment given the "sophisticated and costly hardware," Mukai makes quantum application development affordable and scalable compared to running solutions on intermediate quantum computers, like those offered by D-Wave, Fujitsu, IBM and Rigetti.

Mukai addresses the quantum computing market which is tipped to grow at a 23.2% compound annual growth rate to $9.1 billion by 2030, according to Tractica.

Contact the author Uttara Choudhury at [emailprotected]

Follow her on Twitter: @UttaraProactive

Link:
Quantum Computing gains as it kicks off commercialization phase with its Mukai quantum computing software - Proactive Investors USA & Canada

Quantum Computing Market: In-Depth Market Research and Trends Analysis till 2030 – Cole of Duty

Prophecy Market Insights' Quantum Computing market research report provides a comprehensive, 360-degree analysis of the targeted market, which helps stakeholders to identify the opportunities as well as the challenges during the COVID-19 pandemic across the globe.

Quantum Computing Devices Market reports provide in-depth analysis of Top Players, Geography, End Users, Applications, Competitor Analysis, Revenue, Financial Analysis, Market Share, COVID-19 Analysis, Trends and Forecast 2020-2029. It incorporates a market evolution study, involving the current scenario, growth rate, and capacity inflation prospects, based on Porter's Five Forces and DROT analyses.

Get Sample Copy of This Report @ https://www.prophecymarketinsights.com/market_insight/Insight/request-sample/571

An executive summary provides the market's definition, application, overview, classifications, product specifications, manufacturing processes, raw materials, and cost structures.

Market Dynamics offers drivers, restraints, challenges, trends, and opportunities of the Quantum Computing market.

Detailed analysis of the COVID-19 impact will be given in the report, as our analysts and research associates are working hard to understand the impact of the COVID-19 disaster on many corporations and sectors, and to help our clients in taking excellent business decisions. We acknowledge everyone who is doing their part in this financial and healthcare crisis.

Segment-level analysis in terms of type, product, geography, demography, etc., along with market size forecasts

Segmentation Overview:

The Quantum Computing research study comprises 100+ market data tables, graphs, figures and pie charts to support a detailed analysis of the market. The predictions in the market report have been derived using proven research techniques, methodologies, and assumptions. This Quantum Computing market report states the market overview and historical data along with the size, growth, share, demand, and revenue of the global industry.

Request Discount @ https://www.prophecymarketinsights.com/market_insight/Insight/request-discount/571

Regional and country-level analysis: different geographical areas are studied in depth and an economic scenario has been offered to support new entrants, leading market players, and investors to regulate emerging economies. The top producers and consumers focus on production, product capacity, value, consumption, growth opportunity, and market share in these key regions, covering

The report includes a comprehensive list of key market players along with their market overview, product protocol, key highlights, key financial issues, SWOT analysis, and business strategies. It dedicatedly offers helpful solutions for players to increase their client base on a global scale and expand their favour significantly over the forecast period. The report also serves strategic decision-making solutions for the clients.

Competitive landscape analysis provides mergers and acquisitions, collaborations, new product launches, heat map analysis, and market presence and specificity analysis.

Quantum Computing Market Key Players:

Wave Systems Corp, 1QB Information Technologies Inc, QC Ware, Corp, Google Inc, QxBranch LLC, Microsoft Corporation, International Business Machines Corporation, Huawei Technologies Co., Ltd, ID Quantique SA, and Atos SE.

The study analyses the manufacturing and processing requirements, project funding, project cost, project economics, profit margins, predicted returns on investment, etc. With the tables and figures, the report provides key statistics on the state of the industry and is a valuable source of guidance and direction for companies and individuals interested in the market.

Stakeholders Benefit:

About us:

Prophecy Market Insights is a specialized market research, analytics, marketing/business strategy, and solutions firm that offers strategic and tactical support to clients for making well-informed business decisions and identifying and achieving high-value opportunities in the target business area. We also help our clients to address business challenges and provide the best possible solutions to overcome them and transform their business.

Contact Us:

Mr Alex (Sales Manager)

Prophecy Market Insights

Phone: +1 860 531 2701

Email: [emailprotected]

Read this article:
Quantum Computing Market: In-Depth Market Research and Trends Analysis till 2030 - Cole of Duty

Is quantum computing ready to leap into the real world? – ARNnet

Market research firm IDC predicts that by 2023, 25% of Fortune 500 companies will gain a competitive advantage from quantum computing.

It's a bold prediction given the current dearth of real-world examples of quantum computing in action. However, there's plenty of industry activity to back up IDC's forecast. In fact, early this year at the Consumer Electronics Show, the biggest buzz wasn't the newest smartphone, wearable device or autonomous-driving technology, but rather unprecedented computing power based on an area of quantum physics Albert Einstein described as "spooky action at a distance."

While quantum computing hasn't yet factored into solving worldwide problems such as the coronavirus pandemic, that is exactly the type of problem quantum has the potential to address. That potential will turn into a reality, according to IBM, one of a handful of tech giants leading the quantum charge. "This is the decade that quantum computing gets real," says Katie Pizzolato, director at IBM QStart.

For that reason, Pizzolato said, it was important to keep quantum public-facing rather than keep it a technology buried in research facilities. "We wanted to get quantum out of the labs and into the real world," she said in reference to IBM's strong presence at CES.

Companies such as Google, Microsoft, D-Wave and Rigetti are also eager to move quantum forward, and based on IDC's recent report Quantum Computing Adoption Trends: 2020 Survey Findings, the technology is building momentum.

According to responses from 520 IT and line-of-business professionals, quantum computing budgets and implementations will increase in the next 18-24 months. Half of all respondents to the IDC survey reported that funds allocated for quantum computing accounted for just 0-2% of the annual IT infrastructure budget in 2019, but will account for 7-10% in the next 24 months. For companies with more than 10,000 employees, the spending increase is more dramatic: more than half of respondents will spend between 9% and 14% on quantum technology over the next two years.

Respondents to the IDC survey were clear where they are focusing their attention: 65% of respondents use or plan to use cloud-based quantum computing, followed by 45% who use or plan to use quantum algorithms (which includes simulators, optimizations, artificial intelligence, machine learning and deep learning). Quantum networks (44%), hybrid quantum computing (40%) and quantum cryptography (33%) round out the top five, according to the IDC survey.

Heather West, IDC senior research analyst for Infrastructure Systems, Platforms and Technology and one of the report's authors, says that quantum computing excels at solving large problems involving vast amounts of data. The initial areas of focus will be AI, business intelligence and overall productivity and efficiency, according to the IDC report.

"Very few companies have actually operationalized [quantum computing]. The skillsets are so advanced, and few people really understand quantum," West said, adding that we're still at the experimentation stage with algorithms as companies also look to overcome challenges such as cost, security and data transfers between vendors. West points out, however, that there are already practical use cases in areas such as manufacturing and finance.

Right now, West says, the focus is on how to optimize processes. However, in the future, quantum will be applied to larger problems such as how to address climate change and cure diseases.

As IDC's West says, quantum computing isn't without its challenges. IDC cites complex technology, skillset limitations, a lack of available resources, cost, security, and data transfer among vendors as barriers to adoption. With so many challenges, it's not surprising that big names dominate when respondents select vendors to support quantum technology initiatives. Google tops the list with 37% of respondents citing it as the vendor of choice, followed by Microsoft with 32%, IBM with 27% and Intel with 23%.

What makes quantum computing more powerful than classical computing is that rather than relying on binary bits (i.e., either a 1 or a 0), quantum computing uses qubits. Qubits can process more data because they can exist in many possible combinations of 1 and 0 simultaneously, known as superposition, processing an enormous number of outcomes.

In addition to superposition, pairs of qubits can be "entangled." This entanglement is what makes quantum computers as powerful as they are. What makes it even more intriguing is that no one knows how or why it works, prompting that "spooky action" description from Einstein.

In classical computing, doubling the number of bits gives you, as you'd expect, twice the computing power. However, thanks to entanglement, adding more qubits gives you exponentially more processing power.
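
A back-of-the-envelope sketch (my own arithmetic, not from the article) shows why that scaling matters: an n-qubit state is described by 2**n complex amplitudes, so every added qubit doubles the memory a classical simulator needs to track it.

# Each added qubit doubles the state space a classical simulator must track.
for n in (10, 20, 30, 40, 50):
    amplitudes = 2 ** n
    gib = amplitudes * 16 / 2 ** 30   # 16 bytes per complex128 amplitude
    print(f"{n} qubits -> 2^{n} = {amplitudes:.2e} amplitudes, ~{gib:.3g} GiB")

At around 50 qubits the naive simulation needs petabytes of memory, which is roughly where the "50-qubit" supercomputer threshold cited later in this piece comes from.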

If processing power potential is the good news on qubits, their fragile nature is the bad news. "Not all qubits are created equal," IBM's Pizzolato says. Qubits are unpredictable and susceptible to environmental noise and errors. After an error they fall back to a binary state of 1 or 0, so the longer the calculation runs without an error, the greater the calculation. "The goal is to protect against errors to solve the most challenging problems," Pizzolato says.

How common are these errors? A slight fluctuation in temperature or vibration can cause what's known as "decoherence." And once a qubit is in decoherence, its calculation has failed and must be run again. For that reason, quantum computers are housed in environments of near absolute zero and with little outside disruption.

More qubits help. "The 50-qubit range is when you start to supersede what you can achieve on a supercomputer," says Pizzolato. IBM last fall announced its 14th quantum computer, a 53-qubit system. Its previous quantum computers were 20 qubits. However, quantum is more than qubits. "Hardware is at the center of the circle, but then you have the algorithms and the applications," says Pizzolato. More sophisticated algorithms are critical to quantum computing's real-world success. "Quantum is all about the algorithms you can run and the complexity of those algorithms," she says.

Skills gaps are a challenge for IT in general. With quantum computing, it's magnified. Where will the quantum development come from? Peter Rutten, research director and one of the authors of the IDC report, says that the algorithms and application development will come from three distinct personas:

Developers who are intrigued with quantum computing, developers with a physics background (because there are not many jobs in physics) and those working in high-performance-computing operations. "It's a seamless transition from HPC algorithms to quantum," Rutten says.

On the one hand, Google, IBM and others appear to be jostling for position in achieving quantum advantage (the point at which quantum computing can solve a problem faster than classical computing) and quantum supremacy (when quantum computing solves a problem that no conventional computer can solve). In fact, IBM recently publicly refuted Google's claim of achieving quantum supremacy with its 53-qubit computer, its researchers saying that Google failed to fully estimate the resources of a supercomputer, publishing this in an IBM Research blog last October:

Building quantum systems is a feat of science and engineering, and benchmarking them is a formidable challenge," according to an IBM quantum-computing blog. "Googles experiment is an excellent demonstration of the progress in superconducting-based quantum computing, showing state-of-the-art gate fidelities on a 53-qubit device, but it should not be viewed as proof that quantum computers are supreme over classical computers.

On the other hand, despite the top-tier vendors seemingly jockeying for quantum positions, IDC's Rutten said it's not about competitors going head-to-head. "It's hard to compare. No one can tell you [who's ahead] because they are measuring progress in different ways," he says. "The notion of quantum being a race is silly."

IDC's West concurs, saying that quantum advances will come from the developer community and technology partnerships. "It's not so much a race to the end, because there may not be just one answer."

For its part, IBM has a network of 100 partnerships from the commercial (e.g., Goldman Sachs, ExxonMobil, Accenture and others), academic (e.g., MIT, Virginia Tech, Johns Hopkins and dozens of others), startup, government and research sectors.

Even with the likes of Google, IBM and Microsoft pushing quantum computing to go from advantage to supremacy, no one knows where the big innovation will come from, Pizzolato says. "The MVP is probably a guy in a lab."


Read more here:
Is quantum computing ready to leap into the real world? - ARNnet

Spain Introduces the World’s First Quantum Phase Battery – News – All About Circuits

By now, we're no strangers to the quantum computing hype. When (or rather, if) they are successfully developed and deliver on their promised potential, quantum computers will be able to solve problems and challenges that would otherwise require hundreds of years or more for current classical computer technology to solve.

In what could be a massive step for quantum computing, researchers from the University of the Basque Country claim to have developed the world's first quantum phase battery.

Today, batteries are ubiquitous, with lithium-ion batteries being the most common among them, although alternatives do exist. These batteries convert chemical energy into a voltage that can provide power to an electronic circuit.

In contrast, quantum technologies feature circuits based on superconducting materials through which a current can flow without any applied voltage, negating the need for classic chemical batteries. In quantum technologies, the current is induced from a phase difference of the wave function of the quantum circuit, related to the wave nature of matter.

A quantum device that can provide a persistent phase difference can be used as a quantum phase battery and induce supercurrents in a quantum circuit, powering it.
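
The article gives no formulas, but the standard Josephson relation makes the idea concrete: in a so-called phi0-junction, the supercurrent carries a built-in phase offset, so current flows even with no externally imposed phase. A rough Python sketch of that relation, with made-up parameter values:

import numpy as np

Ic = 1e-6          # critical current in amperes (assumed value)
phi0 = np.pi / 4   # built-in phase offset supplied by the "battery" (assumed)

def supercurrent(phi):
    # Anomalous Josephson current-phase relation: I = Ic * sin(phi + phi0)
    return Ic * np.sin(phi + phi0)

print(supercurrent(0.0))  # nonzero at phi = 0: the junction itself drives current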

This is what the researchers set out to achieve: creating such a quantum device, building on an idea first conceived in 2015 by Sebastian Bergeret from the Mesoscopic Physics Group at the Materials Physics Center. Along with Francesco Giazotto and Elia Strambini from the NEST-CNR Institute in Pisa, the group claims to have built the world's first functional quantum phase battery.

Bergeret and Tokatly's idea, in short, involves a combination of superconducting and magnetic materials with an intrinsic relativistic effect known as spin-orbit coupling. On top of this idea, Giazotto and Strambini identified a suitable material combination that allowed them to fabricate their quantum phase battery.

Their quantum phase battery consists of an n-doped indium arsenide (InAs) nanowire, which forms the core of the cell, also known as the pile, while aluminum superconducting leads act as poles. The battery is charged by applying an external magnetic field, which can then be turned off.

If quantum batteries are ever to be realized, they could bring significant benefits over their chemical cousins. Among other things, quantum batteries could offer vastly better thermodynamic efficiency and ultra-fast charging times, making them perfect for next-gen applications like electric vehicles.

See the article here:
Spain Introduces the World's First Quantum Phase Battery - News - All About Circuits

Physicists Just Quantum Teleported Information Between Particles of Matter – ScienceAlert

By making use of the 'spooky' laws behind quantum entanglement, physicists think they have found a way to make information leap between a pair of electrons separated by distance.

Teleporting fundamental states between photons, massless particles of light, is quickly becoming old news, a trick we are still learning to exploit in computing and encrypted communications technology.

But what the latest research has achieved is quantum teleportation between particles of matter, electrons, something that could help connect quantum computing with the more traditional electronic kind.

"We provide evidence for 'entanglement swapping,' in which we create entanglement between two electrons even though the particles never interact, and 'quantum gate teleportation,' a potentially useful technique for quantum computing using teleportation," says physicist John Nichol from the University of Rochester in New York.

"Our work shows that this can be done even without photons."

Entanglement is physics jargon for what seems like a pretty straightforward concept.

If you buy a pair of shoes from a shop and leave one behind, you'll automatically know which foot it belongs to the moment you get home. The shoes are, in a manner of speaking, entangled.

If the shopkeeper randomly pulls out its matching partner when you return, you'll think they either remembered your sale, made a lucky guess, or were perhaps a little 'spooky' in their prediction.

The real weirdness arises when we imagine your lonely shoe as being both left and right at the same time, at least until you look at it. At that very moment, the shoe's partner back at the shop also snaps into shape, as if your sneaky peek teleported across that distance.

It's a kind of serendipitous exchange that Einstein felt was a little too spooky for comfort. Nearly a century after physicists raised the possibility, we now know teleportation between entangled particles is how the Universe works on a fundamental level.

While it's not exactly a Star Trek-type teleportation that could beam whole objects across space, the mathematics describing this information jump are mighty useful in carrying out special kinds of calculations in computing.

Typical computer logic is made up of a binary language of bits, labelled either 1s or 0s. Quantum computing is built with qubits that can occupy both states at once, providing far greater possibilities that classical technology can't touch.
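
A tiny numpy sketch (mine, not the Rochester experiment) shows the correlation the shoe analogy describes. A Bell pair assigns equal amplitude to the |00> and |11> states; sampling measurement outcomes from it always yields matching bits:

import numpy as np

rng = np.random.default_rng(2)
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)  # (|00> + |11>) / sqrt(2)

probs = np.abs(bell) ** 2                   # Born rule over |00>,|01>,|10>,|11>
for outcome in rng.choice(4, size=10, p=probs):
    a, b = divmod(outcome, 2)               # one bit per particle
    print(a, b)                             # always "0 0" or "1 1", never mixed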

The problem is the Universe is like a big jumble of shoes, all threatening to turn your delicate game of 'guess which foot' into a nightmare gamble the moment any qubit interacts with its environment.

Manipulating photons to transmit their entangled states is made easier thanks to the fact they can be quickly separated at light speed over huge distances through a vacuum or down an optical fibre.

But separating entangled masses such as pairs of electrons is more of a challenge, given their clunky interactions as they bounce along are almost certain to ruin their mathematically pure quantum state.

It's a challenge well worth the effort, though.

"Individual electrons are promising qubits because they interact very easily with each other, and individual electron qubits in semiconductors are also scalable," saysNichol.

"Reliably creating long-distance interactions between electrons is essential for quantum computing."

To achieve it, the team of physicists and engineers took advantage of some strange fine print in the laws that govern the ways the fundamental particles making up atoms and molecules hold their place.

Any two electrons that share the same quantum spin state can't occupy the same spot in space. But there is a bit of a loophole that says nearby electrons can swap their spins, almost as if your feet could swap shoes if you bring them close enough.

The researchers had previously shown that this exchange can be manipulated without needing to move the electrons at all, presenting a potential method for teleportation.
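
In matrix form, that exchange is just the two-qubit SWAP operation; a minimal numpy sketch (an illustration, not the paper's actual protocol) shows the states trading places without the particles moving:

import numpy as np

# SWAP in the |00>, |01>, |10>, |11> basis
SWAP = np.array([[1, 0, 0, 0],
                 [0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1]])

up_down = np.array([0, 1, 0, 0])  # first spin up, second spin down: |01>
print(SWAP @ up_down)             # [0 0 1 0] = |10>: the spins have traded states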

This latest advance helps bring the process closer to technological reality, overcoming hurdles that would connect quantum weirdness with existing computing technology.

"We provide evidence for 'entanglement swapping,' in which we create entanglement between two electrons even though the particles never interact, and 'quantum gate teleportation,' a potentially useful technique for quantum computing using teleportation," says Nichol.

"Our work shows that this can be done even without photons."

Of course, we're still some way off replacing photons with electrons for this kind of quantum information transfer. The researchers haven't gone as far as measuring the states of electrons themselves, meaning there could still be all kinds of interference to iron out.

But having strong evidence of the possibility of teleportation between electrons is an encouraging sign of the possibilities open to future engineers.

This research was published in Nature Communications.

Read more from the original source:
Physicists Just Quantum Teleported Information Between Particles of Matter - ScienceAlert

Quantum Computing Market Size, Analysis, Trends and Segmented Data by Top Companies and Opportunities 2020-2027 – Cole of Duty

New Jersey, United States: The latest research study on the Quantum Computing market, added by Verified Market Research, offers details on current and future growth trends pertaining to the business, besides information on myriad regions across the geographical landscape of the Quantum Computing market. The report also expands on comprehensive details regarding supply and demand analysis, participation by major industry players, and market share growth statistics of the business sphere.

Global Quantum Computing Market was valued at USD 89.35 million in 2016 and is projected to reach USD 948.82 million by 2025, growing at a CAGR of 30.02% from 2017 to 2025.
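
As a quick sanity check of those figures (the dollar values are the report's own; only the standard compound-growth formula below is mine), USD 89.35 million compounded at 30.02% for the nine years from 2016 to 2025 does land close to USD 948 million:

# future value = present value * (1 + CAGR) ** years
present, cagr, years = 89.35e6, 0.3002, 9   # 2016 base, 30.02%, 2016 -> 2025
print(present * (1 + cagr) ** years / 1e6)  # ~948.8 million USD, matching 948.82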

Download Sample Copy of Quantum Computing Market Report Study 2020-2027 @ https://www.verifiedmarketresearch.com/download-sample/?rid=24845&utm_source=COD&utm_medium=007

The research report on the Quantum Computing market provides a granular assessment of this business vertical and includes information concerning the market tendencies such as revenue estimations, current remuneration, market valuation, and market size over the estimated timeframe.

Major Players Covered in this Report are:

The research report is broken down into chapters, which are introduced by the executive summary. It's the introductory part of the chapter, which includes details about global market figures, both historical and estimates. The executive summary also provides a brief about the segments and the reasons for the progress or decline during the forecast period. The insightful research report on the global Quantum Computing market includes Porter's five forces analysis and SWOT analysis to understand the factors impacting consumer and supplier behavior.

The scope of the Report:

The report segments the global Quantum Computing market on the basis of application, type, service, technology, and region. Each chapter under this segmentation allows readers to grasp the nitty-gritty of the market. A magnified look at the segment-based analysis is aimed at giving the readers a closer look at the opportunities and threats in the market. It also addresses political scenarios that are expected to impact the market in both small and big ways. The report on the global Quantum Computing market examines changing regulatory scenarios to make accurate projections about potential investments. It also evaluates the risk for new entrants and the intensity of the competitive rivalry.

Ask for Discount on Quantum Computing Market Report @ https://www.verifiedmarketresearch.com/ask-for-discount/?rid=24845&utm_source=COD&utm_medium=007

As per the regional scope of the Quantum Computing market:

Highlights of the report:

Key Questions Answered in the report:

Learn More about this report @ https://www.verifiedmarketresearch.com/product/Quantum-Computing-Market/?utm_source=COD&utm_medium=007

About us:

Verified Market Research is a leading global research and consulting firm serving over 5,000 customers. Verified Market Research provides advanced analytical research solutions while offering information-enriched research studies. We offer insight into strategic and growth analyses, data necessary to achieve corporate goals, and critical revenue decisions.

Our 250 analysts and SMEs offer a high level of expertise in data collection and governance, and use industrial techniques to collect and analyze data on more than 15,000 high-impact and niche markets. Our analysts are trained to combine modern data collection techniques, superior research methodology, expertise, and years of collective experience to produce informative and accurate research.

Contact us:

Mr. Edwyne Fernandes

US: +1 (650)-781-4080 | UK: +44 (203)-411-9686 | APAC: +91 (902)-863-5784 | US Toll-Free: +1 (800)-7821768

Email: [emailprotected]

Our Trending Reports

Rugged Display Market Size, Growth Analysis, Opportunities, Business Outlook and Forecast to 2026

Quantum Computing Market Size, Growth Analysis, Opportunities, Business Outlook and Forecast to 2026

Sensor Patch Market Size, Growth Analysis, Opportunities, Business Outlook and Forecast to 2026

Data Center Interconnect Market Size, Growth Analysis, Opportunities, Business Outlook and Forecast to 2026

Industrial Lighting Market Size, Growth Analysis, Opportunities, Business Outlook and Forecast to 2026

See the original post here:
Quantum Computing Market Size, Analysis, Trends and Segmented Data by Top Companies and Opportunities 2020-2027 - Cole of Duty

Why Indian IT Professionals Are Looking To Upskill Themselves In Cloud Computing – Analytics India Magazine

The internet boom of recent years has led to growth in the demand for bandwidth from data centres. The lack of access to quality data networks and fully equipped data centres in India has been truly felt among organisations during the novel coronavirus pandemic. To counter the pressure of work-from-home scenarios, Indian organisations are investing many times more than traditional IT spending in cloud infrastructure. In addition, cloud companies are expanding fast in the nation.

Even global companies are flocking towards India. For instance, Oracle has launched its second cloud region in Hyderabad to support customers' demand for enterprise cloud services in India. This follows the launch of its Mumbai cloud region in 2019, making India Oracle's latest nation with multiple cloud regions available. India has also become the next big hot market for internet giants such as Netflix, Spotify, Facebook and Amazon, fuelling demand for cloud professionals who can manage the digital infrastructure.

Amid this boom, training providers are witnessing a surge in enrolment in their information technology training programmes, including those in emerging technologies like cloud and data science. The jobs of the future will need expertise in specific niche skills, and upskilling is the only way to ensure long-term career growth for technologists. Hence, certification programmes are gaining popularity among IT professionals. According to analysts, the COVID-19 lockdown has catalysed the enthusiasm of techies for getting certified. Indian software programmers are going for cloud certifications amid the COVID-19 lockdown, revealed a survey report from TechGig.

Also Read: 10 Leading Courses & Training Programmes For Cloud Computing In India

An extensive understanding of a new-age technology appeared to be the most crucial reason for techies to take certifications. Also, freshers and new joiners are more interested in acquiring certifications than working professionals. Cloud technology, which is enabling communication and remote working amid the present COVID-19 lockdown, is also the preferred option for upskilling among Indian developers, notes TechGig. The preference for cloud came on top of other advanced technologies like artificial intelligence and machine learning.

"In today's unique COVID-19 time, technology is the only string keeping the world together. From cloud computing, which is supporting work-from-home, to artificial intelligence, which is helping banking, retail and other important sectors run operations, and robotics that is helping front-line hospital personnel, new-age technologies are helping the globe stay connected in the existing time. The TechGig survey shows the enthusiasm of Indian developers to upskill on these new-age technologies," said Sanjay Goyal, Vice President & Head of Product and Technology at TechGig.

Finding people with cloud skills is a complex endeavour. Organisations these days are finding it very difficult to hire and retain cloud specialists, particularly in roles requiring advanced cloud skills and cloud architecture. Therefore, companies are giving due importance to both finding and creating the skills in-house so they do not face infrastructure challenges. Also, given the constant introduction of new services from the three major cloud platforms (Google Cloud Platform, Amazon Web Services, Microsoft Azure) and others, cloud training has to be constant so people can stay on top of the technology.

Cloud technology is one of the leading tech domains for upskilling among techies, alongside other technologies like artificial intelligence, machine learning, and quantum computing, which get the highest preference in terms of the need for upskilling. One of the most important findings of the TechGig IT Certification Survey was that 90% of respondents revealed they are planning to obtain an IT certification soon to support and boost their career prospects. That's why the adoption of certification courses is on the rise.

Also Read: 10 Leading Courses & Training Programmes For Cloud Computing In India

Cloud computing has been among the most sought-after skill sets in the world for the last few years, and in 2020 in particular, companies are migrating their infrastructure and apps to cloud platforms. As a consequence, cloud jobs are also growing at a swift pace, making it one of the hottest fields in information technology. Now, the demand for cloud experts has fuelled the need for niche skills, and IT professionals know that well.

It is clear that IT professionals will not face any issue with employment opportunities if they are skilled in the cloud technologies space, particularly on platforms such as AWS, Google Cloud and Microsoft Azure. Consequently, learners are developing skills so they can grab jobs as cloud developers/administrators or system operators for cloud platforms after finishing their training programmes. These platforms are utilised by thousands and thousands of businesses worldwide for hosting their products and services.


Cloud training courses will provide professionals with the opportunity to learn the best techniques and practices in cloud computing and receive live feedback from an expert instructor. Training will help learners take cloud certification exams from vendors such as AWS, Azure or Oracle to get recognised by hiring managers.

The upskilling is spread across advanced classroom training programmes run by specialised institutes like Jigsaw Academy and Great Learning, which have also witnessed a surge in demand for enrolment. Apart from training institutes, learners are also flocking to cheaper and/or free courses from cloud vendors or those found on Udemy. In fact, in a recent survey done by Analytics India Magazine, 76.9% of analytics professionals said they are spending their time on training through self-learning.

While IT professionals not already working with cloud technologies will gain a solid foundation, those with some cloud experience will gain a more structured and hands-on understanding of cloud technologies, including issues such as migration, deployment, integration, platform choice, and architecture.

According to reports, the COVID-19 pandemic has fuelled the desire to get certified, and professionals understand that certification is a need of the hour amid mass layoffs. DevOps, infrastructure-as-a-service, software-as-a-service, automation, agile and software-defined networks are going to be critical for IT professionals to land these jobs. Some platforms are offering interesting courses for learners to build their cloud tech skills, including many free ones.

Also Read: 10 Leading Courses & Training Programmes For Cloud Computing In India


Vishal Chawla is a senior tech journalist at Analytics India Magazine and writes about AI, data analytics, cybersecurity, blockchain and startup ecosystem. Vishal also hosts AIM's video podcast called Simulated Reality- featuring tech leaders, AI experts, and innovative startups of India. Reach out at vishal.chawla@analyticsindiamag.com

More:
Why Indian IT Professionals Are Looking To Upskill Themselves In Cloud Computing - Analytics India Magazine