Wiring the Quantum Computer of the Future: Researchers from Japan and Australia propose a novel 2D design – QS WOW News

The basic units of a quantum computer can be rearranged in 2D to solve typical design and operation challenges. Efficient quantum computing is expected to enable advancements that are impossible with classical computers. A group of scientists from Tokyo University of Science, Japan, RIKEN Centre for Emergent Matter Science, Japan, and the University of Technology, Sydney has collaborated to propose a novel two-dimensional design that can be constructed using existing integrated circuit technology. This design solves typical problems facing the current three-dimensional packaging for scaled-up quantum computers, bringing the future one step closer.

Quantum computing is increasingly becoming the focus of scientists in fields such as physics and chemistry, and industrialists in the pharmaceutical, airplane, and automobile industries. Globally, research labs at companies like Google and IBM are spending extensive resources on improving quantum computers, and with good reason. Quantum computers use the fundamentals of quantum mechanics to process significantly greater amounts of information much faster than classical computers. It is expected that when error-corrected and fault-tolerant quantum computation is achieved, scientific and technological advancement will occur at an unprecedented scale.

But, building quantum computers for large-scale computation is proving to be a challenge in terms of their architecture. The basic units of a quantum computer are the quantum bits or qubits. These are typically atoms, ions, photons, subatomic particles such as electrons, or even larger elements that simultaneously exist in multiple states, making it possible to obtain several potential outcomes rapidly for large volumes of data. The theoretical requirement for quantum computers is that these are arranged in two-dimensional (2D) arrays, where each qubit is both coupled with its nearest neighbor and connected to the necessary external control lines and devices. When the number of qubits in an array is increased, it becomes difficult to reach qubits in the interior of the array from the edge. The need to solve this problem has so far resulted in complex three-dimensional (3D) wiring systems across multiple planes in which many wires intersect, making their construction a significant engineering challenge.

Video: https://youtu.be/14a__swsYSU

The team of scientists led by Prof Jaw-Shen Tsai has proposed a unique solution to this qubit accessibility problem by modifying the architecture of the qubit array. "Here, we solve this problem and present a modified superconducting micro-architecture that does not require any 3D external line technology and reverts to a completely planar design," they say. This study has been published in the New Journal of Physics.

The scientists began with a qubit square lattice array and stretched out each column in the 2D plane. They then folded each successive column on top of each other, forming a dual one-dimensional array called a bi-linear array. This put all qubits on the edge and simplified the arrangement of the required wiring system. The system is also completely in 2D. In this new architecture, some of the inter-qubit wiring (each qubit is also connected to all adjacent qubits in the array) does overlap, but because these are the only overlaps in the wiring, simple local 3D systems such as airbridges at the point of overlap are enough and the system overall remains in 2D. As you can imagine, this simplifies its construction considerably.
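
To make the folding idea concrete, here is a purely illustrative index-mapping sketch in Python: it rearranges an N x N lattice into two rows (a "bi-linear" array) so every qubit ends up on an edge. The lattice size, function name, and the exact mapping are assumptions made for intuition only; the layout in the published circuit differs in detail.

```python
# Illustrative only: map an N x N square lattice onto a two-row "bi-linear"
# array, so that every qubit sits on an edge of the layout.
N = 4  # hypothetical lattice size

def bilinear_position(row, col, n=N):
    """Map lattice site (row, col) to (output_row, output_column)."""
    out_row = col % 2                  # alternate columns go to the two rows
    segment = col // 2                 # each pair of columns forms one segment
    # Reverse every other column so neighbours stay close after the fold.
    out_col = segment * n + (row if col % 2 == 0 else n - 1 - row)
    return out_row, out_col

for r in range(N):
    for c in range(N):
        print((r, c), "->", bilinear_position(r, c))
```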

The scientists evaluated the feasibility of this new arrangement through numerical and experimental evaluation in which they tested how much of a signal was retained before and after it passed through an airbridge. The results of both evaluations showed that it is possible to build and run this system using existing technology and without any 3D arrangement.

The scientists' experiments also showed that their architecture solves several problems that plague the 3D structures: they are difficult to construct, there is crosstalk or signal interference between waves transmitted across two wires, and the fragile quantum states of the qubits can degrade. The novel pseudo-2D design reduces the number of times wires cross each other, thereby reducing the crosstalk and consequently increasing the efficiency of the system.

At a time when large labs worldwide are attempting to find ways to build large-scale fault-tolerant quantum computers, the findings of this exciting new study indicate that such computers can be built using existing 2D integrated circuit technology. "The quantum computer is an information device expected to far exceed the capabilities of modern computers," Prof Tsai states. The research journey in this direction has only begun with this study, and Prof Tsai concludes by saying, "We are planning to construct a small-scale circuit to further examine and explore the possibility."

Continued here:
Wiring the Quantum Computer of the Future: Researchers from Japan and Australia propose a novel 2D design - QS WOW News

Enterprise Quantum Computing Market is Projected to Grow Massively in Near Future with Profiling Eminent Players- Intel Corporation, QRA Corp, D-Wave…

New Study Industrial Forecasts on Enterprise Quantum Computing Market 2020-2026: The Enterprise Quantum Computing Market report provides an in-depth review of the expansion drivers, potential challenges, distinctive trends, and opportunities for market participants, equipping readers to fully comprehend the landscape of the Enterprise Quantum Computing market. Major key manufacturers are covered in the report, alongside market share, stock determinations and figures, sales, capacity, production, price, cost, and revenue. The main objective of the Enterprise Quantum Computing industry report is to supply key insights on competition positioning, current trends, market potential, growth rates, and other relevant statistics.

The Major Players Covered in this Report: Intel Corporation, QRA Corp, D-Wave Systems, Cambridge Quantum Computing, QC Ware, QxBranch, Rigetti, IBM Corporation, Quantum Circuits, Google, Microsoft Corporation, Atos SE, Cisco Systems & More.

To get a holistic SAMPLE of the report, please click: https://www.reportsmonitor.com/request_sample/905067

The global Enterprise Quantum Computing market is illuminated in this report, which takes into account some of the most decisive and crucial aspects anticipated to influence growth in the near future. With important factors impacting market growth taken into consideration, the analysts authoring the report have painted a clear picture of how the demand for Enterprise Quantum Computing could increase during the course of the forecast period. Readers of the report are expected to receive useful guidelines on how to make their company's presence known in the market, thereby increasing its share in the coming years.

Regional Glimpses: The report sheds light on the manufacturing processes, cost structures, and guidelines and regulations. The regions targeted are Europe, United States, Central & South America, Southeast Asia, Japan, China, and India, with their export/import and supply and demand trends, along with cost, revenue, and gross margin. The Enterprise Quantum Computing Market is analyzed on the basis of the pricing of the products, the dynamics of demand and supply, total volume produced, and the revenue produced by the products. The manufacturing is studied with respect to various contributors such as manufacturing plant distribution, industry production, capacity, research, and development.

To get this report at a profitable rate @ https://www.reportsmonitor.com/check_discount/905067

Major points of the Global Enterprise Quantum Computing Market:

1. The market summary for the global Enterprise Quantum Computing market is provided in context of region, share and market size.
2. Innovative strategies used by key players in the market.
3. Other focus points in the Global Enterprise Quantum Computing Market report are upcoming opportunities, growth drivers, limiting factors, restrainers, challenges, technical advancements, flourishing segments and other major market trends.
4. The comprehensive study is carried out by driving market projections and forecasts for the important market segments and sub-segments throughout the forecast period 2020-2026.
5. The data has been categorized and summarized on the basis of regions, companies, types and applications of the product.
6. The report has studied developments such as expansions, agreements, latest product launches and mergers in this market.

Reasons to buy the report:

The report would help new entrants as well as established players in the Enterprise Quantum Computing market in the following ways:

1. This report segments the Enterprise Quantum Computing market holistically and provides the nearest approximation of the overall, as well as segment-based, market size across different industries, materials, media, and regions.
2. The report would support stakeholders in understanding the pulse of the market and present information on key drivers, constraints, challenges, and opportunities for the growth of the market.
3. This report would help stakeholders become fully aware of their competition and gain more insights to enhance their position in the business. The competitive landscape section includes the competitor ecosystem, along with product launches and developments; partnerships, agreements, and contracts; and acquisition strategies implemented by key players in the market.

View this report with a detailed description and TOC @ https://www.reportsmonitor.com/report/905067/Enterprise-Quantum-Computing-Market

If you have any special requirements for this report, please let us know and we can provide a custom report.

Contact Us
Jay Matthews
Direct: +1 513 549 5911 (U.S.), +44 203 318 2846 (U.K.)
Email: sales@reportsmonitor.com

View original post here:
Enterprise Quantum Computing Market is Projected to Grow Massively in Near Future with Profiling Eminent Players- Intel Corporation, QRA Corp, D-Wave...

Deltec Bank, Bahamas – Quantum Computing Will bring Efficiency and Effectiveness and Cost Saving in Banking Sector – marketscreener.com


As online banking grows, it is becoming a hot target for cybercriminals around the world as they become ever more adept at cracking bank security. Now, banks are looking into the technology behind quantum computing as a potential solution to this threat, as well as for its many other benefits. Currently, the technology is still in development but it is expected to take over from traditional computing in the next five to ten years.

What is quantum computing?

With quantum computing, the amount of processing power available is far larger than even the fastest silicon chips in existence today. Rather than using the traditional 1 and 0 method of binary computer processing, quantum computing uses qubits. Utilizing the theory of quantum superposition, these provide a way of processing 1s and 0s simultaneously, increasing the speed of the computer by several orders of magnitude.

For example, in October 2019, Google's 'Sycamore' quantum computer solved an equation in 200 seconds that would have taken a normal supercomputer 10,000 years to complete. This gives you an idea of the power that we are talking about.

So how does this help the banking sector?

1. Fraud Detection

Fraud is quickly becoming the biggest threat to online banking and data security. Customers need to feel confident that their money and their personal information are kept secure, and with data leaks happening more frequently, this problem must be addressed.

Quantum computing offers significant benefits in the fight against fraud, offering enough computing power to automatically and instantly detect patterns that are commonly associated with fraudulent activity. When you add AI and machine learning capabilities to the mix, we could potentially develop pre-warning systems that detect fraud before it even happens.

2. Quantum Cryptography

Cryptography is an area of science that has recently gained popularity. The technology has proven incredibly useful in helping to secure the blockchain networks.

Quantum cryptography takes this security to an entirely new level, particularly when applied to financial data. It provides the ability to store data in a theoretical state of constant flux, making it near impossible for hackers to read or steal.

However, it could also be used to easily crack existing cryptographic security methods. Currently, the strongest 2048-bit encryption would take a normal computer ages to break into, whereas a quantum computer could do it in a matter of seconds.

3. Distributed Keys

Distributed key generation (DKG) is already being used by many online platforms for increased protection against data interception. Now, quantum technology provides a new system known as Measurement-Device-Independent Quantum Key Distribution (MDI-QKD) which secures communications to a level that even quantum computers can't hack.

The technology is already being investigated by several financial institutions, notably major Dutch bank ABN-AMRO for their online and mobile banking applications.

4. Trading and Data

Artificial intelligence, machine learning, and big data are all new technologies that are currently being tested enthusiastically by banks. However, one of the biggest pain points with these technologies is the amount of processing power required.

According to Deltec Bank - "Quantum computing could quickly accelerate this research past the testing level and provide instant solutions to many problems currently facing the banking world. Time-consuming activities like mortgage and loan approvals would become instant and high-frequency trading could become automated and near error-proof."

Banks that are looking into quantum

Many major banks around the world are already investigating the potential benefits of quantum computing.

UK banking giant Barclays has worked in conjunction with IBM to develop a proof-of-concept that utilizes quantum computing to settle transactions. When applied to trading, the concept could successfully complete massive amounts of complex trades in seconds.

Major US bank JPMorgan has also expressed an interest in the technology for its security and data processing abilities. The bank has tasked its senior engineer with creating a 'quantum culture' in the business and meeting fortnightly with scientists to explore developments in the field.

Banco Bilbao Vizcaya Argentaria (BBVA) is working with the Spanish National Research Council (CSIC) to explore various applications of quantum computing. The team believes the technology could reduce risk and improve customer service.

Quantum computing, though still at an early stage, will have a significant impact on the banking sector in years to come.

Disclaimer: The author of this text, Robin Trehan, has an undergraduate degree in economics, a master's in international business and finance, and an MBA in electronic business. Trehan is Senior VP at Deltec International http://www.deltecbank.com. The views, thoughts, and opinions expressed in this text are solely the views of the author, and do not necessarily reflect the views of Deltec International Group, its subsidiaries and/or employees.

About Deltec Bank

Headquartered in The Bahamas, Deltec is an independent financial services group that delivers bespoke solutions to meet clients' unique needs. The Deltec group of companies includes Deltec Bank & Trust Limited, Deltec Fund Services Limited, Deltec Investment Advisers Limited, Deltec Securities Ltd., and Long Cay Captive Management.

Media Contact

Company Name: Deltec International Group

Contact Person: Media Manager

Email: rtrehan@deltecial.com

Phone: 242 302 4100

Country: Bahamas

Website: https://www.deltecbank.com/

Source: http://www.abnewswire.com


Original post:
Deltec Bank, Bahamas - Quantum Computing Will bring Efficiency and Effectiveness and Cost Saving in Banking Sector - marketscreener.com

Quantum Computing Market Segmentation, Application, Technology, Analysis Research Report and Forecast to 2026 – Cole of Duty

1QB Information Technologies

Global Quantum Computing Market Segmentation

This market was divided into types, applications and regions. The growth of each segment provides an accurate calculation and forecast of sales by type and application in terms of volume and value for the period between 2020 and 2026. This analysis can help you develop your business by targeting niche markets. Market share data are available at global and regional levels. The regions covered by the report are North America, Europe, the Asia-Pacific region, the Middle East and Africa, and Latin America. Research analysts understand the competitive forces and provide competitive analysis for each competitor separately.

To get Incredible Discounts on this Premium Report, Click Here @ https://www.verifiedmarketresearch.com/ask-for-discount/?rid=24845&utm_source=COD&utm_medium=002

Quantum Computing Market Region Coverage (Regional Production, Demand & Forecast by Countries etc.):

North America (U.S., Canada, Mexico)

Europe (Germany, U.K., France, Italy, Russia, Spain etc.)

Asia-Pacific (China, India, Japan, Southeast Asia etc.)

South America (Brazil, Argentina etc.)

Middle East & Africa (Saudi Arabia, South Africa etc.)

Some Notable Report Offerings:

-> We will give you an assessment of the extent to which the market acquires commercial characteristics, along with examples or instances of information that help your assessment.

-> We will also help you identify standard/customary terms and conditions, such as discounts, warranties, inspection, buyer financing, and acceptance, for the Quantum Computing industry.

-> We will further help you in finding any price ranges, pricing issues, and determination of price fluctuations of products in the Quantum Computing industry.

-> Furthermore, we will help you to identify any crucial trends to predict Quantum Computing market growth rate up to 2026.

-> Lastly, the analyzed report will predict the general tendency for supply and demand in the Quantum Computing market.

Have Any Query? Ask Our Expert @ https://www.verifiedmarketresearch.com/product/Quantum-Computing-Market/?utm_source=COD&utm_medium=002

Table of Contents:

Study Coverage: It includes study objectives, years considered for the research study, growth rate and Quantum Computing market size of type and application segments, key manufacturers covered, product scope, and highlights of segmental analysis.

Executive Summary: In this section, the report focuses on analysis of macroscopic indicators, market issues, drivers, and trends, competitive landscape, CAGR of the global Quantum Computing market, and global production. Under the global production chapter, the authors of the report have included market pricing and trends, global capacity, global production, and global revenue forecasts.

Quantum Computing Market Size by Manufacturer: Here, the report concentrates on revenue and production shares of manufacturers for all the years of the forecast period. It also focuses on price by manufacturer and expansion plans and mergers and acquisitions of companies.

Production by Region: It shows how the revenue and production in the global market are distributed among different regions. Each regional market is extensively studied here on the basis of import and export, key players, revenue, and production.

About us:

Verified Market Research partners with customers and offers insight into strategic and growth analyses, along with the data necessary to achieve corporate goals and objectives. Our core values are trust, integrity, and authenticity for our customers.

Analysts with a high level of expertise in data collection and governance use industrial techniques to collect and analyze data in all phases. Our analysts are trained to combine modern data collection techniques, superior research methodology, expertise and years of collective experience to produce informative and accurate research reports.

Contact us:

Mr. Edwyne Fernandes
Call: +1 (650) 781 4080
Email: [emailprotected]

Tags: Quantum Computing Market Size, Quantum Computing Market Trends, Quantum Computing Market Growth, Quantum Computing Market Forecast, Quantum Computing Market Analysis

Go here to see the original:
Quantum Computing Market Segmentation, Application, Technology, Analysis Research Report and Forecast to 2026 - Cole of Duty

New way of developing topological superconductivity discovered – Chemie.de

Hybrid material nanowires with pencil-like cross section (A) at low temperatures and finite magnetic field display zero-energy peaks (B) consistent with topological superconductivity as verified by numerical simulations (C).

A pencil-shaped semiconductor, measuring only a few hundred nanometers in diameter, is what researchers from the Center for Quantum Devices, Niels Bohr Institute, at the University of Copenhagen, in collaboration with Microsoft Quantum researchers, have used to uncover a new route to topological superconductivity and Majorana zero modes in a study recently published in Science.

The new route that the researchers discovered uses the phase winding around the circumference of a cylindrical superconductor surrounding a semiconductor, an approach they call "a conceptual breakthrough".

"The result may provide a useful route toward the use of Majorana zero modes as a basis of protected qubits for quantum information. We do not know if these wires themselves will be useful, or if just the ideas will be useful," says Charles Marcus, Villum Kann Rasmussen Professor at the Niels Bohr Institute and Scientific Director of Microsoft Quantum Lab in Copenhagen.

"What we have found appears to be a much easier way of creating Majorana zero modes, where you can switch them on and off, and that can make a huge difference"; says postdoctoral research fellow, Saulius Vaitieknas, who was the lead experimentalist on the study.

The new research merges two already known ideas used in the world of quantum mechanics: Vortex-based topological superconductors and the one-dimensional topological superconductivity in nanowires.

"The significance of this result is that it unifies different approaches to understanding and creating topological superconductivity and Majorana zero modes", says professor Karsten Flensberg, Director of the Center for Quantum Devices.

Looking back in time, the findings can be described as an extension of a 50-year-old piece of physics known as the Little-Parks effect. In the Little-Parks effect, a superconductor in the shape of a cylindrical shell adjusts to an external magnetic field threading the cylinder by jumping to a "vortex state" where the quantum wavefunction around the cylinder carries a twist of its phase.
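
For readers who want the standard relation behind that phase twist, the textbook fluxoid quantization condition (a general result of superconductivity, not taken from the new Science paper itself) can be written as:

```latex
% The phase of the superconducting order parameter must wind by an integer
% multiple of 2*pi around the cylinder; the winding number n settles to the
% integer nearest the applied flux in units of the flux quantum.
\oint \nabla\varphi \cdot d\boldsymbol{\ell} = 2\pi n,
\qquad n \approx \frac{\Phi}{\Phi_0},
\qquad \Phi_0 = \frac{h}{2e}
```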

Charles M. Marcus, Saulius Vaitieknas, and Karsten Flensberg from the Niels Bohr Institute at the Microsoft Quantum Lab in Copenhagen.

What was needed was a special type of material that combined semiconductor nanowires and superconducting aluminum. Those materials were developed in the Center for Quantum Devices over the past few years. The particular wires for this study were special in having the superconducting shell fully surround the semiconductor. These were grown by Professor Peter Krogstrup, also at the Center for Quantum Devices and Scientific Director of the Microsoft Quantum Materials Lab in Lyngby.

The research is the result of the same basic scientific wondering that through history has led to many great discoveries.

"Our motivation to look at this in the first place was that it seemed interesting and we didn't know what would happen", says Charles Marcus about the experimental discovery, which was confirmed theoretically in the same publication. Nonetheless, the idea may indicate a path forward for quantum computing.


Originally posted here:
New way of developing topological superconductivity discovered - Chemie.de

Machine Learning as a Service Market Overview, Top Companies, Region, Application and Global Forecast by 2026 – Latest Herald

Xeround

Global Machine Learning as a Service Market Segmentation

This market was divided into types, applications and regions. The growth of each segment provides an accurate calculation and forecast of sales by type and application in terms of volume and value for the period between 2020 and 2026. This analysis can help you develop your business by targeting niche markets. Market share data are available at global and regional levels. The regions covered by the report are North America, Europe, the Asia-Pacific region, the Middle East and Africa, and Latin America. Research analysts understand the competitive forces and provide competitive analysis for each competitor separately.

To get Incredible Discounts on this Premium Report, Click Here @ https://www.marketresearchintellect.com/ask-for-discount/?rid=195381&utm_source=LHN&utm_medium=888

Machine Learning as a Service Market Region Coverage (Regional Production, Demand & Forecast by Countries etc.):

North America (U.S., Canada, Mexico)

Europe (Germany, U.K., France, Italy, Russia, Spain etc.)

Asia-Pacific (China, India, Japan, Southeast Asia etc.)

South America (Brazil, Argentina etc.)

Middle East & Africa (Saudi Arabia, South Africa etc.)

Some Notable Report Offerings:

-> We will give you an assessment of the extent to which the market acquires commercial characteristics, along with examples or instances of information that help your assessment.

-> We will also help you identify standard/customary terms and conditions, such as discounts, warranties, inspection, buyer financing, and acceptance, for the Machine Learning as a Service industry.

-> We will further help you in finding any price ranges, pricing issues, and determination of price fluctuations of products in the Machine Learning as a Service industry.

-> Furthermore, we will help you to identify any crucial trends to predict Machine Learning as a Service market growth rate up to 2026.

-> Lastly, the analyzed report will predict the general tendency for supply and demand in the Machine Learning as a Service market.

Have Any Query? Ask Our Expert@ https://www.marketresearchintellect.com/need-customization/?rid=195381&utm_source=LHN&utm_medium=888

Table of Contents:

Study Coverage: It includes study objectives, years considered for the research study, growth rate and Machine Learning as a Service market size of type and application segments, key manufacturers covered, product scope, and highlights of segmental analysis.

Executive Summary: In this section, the report focuses on analysis of macroscopic indicators, market issues, drivers, and trends, competitive landscape, CAGR of the global Machine Learning as a Service market, and global production. Under the global production chapter, the authors of the report have included market pricing and trends, global capacity, global production, and global revenue forecasts.

Machine Learning as a Service Market Size by Manufacturer: Here, the report concentrates on revenue and production shares of manufacturers for all the years of the forecast period. It also focuses on price by manufacturer and expansion plans and mergers and acquisitions of companies.

Production by Region: It shows how the revenue and production in the global market are distributed among different regions. Each regional market is extensively studied here on the basis of import and export, key players, revenue, and production.

About Us:

Market Research Intellect provides syndicated and customized research reports to clients from various industries and organizations with the aim of delivering functional expertise. We provide reports for all industries including Energy, Technology, Manufacturing and Construction, Chemicals and Materials, Food and Beverage and more. These reports deliver an in-depth study of the market with industry analysis, market value for regions and countries and trends that are pertinent to the industry.

Contact Us:

Mr. Steven Fernandes

Market Research Intellect

New Jersey ( USA )

Tel: +1-650-781-4080

Tags: Machine Learning as a Service Market Size, Machine Learning as a Service Market Growth, Machine Learning as a Service Market Forecast, Machine Learning as a Service Market Analysis

Our Trending Reports

Aerospace and Defense Telemetry Market Size, Growth Analysis, Opportunities, Business Outlook and Forecast to 2026

Aerospace Coatings Market Size, Growth Analysis, Opportunities, Business Outlook and Forecast to 2026

Read the rest here:
Machine Learning as a Service Market Overview, Top Companies, Region, Application and Global Forecast by 2026 - Latest Herald

Facebook, AWS team up to produce open-source PyTorch AI libraries, grad student says he successfully used GPT-2 to write his homework…. – The…

Roundup Hello El Reg readers. If you're stuck inside, and need some AI news to soothe your soul, here's our weekly machine-learning roundup.

Nvidia GTC virtual keynote coming to YouTube: Nvidia cancelled its annual GPU Technology Conference in Silicon Valley in March over the ongoing coronavirus pandemic. The keynote speech was promised to be screened virtually, and then that got canned, too. Now, it's back.

CEO Jensen Huang will present his talk on May 14 on YouTube at 0600 PT (1300 UTC). Yes, that's early for people on the US West Coast. And no, Jensen isn't doing it live at that hour: the video is prerecorded.

Still, graphics hardware and AI fans will probably want to keep an eye on the presentation. Huang is expected to unveil specs for a new GPU architecture, reportedly named the A100, said to be more powerful than its Tesla V100 chips. You'll be able to watch the keynote when it comes out on Nvidia's YouTube channel, here.

Also, Nvidia has partnered up with academics at King's College London to release MONAI, an open-source AI framework for medical imaging.

The framework packages together tools to help researchers and medical practitioners process image data for computer vision models built with PyTorch. These include things like segmenting features in 3D scans or classifying objects in 2D.

"Researchers need a flexible, powerful and composable framework that allows them to do innovative medical AI research, while providing the robustness, testing and documentation necessary for safe hospital deployment," said Jorge Cardoso, chief technology officer of the London Medical Imaging & AI Centre for Value-based Healthcare. "Such a tool was missing prior to Project MONAI."

You can play with MONAI on GitHub here, or read about it more here.

New PyTorch libraries for ML production: Speaking of PyTorch, Facebook and AWS have collaborated to release a couple of open-source goodies for deploying machine-learning models.

There are now two new libraries: TorchServe and TorchElastic. TorchServe provides tools to manage and perform inference with PyTorch models. It can be used in any cloud service, and you can find the instructions on how to install and use it here.
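
As a rough sketch of that workflow (the model, file names, and CLI flags below follow the documented TorchServe workflow at the time of writing and are our assumptions, not code from the announcement), you would typically export a model and then hand it to the TorchServe tooling:

```python
import torch
import torchvision

# Export a trained model to TorchScript so it can be packaged for TorchServe.
model = torchvision.models.resnet18(pretrained=True).eval()
example = torch.rand(1, 3, 224, 224)            # dummy input for tracing
traced = torch.jit.trace(model, example)
traced.save("resnet18.pt")

# Packaging and serving are done with the TorchServe CLI tools; these flags
# reflect the documented workflow and may change between releases:
#   torch-model-archiver --model-name resnet18 --version 1.0 \
#       --serialized-file resnet18.pt --handler image_classifier \
#       --export-path model_store
#   torchserve --start --model-store model_store --models resnet18=resnet18.mar
```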

TorchElastic allows users to train large models over a cluster of compute nodes with Kubernetes. The distributed training means that even if some servers go down for maintenance or random network issues, the service isn't completely interrupted. It can be used on any cloud provider that supports Kubernetes. You can read how to use the library here.

"These libraries enable the community to efficiently productionize AI models at scale and push the state of the art on model exploration as model architectures continue to increase in size and complexity," Facebook said this week.

MIT stops working with blacklisted AI company: MIT has discontinued its five-year research collaboration with iFlyTek, a Chinese AI company the US government flagged as being involved in the ongoing persecution of Uyghur Muslims in China.

Academics at the American university made the decision to cut ties with the controversial startup in February. iFlyTek is among 27 other names that are on the US Bureau of Industry and Security's Entity List, which forbids American organizations from doing business with them without Uncle Sam's permission. Breaking the rules will result in sanctions.

"We take very seriously concerns about national security and economic security threats from China and other countries, and human rights issues," Maria Zuber, vice president of research at MIT, said, Wired first reported.

MIT entered a five-year deal with iFlyTek in 2018 to collaborate on AI research focused on human-computer interaction, speech recognition, and computer vision.

The relationship soured when it was revealed iFlyTek was helping the Chinese government build a mass automated voice recognition and monitoring system, according to the non-profit Human Rights Watch. That technology was sold to police bureaus in the provinces of Xinjiang and Anhui, where the majority of the Uyghur population in China resides.

OpenAI's GPT-2 writes university papers: A cheeky master's degree student admitted this week to using OpenAI's giant language model GPT-2 to help write his essays.

The graduate student, named only as Tiago, was interviewed by Futurism. We're told that although he passed his assignments using the machine-learning software, he said the achievement was down to failings within the business school rather than to the prowess of state-of-the-art AI technology.

In other words, his science homework wasn't too rigorously marked in this particular unnamed school, allowing him to successfully pass off machine-generated write-ups of varying quality as his own work. And GPT-2's output does vary in quality, depending on how you use it.

"You couldn't write an essay on science that could be anywhere near convincing using the methods that I used," he said. "Many of the courses that I take in business school wouldn't make it possible as well.

"However, some particular courses are less information-dense, and so if you can manage to write a few pages with some kind of structure and some kind of argument, you can get through. Its not that great of an achievement, I would say, for GPT-2.

Thanks to the Talk to Transformer tool, anyone can use GPT-2 on a web browser. Tiago would feed opening sentences to the model, and copy and paste the machine-generated responses to put in his essay.

GPT-2 is pretty convincing at first: it has a good grasp of grammar, and there is some level of coherency in its opening paragraphs when responding to a statement or question. Its output quality begins to fall apart, becoming incoherent or absurd, as it rambles in subsequent paragraphs. It also doesn't care about facts, which is why it won't be good as a collaborator for subjects such as history and science.
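
For anyone curious what the same workflow looks like without the web tool, here is a minimal sketch using the Hugging Face transformers library; the prompt and sampling settings are arbitrary examples of our own, not Tiago's actual setup:

```python
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Feed an opening sentence and let the model continue it, as described above.
prompt = "The main challenge facing modern business strategy is"
input_ids = tokenizer.encode(prompt, return_tensors="pt")

output = model.generate(
    input_ids,
    max_length=150,   # keep continuations short; coherence degrades with length
    do_sample=True,
    top_k=50,
    top_p=0.95,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```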


Go here to see the original:
Facebook, AWS team up to produce open-source PyTorch AI libraries, grad student says he successfully used GPT-2 to write his homework.... - The...

How Coronavirus Pandemic Will Impact Machine Learning as a Service Market 2020- Global Leading Players, Industry Updates, Future Growth, Business…

The global Machine Learning as a Service market reached ~US$ xx Mn in 2019 and is anticipated to grow at a CAGR of xx% over the forecast period 2019-2029. In this Machine Learning as a Service market study, the following years are considered to predict the market footprint:

The business intelligence study of the Machine Learning as a Service market covers the estimated size of the market both in terms of value (Mn/Bn USD) and volume (x units). In a bid to recognize the growth prospects in the Machine Learning as a Service market, the market study has been geographically fragmented into important regions that are progressing faster than the overall market. Each segment of the Machine Learning as a Service market has been individually analyzed on the basis of pricing, distribution, and demand prospects for the global region.

Request Sample Report @https://www.mrrse.com/sample/9077?source=atm

The report covers the competition landscape, which includes a competition matrix, market share analysis of major players in the global machine learning as a service market based on their 2016 revenues, and profiles of major players. The competition matrix benchmarks leading players on the basis of their capabilities and potential to grow. Factors including market position, offerings and R&D focus are attributed to a company's capabilities. Factors including top-line growth, market share, segment growth, infrastructure facilities and future outlook are attributed to a company's potential to grow. This section also identifies and includes various recent developments carried out by the leading players.

Company profiling includes company overview, major business strategies adopted, SWOT analysis and market revenues for the years 2014 to 2016. The key players profiled in the global machine learning as a service market include IBM Corporation, Google Inc., Amazon Web Services, Microsoft Corporation, BigML Inc., FICO, Yottamine Analytics, Ersatz Labs Inc, Predictron Labs Ltd and H2O.ai. Other players include ForecastThis Inc., Hewlett Packard Enterprise, Datoin, Fuzzy.ai, and Sift Science Inc. among others.

The global machine learning as a service market is segmented as below:

By Deployment Type

By End-use Application

By Geography

Each market player encompassed in the Machine Learning as a Service market study is assessed according to its market share, production footprint, current launches, agreements, ongoing R&D projects, and business tactics. In addition, the Machine Learning as a Service market study presents a strengths, weaknesses, opportunities and threats (SWOT) analysis.

COVID-19 Impact on Machine Learning as a Service Market

In response to the novel COVID-19 pandemic, the impact of the pandemic on the global Machine Learning as a Service market is included in the present report. The influence of the novel coronavirus pandemic on the growth of the Machine Learning as a Service market is analyzed and depicted in the report.

Request For Discount On This Report @ https://www.mrrse.com/checkdiscount/9077?source=atm

What insights readers can gather from the Machine Learning as a Service market report?

The Machine Learning as a Service market report answers the following queries:

Buy This Report @ https://www.mrrse.com/checkout/9077?source=atm

Why Choose Machine Learning as a Service Market Report?

Read this article:
How Coronavirus Pandemic Will Impact Machine Learning as a Service Market 2020- Global Leading Players, Industry Updates, Future Growth, Business...

Announcing availability of Inf1 instances in Amazon SageMaker for high performance and cost-effective machine learning inference – idk.dev

Amazon SageMaker is a fully managed service that provides every developer and data scientist with the ability to build, train, and deploy machine learning (ML) models quickly. Tens of thousands of customers, including Intuit, Voodoo, ADP, Cerner, Dow Jones, and Thomson Reuters, use Amazon SageMaker to remove the heavy lifting from each step of the ML process.

When it comes to deploying ML models for real-time prediction, Amazon SageMaker provides you with a large selection of AWS instance types, from small CPU instances to multi-GPU instances. This lets you find the right cost/performance ratio for your prediction infrastructure. Today we announce the availability of Inf1 instances in Amazon SageMaker to deliver high performance, low latency, and cost-effective inference.

The Amazon EC2 Inf1 instances were launched at AWS re:Invent 2019. Inf1 instances are powered by AWS Inferentia, a custom chip built from the ground up by AWS to accelerate machine learning inference workloads. When compared to G4 instances, Inf1 instances offer up to three times the inferencing throughput and up to 45% lower cost per inference.

Inf1 instances are available in multiple sizes, with 1, 4, or 16 AWS Inferentia chips. An AWS Inferentia chip contains four NeuronCores. Each implements a high-performance systolic array matrix multiply engine, which massively speeds up typical deep learning operations such as convolution and transformers. NeuronCores are also equipped with a large on-chip cache, which helps cut down on external memory accesses and saves I/O time in the process.

When several AWS Inferentia chips are available on an Inf1 instance, you can partition a model across them and store it entirely in cache memory. Alternatively, to serve multi-model predictions from a single Inf1 instance, you can partition the NeuronCores of an AWS Inferentia chip across several models.

To run machine learning models on Inf1 instances, you need to compile models to a hardware-optimized representation using the AWS Neuron SDK. Since the launch of Inf1 instances, AWS has released five versions of the AWS Neuron SDK that focused on performance improvements and new features, with plans to add more on a regular cadence. For example, image classification (ResNet-50) performance has improved by more than 2X, from 1100 to 2300 images/sec on a single AWS Inferentia chip. This performance improvement translates to 45% lower cost per inference as compared to G4 instances. Support for object detection models starting with Single Shot Detection (SSD) was also added, with Mask R-CNN coming soon.

Now let us show you how you can easily compile, load and run models on ml.Inf1 instances in Amazon SageMaker.

Compiling and deploying models for Inf1 instances in Amazon SageMaker is straightforward thanks to Amazon SageMaker Neo. The AWS Neuron SDK is integrated with Amazon SageMaker Neo to run your model optimally on Inf1 instances in Amazon SageMaker. You only need to complete the following steps:

In the following example use case, you train a simple TensorFlow image classifier on the MNIST dataset, like in this sample notebook on GitHub. The training code would look something like the following:
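
A sketch of what that training step might look like with the SageMaker Python SDK (v1-style parameter names; the entry-point script, instance type, and S3 location are placeholders rather than the exact notebook code):

```python
import sagemaker
from sagemaker.tensorflow import TensorFlow

# Works inside a SageMaker notebook; elsewhere, pass an explicit IAM role ARN.
role = sagemaker.get_execution_role()

# Script-mode TensorFlow estimator; 'mnist.py' is a hypothetical training script.
estimator = TensorFlow(
    entry_point="mnist.py",
    role=role,
    framework_version="1.15.2",
    py_version="py3",
    script_mode=True,
    train_instance_count=1,
    train_instance_type="ml.m5.xlarge",
)

training_data_uri = "s3://my-bucket/mnist/train"   # hypothetical S3 location
estimator.fit(training_data_uri)
```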

To compile the model for an Inf1 instance, you make a single API call and select ml_inf1 as the deployment target. See the following code:
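
A sketch of that compilation call (argument names follow the SageMaker Neo compile_model API; the output path and input shape are assumptions for an MNIST-style model):

```python
# Compile the trained model for the Inf1 (AWS Inferentia) target family.
output_path = "s3://my-bucket/neo-compiled"    # hypothetical S3 location

optimized_estimator = estimator.compile_model(
    target_instance_family="ml_inf1",          # the Inf1 deployment target
    input_shape={"data": [1, 784]},            # assumed flattened-MNIST input
    output_path=output_path,
    framework="tensorflow",
    framework_version="1.15.2",
)
```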

Once the machine learning model has been compiled, you deploy the model on an Inf1 instance in Amazon SageMaker using the optimized estimator from Amazon SageMaker Neo. Under the hood, when creating the inference endpoint, Amazon SageMaker automatically selects a container with the Neo Deep Learning Runtime, a lightweight runtime that will load and invoke the optimized model for inference.
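
Deployment and invocation then reduce to a couple of calls; the instance size and the test payload below are illustrative only:

```python
import numpy as np

# Create a real-time endpoint backed by an Inf1 instance.
predictor = optimized_estimator.deploy(
    initial_instance_count=1,
    instance_type="ml.inf1.xlarge",
)

# Placeholder payload shaped to match the compiled input; use real images in practice.
test_images = np.zeros((1, 784), dtype="float32")
predictions = predictor.predict(test_images)
```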

That's it! After you deploy the model, you can invoke the endpoint and receive predictions in real time with low latency. You can find a full example on GitHub.

Inf1 instances in Amazon SageMaker are available in four sizes: ml.inf1.xlarge, ml.inf1.2xlarge, ml.inf1.6xlarge, and ml.inf1.24xlarge. Machine learning models developed using TensorFlow and MxNet frameworks can be compiled with Amazon SageMaker Neo to run optimally on Inf1 instances and deployed on Inf1 instances in Amazon SageMaker for real-time inference. You can start using Inf1 instances in Amazon SageMaker today in the US East (N. Virginia) and US West (Oregon) Regions.

Julien Simon is an Artificial Intelligence & Machine Learning Evangelist for EMEA. Julien focuses on helping developers and enterprises bring their ideas to life.

Read the original:
Announcing availability of Inf1 instances in Amazon SageMaker for high performance and cost-effective machine learning inference - idk.dev

Will Quantum Computing Really Change The World? Facts And Myths – Analytics India Magazine

In recent years, some big tech companies like IBM, Microsoft, Intel, or Google have been working in relative silence on something that sounds great: quantum computing. The main problem with this is that it is difficult to know what exactly it is and what it can be useful for.

There are some questions that can be answered easily. For example, quantum computing is not going to help you get more FPS on your graphics card at the moment. Nor will it be as easy as swapping the CPU of your computer for a quantum one to make it hyperfast. Quantum computing is fundamentally different from the computing we are used to, but how?

At the beginning of the 20th century, Planck and Einstein proposed that light is not a continuous wave (like the waves in a pond) but that it is divided into small packets, or quanta. This apparently simple idea served to solve a problem called the ultraviolet catastrophe. But over the years other physicists developed it and came to surprising conclusions about matter, of which two will interest us here: the superposition of states and entanglement.

To understand why we are interested, let's take a short break and think about how a classical computer works. The basic unit of information is the bit, which can have two possible states (1 or 0) and with which we can perform various logical operations (AND, NOT, OR). Putting together n bits we can represent numbers and operate on those numbers, but with limitations: we can only represent up to 2^n different states, and if we want to change x bits we have to perform at least x operations on them: there is no way to magically change them without touching them.

Well, superposition and entanglement allow us to reduce these limitations: with superposition, we can store many more than just 2^n states with n quantum bits (qubits), and entanglement maintains certain relations between qubits in such a way that operations on one qubit necessarily affect the rest.

Superposition, while looking like a blessing at first glance, is also a problem. As Alexander Holevo showed in 1973, even though n qubits can hold far more information than n bits, in practice we can only read out 2^n different values. As we saw in an article in Genbeta about the foundations of quantum computing: a qubit is not only worth 1 or 0 like a normal bit, but can be, say, 1 with 80% probability and 0 with 20%. The problem is that when we read it we can only obtain either 1 or 0, and the probabilities that each value had are lost, because measuring the qubit modifies it.
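
A tiny numerical sketch (NumPy, with arbitrary amplitudes) makes the 80/20 example and the loss of information on measurement concrete:

```python
import numpy as np

# A single qubit as a two-component state vector: amplitudes for |0> and |1>.
# The squared magnitudes give the probabilities of reading 0 or 1.
alpha, beta = np.sqrt(0.2), np.sqrt(0.8)   # 0 with prob 0.2, 1 with prob 0.8
state = np.array([alpha, beta])
probabilities = np.abs(state) ** 2

# Simulate repeated measurements: each one yields only a 0 or a 1.
rng = np.random.default_rng(0)
samples = rng.choice([0, 1], size=10_000, p=probabilities)
print("measured 0:", np.mean(samples == 0), "measured 1:", np.mean(samples == 1))

# After a real measurement the qubit collapses: only the observed value remains,
# and the original amplitudes are no longer recoverable from that single readout.
```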

This discrepancy between the information kept by the qubits and what we can read led Benioff and Feynman to demonstrate that a classical computer would not be able to simulate a quantum system without a disproportionate amount of resources, and to propose models for a quantum computer that was able to do that simulation.

Those quantum computers would probably be nothing more than a scientific curiosity without the second concept, entanglement, which allows two quite relevant algorithms to be developed: quantum annealing in 1989 and Shor's algorithm in 1994. The first allows finding minimum values of functions, which, stated like that, does not sound very interesting, but it has applications in artificial intelligence and machine learning, as we discussed in another article. For example, if we manage to encode the error rate of a neural network as a function to which we can apply quantum annealing, that minimum value will tell us how to configure the neural network to be as efficient as possible.
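
To picture the kind of problem quantum annealing targets, here is a small classical brute-force sketch: it minimizes an arbitrary QUBO (quadratic unconstrained binary optimization) cost over all bit assignments, which is exactly the search an annealer is built to shortcut. The matrix values are made up for illustration.

```python
import itertools
import numpy as np

# A tiny QUBO instance: minimize x^T Q x over binary vectors x.
# A quantum annealer searches for this minimum physically; here we simply
# enumerate all 2^n assignments, which is only feasible for very small n.
Q = np.array([
    [-1.0,  2.0,  0.0],
    [ 0.0, -1.0,  2.0],
    [ 0.0,  0.0, -1.0],
])

best_x, best_energy = None, float("inf")
for bits in itertools.product([0, 1], repeat=Q.shape[0]):
    x = np.array(bits)
    energy = x @ Q @ x
    if energy < best_energy:
        best_x, best_energy = x, energy

print(f"minimum-energy assignment: {best_x}, energy: {best_energy}")
```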

The second, Shor's algorithm, helps us decompose a number into its prime factors much more efficiently than we can on a normal computer. Stated like that, again, it doesn't sound at all interesting. But if I tell you that RSA, one of the most widely used algorithms to protect and encrypt data on the Internet, relies on the fact that factoring numbers is exponentially slow (adding a bit to the key roughly doubles the time a brute-force attack takes), then things change. A quantum computer with enough qubits would render many encryption systems completely obsolete.
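
A toy Python illustration of the asymmetry RSA relies on (the primes below are tiny and chosen arbitrarily; real RSA keys use primes hundreds of digits long, far beyond naive factoring):

```python
import time

# Multiplying two primes is instant; recovering them from the product by naive
# trial division gets rapidly slower as the primes grow. Shor's algorithm would
# remove this asymmetry on a large enough quantum computer.
p, q = 10_007, 10_009          # two small primes, for illustration only
n = p * q                      # fast: a single multiplication

def smallest_factor(n):
    """Return the smallest prime factor of n by trial division."""
    if n % 2 == 0:
        return 2
    f = 3
    while f * f <= n:
        if n % f == 0:
            return f
        f += 2
    return n

start = time.time()
factor = smallest_factor(n)
print(f"n = {n}, recovered factor {factor} in {time.time() - start:.4f} s")
```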

So far, quantum computing is a field that hasn't been applied much in the real world. To give us an idea, with the twenty qubits of the commercial quantum computer announced by IBM, we could apply Shor's factorization algorithm only to numbers less than 1,048,576, which, as you can imagine, is not very impressive.

Still, the field is evolving promisingly. In 1998 the first quantum computer was built: it had only two qubits and needed a nuclear magnetic resonance machine to solve a toy problem (the so-called Deutsch-Jozsa problem). In 2001 Shor's algorithm was run for the first time. Only six years later, in 2007, D-Wave presented its first computer capable of executing quantum annealing with 16 qubits. This year, the same company announced a 2000-qubit quantum annealing computer. The new IBM computers, on the other hand, although they have fewer qubits, are able to implement generic algorithms and not only quantum annealing. In short, the push is strong, and quantum computing looks set to become increasingly applicable to real problems.

What might those applications be? As mentioned before, the quantum annealing algorithm is very well suited to machine learning problems, which makes the computers that implement it extremely useful, even though that single algorithm is the only thing they can run. If systems capable of, for example, transcribing conversations or identifying objects in images can have their training translated to quantum computers, the results could be orders of magnitude better than those that already exist. The same algorithm could also be used to find solutions to problems in medicine or chemistry, such as finding optimal treatment methods for a patient or studying the possible structures of complex molecules.

Generic quantum computers, which have fewer qubits right now, could run more algorithms. For example, they could be used to break much of the cryptography in use today, as we discussed earlier (which explains why the NSA wanted to have a quantum computer). They would also serve as super-fast search engines if Grover's search algorithm can be implemented, and for physics and chemistry they can be very useful as efficient simulators of quantum systems.

Unfortunately, algorithms and code written for classical computers cannot simply be run on quantum computers to magically get an improvement in speed: you need to develop a quantum algorithm (not a trivial thing) and implement it in order to get that improvement. That, for now, greatly restricts the applications of quantum computers and will be a problem to overcome as those systems become more developed.

However, the main problem facing quantum computing is building the computers themselves. Compared to a normal computer, a quantum computer is an extremely complex machine: it operates at a temperature close to absolute zero (-273 C), the hardware supporting the qubits is superconducting, and the components needed to read and manipulate the qubits are not simple either.

What can a non-quantum quantum computer be like? As we have explained before, the two relevant concepts of a quantum computer are superposition and entanglement, and without them there cannot be the speed improvements that quantum algorithms promise. If disturbances in the computer quickly drive superposed qubits to classical states, or if they break the entanglement between several qubits, what we have is not a quantum computer but only an extremely expensive machine that runs a handful of algorithms no better than a normal computer (and will probably give erroneous results).

Of the two properties, entanglement is the harder to maintain and to prove. The more qubits there are, the easier it is for one of them to become disentangled (which explains why increasing the number of qubits is not a trivial task). And it is not enough to build the computer and see that correct results come out to claim that the qubits are entangled: looking for evidence of entanglement is a task in itself, and in fact the lack of such evidence was one of the main criticisms of D-Wave's systems in their early days.

A priori, with the materials that quantum computers are currently built from, miniaturization does not seem very feasible. But there is already research on new materials that could be used to create more accessible quantum computers. Who knows if fifty years from now we will be able to buy quantum CPUs to improve the speed of our computers.


View post:
Will Quantum Computing Really Change The World? Facts And Myths - Analytics India Magazine

Wiring the Quantum Computer of the Future: aNovel Simple Build with Existing Technology – Analytics Insight

Wiring the Quantum Computer of the Future: a Novel Simple Build with Existing Technology

The basic units of a quantum computer can be rearranged in 2D to solve typical design and operation challenges

Efficient quantum computing is expected to enable advancements that are impossible with classical computers. Scientists from Japan and Sydney have collaborated and proposed a novel two-dimensional design that can be constructed using existing integrated circuit technology. This design solves typical problems facing the current three-dimensional packaging for scaled-up quantum computers, bringing the future one step closer.

Quantum computing is increasingly becoming the focus of scientists in fields such as physics and chemistry, and industrialists in the pharmaceutical, airplane, and automobile industries. Globally, research labs at companies like Google and IBM are spending extensive resources on improving quantum computers, and with good reason. Quantum computers use the fundamentals of quantum mechanics to process significantly greater amounts of information much faster than classical computers. It is expected that when error-corrected and fault-tolerant quantum computation is achieved, scientific and technological advancement will occur at an unprecedented scale.

But, building quantum computers for large-scale computation is proving to be a challenge in terms of their architecture. The basic units of a quantum computer are the quantum bits or qubits. These are typically atoms, ions, photons, subatomic particles such as electrons, or even larger elements that simultaneously exist in multiple states, making it possible to obtain several potential outcomes rapidly for large volumes of data. The theoretical requirement for quantum computers is that these are arranged in two-dimensional (2D) arrays, where each qubit is both coupled with its nearest neighbor and connected to the necessary external control lines and devices. When the number of qubits in an array is increased, it becomes difficult to reach qubits in the interior of the array from the edge. The need to solve this problem has so far resulted in complex three-dimensional (3D) wiring systems across multiple planes in which many wires intersect, making their construction a significant engineering challenge.

A group of scientists from Tokyo University of Science, Japan, RIKEN Centre for Emergent Matter Science, Japan, and University of Technology, Sydney, led by Prof Jaw-Shen Tsai, proposes a unique solution to this qubit accessibility problem by modifying the architecture of the qubit array. "Here, we solve this problem and present a modified superconducting micro-architecture that does not require any 3D external line technology and reverts to a completely planar design," they say. This study has been published in the New Journal of Physics.

The scientists began with a qubit square lattice array and stretched out each column in the 2D plane. They then folded each successive column on top of each other, forming a dual one-dimensional array called a bi-linear array. This put all qubits on the edge and simplified the arrangement of the required wiring system. The system is also completely in 2D. In this new architecture, some of the inter-qubit wiring (each qubit is also connected to all adjacent qubits in the array) does overlap, but because these are the only overlaps in the wiring, simple local 3D systems such as airbridges at the point of overlap are enough and the system overall remains in 2D. As you can imagine, this simplifies its construction considerably.

The scientists evaluated the feasibility of this new arrangement through numerical and experimental evaluation in which they tested how much of a signal was retained before and after it passed through an airbridge. Results of both evaluations showed that it is possible to build and run this system using existing technology and without any 3D arrangement.

The scientists' experiments also showed them that their architecture solves several problems that plague the 3D structures: they are difficult to construct, there is crosstalk or signal interference between waves transmitted across two wires, and the fragile quantum states of the qubits can degrade. The novel pseudo-2D design reduces the number of times wires cross each other, thereby reducing the crosstalk and consequently increasing the efficiency of the system.

At a time when large labs worldwide are attempting to find ways to build large-scale fault-tolerant quantum computers, the findings of this exciting new study indicate that such computers can be built using existing 2D integrated circuit technology. "The quantum computer is an information device expected to far exceed the capabilities of modern computers," Prof Tsai states. The research journey in this direction has only begun with this study, and Prof Tsai concludes by saying, "We are planning to construct a small-scale circuit to further examine and explore the possibility."

###

Reference

Title of original paper: Pseudo-2D superconducting quantum computing circuit for the surface code: the proposal and preliminary tests

Journal: New Journal of Physics

DOI: 10.1088/1367-2630/ab7d7d

Tokyo University of Science (TUS) is a well-known and respected university, and the largest science-specialized private research university in Japan, with four campuses in central Tokyo and its suburbs and in Hokkaido. Established in 1881, the university has continually contributed to Japan's development in science through inculcating a love of science in researchers, technicians, and educators.

With a mission of "Creating science and technology for the harmonious development of nature, human beings, and society," TUS has undertaken a wide range of research from basic to applied science. TUS has embraced a multidisciplinary approach to research and undertaken intensive study in some of today's most vital fields. TUS is a meritocracy where the best in science is recognized and nurtured. It is the only private university in Japan that has produced a Nobel Prize winner and the only private university in Asia to produce Nobel Prize winners within the natural sciences field.

Website:https://www.tus.ac.jp/en/mediarelations/

Dr Jaw-Shen Tsai is currently a Professor at the Tokyo University of Science, Japan. He began research in Physics in 1975 and continues to hold interest in areas such as superconductivity, the Josephson effect, quantum physics, coherence, qubits, and artificial atoms. He has 160+ research publications to his credit and is the lead author of this paper. He has also won several awards, including Japan's Medal of Honor (the Purple Ribbon Award).

Professor Jaw-Shen Tsai

Department of Physics

Tokyo University of Science

Tsutomu Shimizu

Public Relations Division

Tokyo University of Science

Email: mediaoffice@admin.tus.ac.jp

Website: https://www.tus.ac.jp/en/mediarelations/


Read the original post:
Wiring the Quantum Computer of the Future: a Novel Simple Build with Existing Technology - Analytics Insight

RMACC’s 10th High Performance Computing Symposium to Be Held Free Online – HPCwire

BOULDER, Colo., April 22, 2020 The Rocky Mountain Advanced Computing Consortium (RMACC) will hold its 10th annual High Performance Computing Symposium as a multi-track on-line version on May 20-21. Registration for the event will be free to all who would like to attend.

The on-line Symposium will include presentations by two keynote speakers and a full slate of tutorial sessions. Another longtime Symposium tradition, a poster competition for students to showcase their own research, will also be continued. Competition winners will receive an all-expenses paid trip to SC20 in Atlanta.

Major sponsor support is being provided by Intel, Dell and HPE with additional support from ARM, IBM, Lenovo and Silicon Mechanics.

Links to the Symposium registration, its schedule, and how to enter the poster competition can be found at www.rmacc.org/hpcsymposium.

The keynote speakers are Dr. Nick Bronn, a Research Staff Member in IBM's Experimental Quantum Computing group, and Dr. Jason Dexter, a working group coordinator for the groundbreaking black hole imaging studies published by the Event Horizon Telescope.

Dr. Bronn serves at IBM's TJ Watson Research Center in Yorktown Heights, NY. He has been responsible for qubit (quantum bit) device design, packaging, and cryogenic measurement, working towards scaling up larger numbers of qubits on a device and integration with novel implementations of microwave and cryogenic hardware. He will speak on the topic "Benchmarking and Enabling Noisy Near-term Quantum Hardware."

Dr. Dexter is a member of the astrophysical and planetary sciences faculty at the University of Colorado Boulder. He will speak on the role of high performance computing in understanding what we see in the first image of a black hole. Dr. Dexter is a member of both the Event Horizon Telescope and VLTI/GRAVITY collaborations, which can now image black holes.

Their appearances, along with the many tutorial sessions, continue the RMACC's annual tradition of showcasing cutting-edge HPC achievements in both education and industry.

The largest consortium of its kind, the RMACC is a collaboration among 30 academic and government research institutions in Arizona, Colorado, Idaho, Montana, Nevada, New Mexico, Utah, Washington and Wyoming. The consortium's mission is to facilitate widespread effective use of high performance computing throughout the 9-state intermountain region.

More about the RMACC and its mission can be found at the website: www.rmacc.org.

About RMACC

Primarily a volunteer organization, the RMACC is a collaboration among 30 academic and research institutions located in Arizona, Colorado, Idaho, Montana, Nevada, New Mexico, Utah, Washington and Wyoming. The RMACC's mission is to facilitate widespread effective use of high performance computing throughout this 9-state intermountain region.

Source: RMACC

Here is the original post:
RMACC's 10th High Performance Computing Symposium to Be Held Free Online - HPCwire

Eleven Princeton faculty elected to American Academy of Arts and Sciences – Princeton University

Princeton faculty members Rubén Gallo, M. Zahid Hasan, Amaney Jamal, Ruby Lee, Margaret Martonosi, Tom Muir, Eve Ostriker, Alexander Smits, Leeat Yariv and Muhammad Qasim Zaman have been named members of the American Academy of Arts and Sciences. Visiting faculty member Alondra Nelson also was elected to the academy.

They are among 276 scholars, scientists, artists and leaders in the public, nonprofit and private sectors elected this year in recognition of their contributions to their respective fields.

Gallo is the Walter S. Carpenter, Jr., Professor in Language, Literature, and Civilization of Spain and a professor of Spanish and Portuguese. He joined the Princeton faculty in 2002. His most recent book is Conversación en Princeton (2017), with Mario Vargas Llosa, who was teaching at Princeton when he received the Nobel Prize in Literature in 2010.

Gallo's other books include Proust's Latin Americans (2014); Freud's Mexico: Into the Wilds of Psychoanalysis (2010); Mexican Modernity: The Avant-Garde and the Technological Revolution (2005); New Tendencies in Mexican Art (2004); and The Mexico City Reader (2004). He is currently working on Cuba: A New Era, a book about the changes in Cuban culture after the diplomatic thaw with the United States.

Gallo received the Gradiva award for the best book on a psychoanalytic theme and the Modern Language Association's Katherine Singer Kovacs Prize for the best book on a Latin American topic. He is a member of the board of the Sigmund Freud Museum in Vienna, where he also serves as research director.

Photo by Nick Barberio, Office of Communications

Hasan is the Eugene Higgins Professor of Physics. He studies fundamental quantum effects in exotic superconductors, topological insulators and quantum magnets to make new discoveries about the nature of matter, work that may have future applications in areas such as quantum computing. He joined the faculty in 2002 and has since led his research team to publish many influential findings.

Last year, Hasan's lab led research that discovered that certain classes of crystals with an asymmetry like biological handedness, known as chiral crystals, may harbor electrons that behave in unexpected ways. In 2015, he led a research team that first observed Weyl fermions, which, if applied to next-generation electronics, could allow for a nearly free and efficient flow of electricity in electronics, and thus greater power, especially for computers.

In 2013, Hasan was named a fellow of the American Physical Society for the experimental discovery of three-dimensional topological insulators, a new kind of quantum matter. In 2009, he received a Sloan Research Fellowship for groundbreaking research.

Photo by Tori Repp/Fotobuddy

Jamal is the Edwards S. Sanford Professor of Politics and director of the Mamdouha S. Bobst Center for Peace and Justice. She has taught at Princeton since 2003. Her current research focuses on the drivers of political behavior in the Arab world, Muslim immigration to the U.S. and Europe, and the effect of inequality and poverty on political outcomes.

Jamal also directs the Workshop on Arab Political Development and the Bobst-AUB Collaborative Initiative. She is also principal investigator for the Arab Barometer project, which measures public opinion in the Arab world. She is the former President of the Association of Middle East Women's Studies.

Her books include Barriers to Democracy (2007), which won the 2008 APSA Best Book Award in comparative democratization, and Of Empires and Citizens, which was published by Princeton University Press (2012). She is co-editor of Race and Arab Americans Before and After 9/11: From Invisible Citizens to Visible Subjects (2007) and Citizenship and Crisis: Arab Detroit after 9/11 (2009).

Photo by Tori Repp/Fotobuddy

Lee is the Forrest G. Hamrick Professor in Engineering and professor of electrical engineering. She is an associated faculty member in computer science. Lee joined the Princeton faculty in 1998. Her work at Princeton explores how the security and performance of computing systems can be significantly and simultaneously improved by hardware architecture. Her designs of secure processor architectures have strongly influenced industry security offerings and also inspired new generations of academic researchers in hardware security, side-channel attacks and defenses, secure processors and caches, and enhanced cloud computing and smartphone security.

Her research lies at the intersection of computer architecture, cybersecurity and, more recently, the branch of artificial intelligence known as deep learning.

Lee spent 17 years designing computers at Hewlett-Packard, and was a chief architect there before coming to Princeton. Among many achievements, Lee is known in the computer industry for her design of the HP Precision Architecture (HPPA or PA-RISC) that powered HP's commercial and technical computer product families for several decades, and was widely regarded as introducing key forward-looking features. In the '90s she spearheaded the development of microprocessor instructions for accelerating multimedia, which enabled video and audio streaming, leading to ubiquitous digital media. Lee is a fellow of the Association for Computing Machinery and the Institute of Electrical and Electronics Engineers.

Margaret Martonosi, the Hugh Trumbull Adams '35 Professor of Computer Science, specializes in computer architecture and mobile computing with an emphasis on power efficiency. She was one of the architects of the Wattch power modeling infrastructure, a tool that was among the first to allow computer scientists to incorporate power consumption into early-stage computer systems design. Her work helped demonstrate that power needs can help dictate the design of computing systems. More recently, Martonosi's work has also focused on architecture and compiler issues in quantum computing.

She currently serves as head of the National Science Foundation's Directorate for Computer and Information Science and Engineering, one of seven top-level divisions within the NSF. From 2017 until February 2020, she directed Princeton's Keller Center for Innovation in Engineering Education, a center focused on enabling students across the University to realize their aspirations for addressing societal problems. She is an inventor who holds seven U.S. patents and has co-authored two technical reference books on power-aware computer architecture. In 2018, she was one of 13 co-authors of a National Academies consensus study report on progress and challenges in quantum computing.

Martonosi is a fellow of the Association for Computing Machinery (ACM) and the Institute of Electrical and Electronics Engineers (IEEE). Among other honors, she has received a Jefferson Science Fellowship, the IEEE Technical Achievement Award, and the ACM SIGARCH Alan D. Berenbaum Distinguished Service Award. She joined the Princeton faculty in 1994.

Muir is the Van Zandt Williams, Jr. Class of '65 Professor of Chemistry and chair of the chemistry department. He joined Princeton in 2011 and is also an associated faculty member in molecular biology.

He leads research in investigating the physiochemical basis of protein function in complex systems of biomedical interest. By combining tools of organic chemistry, biochemistry, biophysics and cell biology, his lab has developed a suite of new technologies that provide fundamental insight into how proteins work. The chemistry-driven approaches pioneered by Muir's lab are now widely used by chemical biologists around the world.

Muir has published over 150 scientific articles and has won a number of honors for his research. He received a MERIT Award from the National Institutes of Health and is a fellow of the American Association for the Advancement of Science and the Royal Society of Edinburgh.

Nelson is the Harold F. Linder Chair in the School of Social Science at the Institute for Advanced Study and a visiting lecturer with the rank of professor in sociology at Princeton. She is president of the Social Science Research Council and is one of the country's foremost thinkers in the fields of science, technology, social inequality and race. Her groundbreaking books include "The Social Life of DNA: Race, Reparations, and Reconciliation after the Genome" (2016) and "Body and Soul: The Black Panther Party and the Fight Against Medical Discrimination" (2011). Her other books include "Genetics and the Unsettled Past: The Collision of DNA, Race, and History" (with Keith Wailoo of Princeton and Catherine Lee) and "Technicolor: Race, Technology, and Everyday Life" (with Thuy Linh Tu). In 2002 she edited "Afrofuturism," a special issue of Social Text.

Nelson's writings and commentary also have reached the broader public through a variety of outlets. She has contributed to national policy discussions on inequality and the implications of new technology on society.

She is an elected fellow of the American Academy of Political and Social Science, the Hastings Center and the Sociological Research Association. She serves on several advisory boards, including the Andrew W. Mellon Foundation and the American Association for the Advancement of Science.

Ostriker, professor of astrophysical sciences, studies the universe. Her research is in the area of theoretical and computational astrophysics, and the tools she uses are powerful supercomputers and algorithms capable of simulating the birth, life, death and reincarnation of stars in their galactic homes. Ostriker and her fellow researchers build computer models using fundamental physical laws - ones that govern gravity, fluid dynamics and electromagnetic radiation - to follow the evolution of conditions found in deep space.

Ostriker, who came to Princeton in 2012, and her team have explored the formation of superbubbles, giant fronts of hot gas that billow out from a cluster of supernova explosions. More recently, she and her colleagues turned their focus toward interstellar clouds.

The research team uses computing resources through the Princeton Institute for Computational Science and Engineering and its TIGER and Perseus research computing clusters, as well as supercomputers administered through NASA. In 2017, Ostriker received a Simons Investigator Award.

Photo by Nick Donnoli, Office of Communications

Smits is the Eugene Higgins Professor of Mechanical and Aerospace Engineering, Emeritus. His research spans the field of fluid mechanics, including fundamental turbulence, supersonic and hypersonic flows, bio-inspired flows, sports aerodynamics, and novel energy-harvesting concepts.

He joined the Princeton faculty in 1981 and transferred to emeritus status in 2018. Smits served as chair of the Department of Mechanical and Aerospace Engineering for 13 years and was director of the Gas Dynamics Laboratory on the Forrestal Campus for 33 years. During that time, he received several teaching awards, including the President's Award for Distinguished Teaching.

Smits has written more than 240 articles and three books, and edited seven volumes. He was awarded seven patents and helped found three companies. He is a member of the National Academy of Engineering and a fellow of the American Physical Society, the American Institute of Aeronautics and Astronautics, the American Society of Mechanical Engineers, the American Association for the Advancement of Science, and the Australasian Fluid Mechanics Society.

Yariv is the Uwe Reinhardt Professor of Economics. An expert in applied theory and experimental economics, her research interests concentrate on game theory, political economy, psychology and economics. She joined the faculty in 2018. Yariv also is director of the Princeton Experimental Laboratory for the Social Sciences.

She is a member of several professional organizations and is lead editor of American Economic Journal: Microeconomics, a research associate with the Political Economy Program of the National Bureau of Economic Research, and a research fellow with the Industrial Organization Programme of the Centre for Economic Policy Research.

She is also a fellow of the Econometric Society and the Society for the Advancement of Economic Theory, and has received numerous grants for research and awards for her many publications.

Zaman, who joined the Princeton faculty in 2006, is the Robert H. Niehaus '77 Professor of Near Eastern Studies and Religion and chair of the Department of Near Eastern Studies.

He has written on the relationship between religious and political institutions in medieval and modern Islam, on social and legal thought in the modern Muslim world, on institutions and traditions of learning in Islam, and on the flow of ideas between South Asia and the Arab Middle East. He is the author of Religion and Politics under the Early Abbasids (1997), The Ulama in Contemporary Islam: Custodians of Change (2002), Ashraf Ali Thanawi: Islam in Modern South Asia (2008), Modern Islamic Thought in a Radical Age: Religious Authority and Internal Criticism (2012), and Islam in Pakistan: A History (2018). With Robert W. Hefner, he is also the co-editor of Schooling Islam: The Culture and Politics of Modern Muslim Education (2007); with Roxanne L. Euben, of Princeton Readings in Islamist Thought (2009); and, as associate editor, with Gerhard Bowering et al., of the Princeton Encyclopedia of Islamic Political Thought (2013). Among his current projects is a book on South Asia and the wider Muslim world in the 18th and 19th centuries.

In 2017, Zaman received Princeton's Graduate Mentoring Award. In 2009, he received a Guggenheim Fellowship.

The mission of the academy: Founded in 1780, the American Academy of Arts and Sciences honors excellence and convenes leaders from every field of human endeavor to examine new ideas, address issues of importance to the nation and the world, and work together "to cultivate every art and science which may tend to advance the interest, honor, dignity, and happiness of a free, independent, and virtuous people."

View post:
Eleven Princeton faculty elected to American Academy of Arts and Sciences - Princeton University

Advanced Encryption Standard (AES): What It Is and How It Works – Hashed Out by The SSL Store

Understanding the advanced encryption standard on a basic level doesn't require a higher degree in computer science or Matrix-level consciousness - let's break AES encryption down into layman's terms

Hey, all. We know information security has been a hot topic since, well, forever. We entrust our personal and sensitive information to lots of major entities and still have problems with data breaches, data leaks, etc. Some of this happens because of weak security protocols in networking or bad authentication management practices, but, really, there are many ways that data breaches can occur. The actual process of decrypting a ciphertext without a key, however, is far more difficult. For that, we can thank encryption algorithms like the popular advanced encryption standard and the secure keys that scramble our data into indecipherable gibberish.

Let's look into how AES works and the different applications for it. We'll be getting a little into some Matrix-based math, so grab your red pills and see how far this rabbit hole goes.

Let's hash it out.

You may have heard of the advanced encryption standard, or AES for short, but may not know the answer to the question "what is AES?" Here are four things you need to know about AES:

The National Institute of Standards and Technology (NIST) established AES as an encryption standard nearly 20 years ago to replace the aging data encryption standard (DES). After all, AES encryption keys can go up to 256 bits, whereas DES stopped at just 56 bits. NIST could have chosen a cipher that offered greater security, but the tradeoff would have required greater overhead that wouldn't be practical. So, they went with one that had great all-around performance and security.

AES's results are so successful that many entities and agencies have approved it and utilize it for encrypting sensitive information. The National Security Agency (NSA), as well as other governmental bodies, utilize AES encryption and keys to protect classified or other sensitive information. Furthermore, AES is often included in commercial products, including but not limited to:

Although it wouldn't literally take forever, it would take far longer than any of our lifetimes to crack an AES 256-bit encryption key using modern computing technology. This is from a brute force standpoint, as in trying every combination until we hear the click/unlocking sound. Certain protections are put in place to prevent stuff like this from happening quickly, such as a limit on password attempts before a lockdown (which may or may not include a time lapse) occurs before trying again. When we are dealing with computation in milliseconds, waiting 20 minutes to try another five times would seriously add to the time taken to crack a key.

Just how long would it take? We are venturing into a thousand monkeys working on a thousand typewriters to write A Tale of Two Cities territory. The number of possible combinations for AES 256-bit encryption is 2^256. Even if a computer can do multiple quadrillions of instructions per second, we are still in that eagles-wings-eroding-Mount-Everest time frame.
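
To get a rough feel for that scale, here is a back-of-the-envelope calculation in Python; the one-quadrillion-guesses-per-second rate is an illustrative assumption, not a benchmark of any real hardware:

```python
# Rough brute-force arithmetic: time to try every 256-bit key at an assumed
# rate of one quadrillion (10^15) key guesses per second.
total_keys = 2 ** 256
guesses_per_year = 10 ** 15 * 60 * 60 * 24 * 365

print(f"{total_keys:.3e} possible keys")                        # ~1.158e+77
print(f"{total_keys / guesses_per_year:.3e} years to try them all")  # ~3.7e+54 years
```

For comparison, the universe is only on the order of 1.4e+10 years old, which is why brute force simply isn't a realistic threat here.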

Needless to say, it's waaaaaaaaaaaaaaaaaaay (there's not enough memory on our computers to support the number of a's that I want to convey) longer than our current universe has been in existence. And that's just for a 16-byte block of data. So, as you can see, brute forcing AES - even if it is 128-bit AES - is futile.

That would likely change, though, once quantum computing becomes a little more mainstream, available, and effective. Quantum computing is expected to break AES encryption and require other methods to protect our data, but that's still a ways down the road.


To better understand what AES is, you need to understand how it works. But in order to see how the advanced encryption standard actually works, we first need to look at how it is set up and the rules concerning the process, based on the user's selection of encryption strength. Typically, when we discuss using higher bit levels of security, we're looking at things that are more secure and more difficult to break or hack. While the data blocks are always broken up into 128 bits, the key comes in a few varying lengths: 128 bits, 192 bits, and 256 bits. What does this mean? Let's back it up for a second here.

We know that encryption typically deals in the scrambling of information into something unreadable and an associated key to decrypt the scramble. AES scramble procedures use four scrambling operations in rounds, meaning that it will perform the operations and then repeat the process, based on the previous round's results, X number of times. Simplistically, if we put in X and get out Y, that would be one round. We would then put Y through the paces and get out Z for round 2. Rinse and repeat until we have completed the specified number of rounds.

The AES key size, specified above, will determine the number of rounds that the procedure will execute. For example, a 128-bit key runs 10 rounds, a 192-bit key runs 12 rounds, and a 256-bit key runs 14 rounds.

As mentioned, each round has four operations.

So, you've arrived this far. Now, you may be asking: why, oh why, didn't I take the blue pill?

Before we get to the operational parts of the advanced encryption standard, let's look at how the data is structured. What we mean is that the data that the operations are performed upon is not left-to-right sequential as we normally think of it. It's stacked in a 4×4 matrix of 128 bits (16 bytes) per block, in an array that's known as a state. A state looks something like this:

So, if your message was "blue pill or red," it would look something like this:

So, just to be clear, this is just one 16-byte block; every group of 16 bytes in a file is arranged in such a fashion. At this point, the systematic scramble begins through the application of each AES encryption operation.
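
If you'd like to see that layout in code, here is a minimal Python sketch (my own illustration, not from the original article) of how a 16-byte block such as "blue pill or red" is poured into the 4×4 state column by column:

```python
def bytes_to_state(block):
    # AES fills the 4x4 state column by column: bytes 0-3 become column 0,
    # bytes 4-7 become column 1, and so on.
    assert len(block) == 16
    return [[block[row + 4 * col] for col in range(4)] for row in range(4)]

state = bytes_to_state(b"blue pill or red")
for row in state:
    print([chr(b) for b in row])
# Reading down each column, left to right, spells out the original message.
```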

As mentioned earlier, once we have our data arrangement, there are certain linked operations that will perform the scramble on each state. The purpose here is to convert the plaintext data into ciphertext through the use of a secret key.

The four types of AES operations are as follows (note: we'll get into the order of the operations in the next section):

As mentioned earlier, the key size determines the number of rounds of scrambling that will be performed. AES encryption uses the Rijndael Key Schedule, which derives the subkeys from the main key to perform the Key Expansion.

The AddRoundKey operation takes the current state of the data and executes the XOR Boolean operation against the current round subkey. XOR means "exclusive or," which will yield a result of true if the inputs differ (e.g. one input must be 1 and the other input must be 0 to be true). There will be a unique subkey per round, plus one more (which will run at the end).
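
In code, AddRoundKey is a one-line XOR over the state; this sketch assumes the state and round subkey are both 4×4 arrays of byte values, as in the layout example above:

```python
def add_round_key(state, round_key):
    # XOR every byte of the state with the corresponding byte of this round's subkey.
    return [[state[r][c] ^ round_key[r][c] for c in range(4)] for r in range(4)]
```

Because XOR is its own inverse, applying the same subkey again during decryption undoes this step exactly.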

The SubBytes operation, which stands for substitute bytes, will take the 16-byte block and run it through an S-Box (substitution box) to produce an alternate value. Simply put, the operation will take a value and then replace it by spitting out another value.

The actual S-Box operation is a complicated process, but just know that it's nearly impossible to decipher with conventional computing. Coupled with the rest of the AES operations, it will do its job to effectively scramble and obfuscate the source data. The S in the white box in the image above represents the complex lookup table for the S-Box.
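
In practice the S-Box is just a table of 256 precomputed values, but it can also be generated from two well-defined steps: inversion in the finite field GF(2^8), followed by an affine transformation. Here is a hedged, illustrative sketch (a slow reference construction, not production code):

```python
def gf_mul(a, b):
    # Multiply two bytes in GF(2^8) using the AES reduction polynomial x^8 + x^4 + x^3 + x + 1.
    product = 0
    for _ in range(8):
        if b & 1:
            product ^= a
        carry = a & 0x80
        a = (a << 1) & 0xFF
        if carry:
            a ^= 0x1B
        b >>= 1
    return product

def gf_inverse(a):
    # Brute-force multiplicative inverse; the S-Box maps 0 to 0 by convention.
    if a == 0:
        return 0
    return next(x for x in range(1, 256) if gf_mul(a, x) == 1)

def sub_byte(b):
    # SubBytes for a single byte: GF(2^8) inversion, then the AES affine transformation.
    inv = gf_inverse(b)
    result = inv
    for shift in range(1, 5):
        result ^= ((inv << shift) | (inv >> (8 - shift))) & 0xFF  # circular left rotations
    return result ^ 0x63

assert sub_byte(0x00) == 0x63 and sub_byte(0x01) == 0x7C  # matches the published S-Box table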

The ShiftRows operation is a little more straightforward and is easier to understand. Based on the arrangement of the data, the idea of ShiftRows is to move the positions of the data in their respective rows with wrapping. Remember, the data is arranged in a stacked arrangement and not left to right like most of us are used to reading. The image provided helps to visualize this operation.

The first row goes unchanged. The second row shifts the bytes to the left by one position with row wraparound. The third row shifts the bytes one position beyond that, moving the bytes to the left by a total of two positions with row wraparound. Likewise, the fourth row shifts the bytes to the left by a total of three positions with row wraparound.
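
In code, that row rotation is about one line per row; this sketch again assumes the 4×4 row-major state from the earlier examples:

```python
def shift_rows(state):
    # Row r is rotated left by r positions with wraparound; row 0 is left untouched.
    return [row[r:] + row[:r] for r, row in enumerate(state)]
```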

The MixColumns operation, in a nutshell, is a linear transformation of the columns of the dataset. It uses matrix multiplication and bitwise XOR addition to output the results. The column data, which can be represented as a 4×1 matrix, is multiplied against a fixed 4×4 matrix, with the arithmetic carried out in a structure called a Galois field, so that each input column maps to an output column of the same size. That will look something like the following:

As you can see, four input bytes are run against a 4×4 matrix. In this case, the matrix multiplication has each input byte affecting each output byte and, obviously, yields an output of the same size.
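
As a sketch of what that multiplication looks like for a single column (reusing the gf_mul helper from the SubBytes example above; the matrix of 2s, 3s and 1s is the standard fixed AES MixColumns matrix):

```python
MIX_MATRIX = [
    [2, 3, 1, 1],
    [1, 2, 3, 1],
    [1, 1, 2, 3],
    [3, 1, 1, 2],
]

def mix_column(column):
    # Multiply a 4x1 column by the fixed 4x4 matrix, with multiplication done in GF(2^8)
    # and "addition" replaced by XOR, as MixColumns requires.
    return [
        gf_mul(MIX_MATRIX[r][0], column[0])
        ^ gf_mul(MIX_MATRIX[r][1], column[1])
        ^ gf_mul(MIX_MATRIX[r][2], column[2])
        ^ gf_mul(MIX_MATRIX[r][3], column[3])
        for r in range(4)
    ]
```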

Now that we have a decent understanding of the different operations utilized to scramble our data via AES encryption, we can look at the order in which these operations execute. It will be as such: an initial AddRoundKey, followed by repeated rounds of SubBytes, ShiftRows, MixColumns and AddRoundKey, with a final round that leaves one operation out.

Note: The MixColumns operation is not in the final round. Without getting into the actual math of this, there's no additional benefit to performing this operation. In fact, doing so would simply make the decryption process a bit more taxing in terms of overhead.
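
Putting the pieces together, the ordering works out to the loop below. This is a structural sketch only: add_round_key, sub_bytes, shift_rows and mix_columns are assumed to be state-level wrappers around the per-byte and per-column helpers sketched earlier, and round_keys is assumed to come from the Rijndael key schedule.

```python
def encrypt_block(state, round_keys, num_rounds):
    # num_rounds is 10, 12 or 14 depending on key size; round_keys holds num_rounds + 1 subkeys.
    state = add_round_key(state, round_keys[0])            # initial whitening step
    for r in range(1, num_rounds):
        state = sub_bytes(state)
        state = shift_rows(state)
        state = mix_columns(state)
        state = add_round_key(state, round_keys[r])
    # Final round: MixColumns is skipped, as noted above.
    state = sub_bytes(state)
    state = shift_rows(state)
    state = add_round_key(state, round_keys[num_rounds])
    return state
```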

If we consider the number of rounds and the operations per round that are involved, by the end of it, you should have a nicely scrambled block. And that is only a 16-byte block. Consider how much information that equates to in the big picture. It's minuscule when compared to today's file/packet sizes! So, if each 16-byte block has seemingly no discernible pattern - at least, no pattern that can be deciphered in a timely manner - I'd say AES has done its job.

We know the advanced encryption standard algorithm itself is quite effective, but its level of effectiveness depends on how it's implemented. Unlike the brute force attacks mentioned above, effective attacks are typically launched on the implementation and not on the algorithm itself. This can be equated to attacking users - as in phishing attacks - versus attacking the technology behind the service/function, which may be hard to breach. These can be considered side-channel attacks, where the attacks are carried out on other aspects of the entire process and not the focal point of the security implementation.

While I always advocate going with a reasonable/effective security option, a lot of AES encryption is happening without you even knowing it. It's locking down spots of the computing world that would otherwise be wide open. In other words, there would be many more opportunities for hackers to capture data if the advanced encryption standard weren't implemented at all. We just need to know how to identify the open holes and figure out how to plug them. Some may be able to use AES and others may need another protocol or process.

Appreciate the encryption implementations we have, use the best ones when needed, and happy scrutinizing!

Continue reading here:
Advanced Encryption Standard (AES): What It Is and How It Works - Hashed Out by The SSL Store

Carole Carson: Adventures in Aging - Seven myths about getting older – The Union of Grass Valley

How many of these myths do you accept as reality?

Myth 1: "When it comes down to it, aging is just another disease," asserts David Sinclair, PhD, a Harvard professor.

He is convinced that aging, like obesity, is a pathological condition that scientists will eradicate.

Reality: "If aging is a disease, it must be highly contagious because all my patients get it," says Dr. Todd Bouchier, a Grass Valley physician. "And everyone over the age of 65 has an advanced case."


"Humor aside," Dr. Bouchier continues, "aging is not a disease. Over time, mountains crumble, barns collapse, and cells degenerate. Aging is a fact of nature. Scientists get excited about the possibility of escaping the preprogrammed aspects of cellular aging. No doubt, we'll make gains and eventually live longer but won't eliminate aging. Making the final years as meaningful as possible is the goal."

Myth 2: All seniors are alike and are best described as sexless, toothless, prune juice-drinking dribblers who watch daytime television and shuffle like Tim Conway.

Reality: People live more diverse lives over time. People in their 20s are more alike than folks in their 80s. We even age differently. Four distinct ageotypes - metabolic, immune, hepatic (liver), and nephrotic (kidney) - determine how and where in the body biologic aging occurs.

As for sex, studies show that seniors enjoy sex and variations of sexual activity beyond middle age. Moreover, the need for intimacy - touching, hugging, or holding hands - is timeless.

Myth 3: Old timers are a drain on society, sucking up resources the younger folks need. The fewer seniors in a community, the healthier it is. The coronavirus can thin the herd.

Reality: Over 1,200 nonprofit and 501(c) organizations operate in Nevada County, enriching our community in immeasurable ways. These civic and social activities rely heavily on seniors for funding and volunteer support (estimated at 10,000 hours annually).

Plus, increasing numbers of seniors work. And even those who aren't on a payroll still work as grandparents and caregivers.

As for welfare, older people have emerged as the wealthiest segment of our population.

Myth 4: Seniors don't need or buy much; hence, commercials focus on young people, except for depressing pharmaceutical ads.

Reality: The 65-and-older population is "the mother of all untapped markets," according to Barron's. In 2015, Americans ages 50 and up spent nearly $8 trillion. By 2030, the 55-and-older population will have accounted for half of all domestic consumer spending growth.

And even when household income for older people is at or below the median, they have as much or more disposable income as young people with the same income.

Myth 5: You can't teach an old dog new tricks. Technology is wasted on seniors. Humans are born with a finite number of brain cells that die off with aging.

Reality: Learning patterns may change and the speed of learning may diminish, but the basic capacity to learn is retained. As for technology, in 2000, 14% of those aged 65 and older were internet users; now 73% are.

Moreover, through the process of neurogenesis, brain cells adapt and reconnect - even regrow and replenish. Thanks to brain plasticity, we old dogs can teach young dogs some new tricks!

Myth 6: To be old is to be irritable and grumpy. Depression is inevitable given the declining trajectory of deteriorating mental and physical health.

Reality: Depression is not a normal part of aging but rather an illness requiring treatment. The course of depression in the elderly is identical to that of younger persons, and the response to treatment appears as positive as that of people in other life stages.

Myth 7: Senior moments signal the onset of dementia, a disease no one escapes if they live long enough. The lights are still on, but the voltage is low.

Reality: Forgetfulness occurs at all ages, but we're more inclined to notice it as we age. The good news is that the rate of dementia is declining and occurring at older and older ages. Only 5% of people over age 65 have dementia. In addition, some memory loss is caused by medications and medical conditions unrelated to aging.

The best news is that aging and dementia are not inextricably linked. Evidence is growing that regular exercise, healthful eating, and mentally challenging activities can preserve cognitive functions independent of age.

Accepting these myths holds us back. It cuts us off from opportunities that are jumping up and down in front of us seeking to get our attention. Knowing the truth, on the other hand, sets us free to explore our options while we celebrate the simple joy of being alive.

Next Month: Your body over time

Carole Carson, Nevada City, is an author, former AARP website contributor, and leader of the 1994 Nevada County Meltdown. Contact: carolecarson41@gmail.com.

Read more from the original source:
Carole Carson: Adventures in Aging - Seven myths about getting older - The Union of Grass Valley

Microsoft Office 365: How these Azure machine-learning services will make you more productive and efficient – TechRepublic

Office can now suggest better phrases in Word or entire replies in Outlook, design your PowerPoint slides, and coach you on presenting them. Microsoft built those features with Azure Machine Learning and big models - while keeping your Office 365 data private.

The Microsoft Office clients have been getting smarter for several years: the first version of Editor arrived in Word in 2016, based on Bing's machine learning, and it's now been extended to include the promised Ideas feature with extra capabilities. More and more of the new Office features in the various Microsoft 365 subscriptions are underpinned by machine learning.

You get the basic spelling and grammar checking in any version of Word. But if you have a subscription, Word, Outlook and a new Microsoft Editor browser extension will be able to warn you if you're phrasing something badly, using gendered idioms so common that you may not notice who they exclude, hewing so closely to the way your research sources phrased something that you need to either write it in your own words or enter a citation, or just not sticking to your chosen punctuation rules.

SEE: Choosing your Windows 7 exit strategy: Four options (TechRepublic Premium)

Word can use the real-world number comparisons that Bing has had for a while to make large numbers more comprehensible. It can also translate the acronyms you use inside your organization -- and distinguish them from what someone in another industry would mean by them. It can even recognise that those few words in bold are a heading and ask if you want to switch to a heading style so they show up in the table of contents.

Outlook on iOS uses machine learning to turn the timestamp on an email to a friendlier 'half an hour ago' when you have it read out your messages. Mobile and web Outlook use machine learning and natural-language processing to suggest three quick replies for some messages, which might include scheduling a meeting.

Excel has the same natural-language queries for spreadsheets as Power BI, letting you ask questions about your data. PowerPoint Designer can automatically crop pictures, put them in the right place on the slide and suggest a layout and design; it uses machine learning for text and slide structure analysis, image categorisation, recommending content to include and ranking the layout suggestions it makes. The Presenter Coach tells you if you're slouching, talking in a monotone or staring down at your screen all the time while you're talking, using machine learning to analyse your voice and posture from your webcam.

How PowerPoint Designer uses AML (Azure Machine Learning).

Image: Microsoft

Many of these features are built using the Azure Machine Learning service, Erez Barak, partner group program manager for AI Platform Management, told TechRepublic. At the other extreme, some features call the pre-built Azure Cognitive Services APIs for things like speech recognition in the presentation coach, as well as captioning PowerPoint presentations in real-time and live translation into 60-plus languages (and those APIs are themselves built using AML).

Other features are based on customising pre-trained models like Turing Neural Language Generation, a seventeen-billion parameter deep-learning language model that can answer questions, complete sentences and summarize text -- useful for suggesting alternative phrases in Editor or email replies in Outlook. "We use those models in Office after applying some transfer learning to customise them," Barak explained. "We leverage a lot of data, not directly but by the transfer learning we do; that's based on big data to give us a strong natural-language understanding base. For everything we do in Office requires that context; we try to leverage the data we have from big models -- from the Turing model especially given its size and its leadership position in the market -- in order to solve for specific Office problems."

AML is a machine-learning platform for both Microsoft product teams and customers to build intelligent features that can plug into business processes. It provides automated pipelines that take large amounts of data stored in Azure Data Lake, merge and pre-process the raw data, and feed them into distributed training running in parallel across multiple VMs and GPUs. The machine-learning version of the automated deployment common in DevOps is known as MLOps. Office machine-learning models are often built using frameworks like PyTorch or TensorFlow; the PowerPoint team uses a lot of Python and Jupyter notebooks.

The Office data scientists experiment with multiple different models and variations; the best model then gets stored back into Azure Data Lake and downloaded into AML using the ONNX runtime (open-sourced by Microsoft and Facebook) to run in production without having to be rebuilt. "Packaging the models in the ONNX runtime, especially for PowerPoint Designer, helps us to normalise the models, which is great for MLOps; as you tie these into pipelines, the more normalised assets you have, the easier, simpler and more productive that process becomes," said Barak.
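
As a rough illustration of that ONNX hand-off, here is a generic sketch; it is not Microsoft's actual pipeline code, and the tiny stand-in model and file name are invented for the example:

```python
import torch
import onnxruntime as ort

# Stand-in for a trained ranking model; the real Designer models are far larger.
model = torch.nn.Sequential(
    torch.nn.Linear(128, 64),
    torch.nn.ReLU(),
    torch.nn.Linear(64, 1),
).eval()
example_input = torch.randn(1, 128)

# Export once to the ONNX format so the trained artifact can be served without being rebuilt.
torch.onnx.export(model, example_input, "designer_ranker.onnx",
                  input_names=["features"], output_names=["score"])

# Load and score with ONNX Runtime, the runtime the article says Office uses in production.
session = ort.InferenceSession("designer_ranker.onnx")
(score,) = session.run(["score"], {"features": example_input.numpy()})
print(score)
```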

ONNX also helps with performance when it comes to running the models in Office, especially for Designer. "If you think about the number of inference calls or scoring calls happening, performance is key: every small percentage and sub-percentage point matters," Barak pointed out.

A tool like Designer that's suggesting background images and videos to use as content needs a lot of compute and GPU to be fast enough. Some of the Turing models are so large that they run on the FPGA-powered Brainwave hardware inside Azure because otherwise they'd be too slow for workloads like answering questions in Bing searches. Office uses the AML compute layer for training and production which, Barak said, "provides normalised access to different types of compute, different types of machines, and also provides a normalised view into the performance of those machines".

"Office's training needs are pretty much bleeding edge: think long-running, GPU-powered, high-bandwidth training jobs that could run for days, sometimes for weeks, across multiple cores, and require a high level of visibility into the end process as well as a high level of reliability," Barak explained. "We leverage a lot of high-performing GPUs for both training the base models and transfer learning." Although the size of training data varies between the scenarios, Barak estimates that fine-tuning the Turing base model with six months of data would use 30-50TB of data (on top of the data used to train the original model).

Acronyms accesses your Office 365 data, because it needs to know which acronyms your organisation uses.

Image: Mary Branscombe/TechRepublic

The data used to train Editor's rewrite suggestions includes documents written by people with dyslexia, and many of the Office AI features use anonymised usage data from Office 365 usage. Acronyms is one of the few features that specifically uses your own Office 365 data, because it needs to find out which acronyms your organisation uses, but that isn't shared with any other Office users. Microsoft also uses public data for many features rather than trying to mine that from private Office documents. The similarity checker uses Bing data, and Editor's sentence rewrite uses public data like Wikipedia as well as public news data to train on.

As the home of so many documents, Office 365 has a wealth of data, but it also has strong compliance policies and processes that Microsoft's data scientists must follow. Those policies change over time as laws change or Office gets accredited to new standards -- "think of it as a moving target of policies and commitments Office has made in the past and will continue to make," Barak suggested. "In order for us to leverage a subset of the Office data in machine learning, naturally, we adhere to all those compliance promises."

LEARN MORE: Office 365 Consumer pricing and features

But models like those used in Presentation Designer need frequent retraining (at least every month) to deal with new data, such as which of the millions of slide designs it suggests get accepted and are retained in presentations. That data is anonymised before it's used for training, and the training is automated with AML pipelines. But it's important to score retrained models consistently with existing models so you can tell when there's an improvement, or if an experiment didn't pan out, so data scientists need repeated access to data.

"People continuously use that, so we continuously have new data around people's preferences and choices, and we want to continuously retrain. We can't have a system that needs to be adjusted over and over again, especially in the world of compliance. We need to have a system that's automatable. That's reproducible -- and frankly, easy enough for those users to use," Barak said.

"They're using AML Data Sets, which allow them to access this data while using the right policies and guard rails, so they're not creating copies of the data -- which is a key piece of keeping the compliance and trust promise we make to customers. Think of them as pointers and views into subsets of the data that data scientists want to use for machine learning."It's not just about access; it's about repeatable access, when the data scientists say 'let's bring in that bigger model, let's do some transfer learning using the data'. It's very dynamic: there's new data because there's more activity or more people [using it]. Then the big models get refreshed on a regular basis. We don't just have one version of the Turing model and then we're done with it; we have continuous versions of that model which we want to put in the hands of data scientists with an end-to-end lifecycle."

Those data sets can be shared without the risk of losing track of the data, which means other data scientists can run experiments on the same data sets. This makes it easier for them to get started developing a new machine-learning model.
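
For a sense of what working against a registered AML data set looks like from the data scientist's side, here is a hedged sketch using the Azure ML Python SDK; the workspace config and dataset name are placeholders, not the Office team's actual assets:

```python
from azureml.core import Workspace, Dataset

# Connect to the workspace described by a local config.json (placeholder setup).
workspace = Workspace.from_config()

# Reference a registered, versioned dataset by name; this is a pointer into the data,
# not a copy of it.
dataset = Dataset.get_by_name(workspace, name="designer-telemetry", version="latest")

# Materialise only what this experiment needs, e.g. a pandas DataFrame for local exploration.
df = dataset.to_pandas_dataframe()
print(df.shape)
```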

Getting AML right for Microsoft product teams also helps enterprises who want to use AML for their own systems. "If we nail the likes and complexities of Office, we enable them to use machine learning in multiple business processes," Barak said. "And at the same time we learn a lot about automation and requirements around compliance that also very much applies to a lot of our third-party customers."


Read more:
Microsoft Office 365: How these Azure machine-learning services will make you more productive and efficient - TechRepublic

Apple is on a hiring freeze … except for its Hardware, Machine Learning and AI teams – Thinknum Media

Word in the tech community is that Apple ($NASDAQ:AAPL) employees are beginning to report hiring freezes for certain groups within the company. But other reports are that hiring is continuing at the Cupertino tech giant. In fact, we've reported on the former.

It turns out that both reports are correct. For some divisions, like Marketing and Corporate Functions, openings have been reduced. But for others, like Hardware and Machine Learning, openings and subsequent hiring appear to be as brisk as ever.

To be clear, overall, job listings at Apple have been cut back.

As recently as mid-March, Apple job listings were nearing the 6,000 mark, which would have been the company's most prolific hiring spree in history. But in late March, it became clear that no one would be going into the office any time soon, and openings quickly began disappearing from Apple's recruitment site. As of this week, openings at Apple are down to 5,240, signaling a decrease in hiring of about 13%.

But not all divisions are stalling their job listings. Neither Apple's "Hardware" nor "Machine Learning and AI" groups show a decline in job listings of note.

Hardware openings are flat at worst. Today's 1,570 openings isn't significantly different from the high of 1,600 in March.

Apple's "Machine Learning and AI" group remains as healthy as ever when it comes to new listings being posted to the company's careers sites. As of this week, the team has 334 openings. Last month, that number was 300, an 11% increase in hiring activity.

However, other groups at Apple have seen significant decreases in job listings, including "Software and Services", "Marketing", and "Corporate Functions".

Apple's "Software and Services" team saw a siginificant drop in openings, particularly on April 10, when around 110 openings were cut from the company's recruiting website overnight. Since mid-March, openings on the team have fallen by about 12%.

Between April 14 and April 23, the number of listings for Apple's "Marketing" team dropped by 84. In late March, Apple was seeking 311 people for its Marketing team. Since then, openings have fallen by 36% for the team.

"Corporate Functions" jobs at Apple, which include everything from HR to Finance and Legal, have also seen a steep decline in recent weeks. In late March, Apple listed more than 300 openings for the team. As of this week, it has just around 200 openings, a roughly 1/3 hiring freeze.

So is Apple in the middle of a hiring freeze? Some parts of the company appear frozen. Others appear as hot as ever. Given the in-person nature of Marketing and Corporate Functions jobs, it's not surprising that the company would tap the brakes on interviewing for such positions. On the other hand, engineers working on hardware and machine learning can be remotely interviewed and onboarded with equipment delivery.

So, yes, and yes. Apple is, and is not, in the middle of a hiring freeze.

Thinknum tracks companies using the information they post online - jobs, social and web traffic, product sales and app ratings - and creates data sets that measure factors like hiring, revenue and foot traffic. Data sets may not be fully comprehensive (they only account for what is available on the web), but they can be used to gauge performance factors like staffing and sales.

See original here:
Apple is on a hiring freeze ... except for its Hardware, Machine Learning and AI teams - Thinknum Media

IBM’s The Weather Channel app using machine learning to forecast allergy hotspots – TechRepublic

The Weather Channel is now using artificial intelligence and weather data to help people make better decisions about going outdoors based on the likelihood of suffering from allergy symptoms.

Amid the COVID-19 pandemic, most people are taking precautionary measures in an effort to ward off coronavirus, which is highly communicable and dangerous. It's no surprise that we gasp at every sneeze, cough, or even sniffle, from others and ourselves. Allergy sufferers may find themselves apologizing awkwardly, quickly indicating they don't have COVID-19, but have allergies, which are often treated with sleep-inducing antihistamines that cloud critical thinking.

The most common culprits and indicators used to predict symptoms - ragweed, grass, and tree pollen readings - are often inconsistently tracked across the country. But artificial intelligence (AI) innovation from IBM's The Weather Channel is coming to the rescue of the roughly 50 million Americans who suffer from allergies.

The Weather Channel's new tool shows a 15-day allergy forecast based on ML.

Image: Teena Maddox/TechRepublic

IBM's The Weather Channel is now using machine learning (ML) to forecast allergy symptoms. IBM data scientists developed a new tool on The Weather Channel app and weather.com, "Allergy Insights with Watson" to predict your risk of allergy symptoms.

Weather can also drive allergy behaviors. "As we began building this allergy model, machine learning helped us teach our models to use weather data to predict symptoms," said Misha Sulpovar, product leader, consumer AI and ML, IBM Watson media and weather. Sulpovar's role is focused on using machine learning and blockchain to develop innovative and intuitive new experiences for the users of the Weather Channel's digital properties, specifically, weather.com and The Weather Channel smart phone apps.

SEE: IBM's The Weather Channel launches coronavirus map and app to track COVID-19 infections (TechRepublic)

Any allergy sufferer will tell you it can be absolutely miserable. "If you're an allergy sufferer, you understand that knowing in advance when your symptom risk might change can help anyone plan ahead and take action before symptoms may flare up," Sulpovar said. "This allergy risk prediction model is much more predictive around users' symptoms than other allergy trackers you are used to, which mostly depend on pollen - an imperfect factor."

Sulpovar said the project has been in development for about a year, and said, "We included the tool within The Weather Channel app and weather.com because digital users come to us for local weather-related information," and not only to check weather forecasts, "but also for details on lifestyle impacts of weather on things like running, flu, and allergy."

He added, "Knowing how patients feel helps improve the model. IBM MarketScan (research database) is anonymized data from doctor visits of 100 million patients."

Daily pollen counts are also available on The Weather Channel app.

Image: Teena Maddox/TechRepublic

"A lot of what drives allergies are environmental factors like humidity, wind, and thunderstorms, as well as when specific plants in specific areas create pollen," Sulpovar said. "Plants have predictable behaviorfor example, the birch tree requires high humidity for birch pollen to burst and create allergens. To know when that will happen in different locations for all different species of trees, grasses, and weeds is huge, and machine learning is a huge help to pull it together and predict the underlying conditions that cause allergens and symptoms. The model will select the best indicators for your ZIP code and be a better determinant of atmospheric behavior."

"Allergy Insights with Watson" anticipates allergy symptoms up to 15 days in advance. AI, Watson, and its open multi-cloud platform help predict and shape future outcomes, automate complex processes, and optimize workers' time. IBM's The Weather Channel and weather.com are using this machine learning Watson to alleviate some of the problems wrought by allergens.

Sulpovar said, "Watson is IBM's suite of enterprise-ready AI services, applications, and tooling. Watson helps unlock value from data in new ways, at scale."

Data scientists have discovered a more accurate representation of allergy conditions. "IBM Watson machine learning trained the model to combine multiple weather attributes with environmental data and anonymized health data to assess when the allergy symptom risk is high," Sulpovar explained. "The model more accurately reflects the impact of allergens on people across the country in their day-to-day lives."

The model is challenged by changing conditions and the impact of climate change, but it has delivered a 25% to 50% improvement in decision making based on allergy symptoms.

It may surprise long-time allergy sufferers who often cite pollen as the cause of allergies that "We found pollen is not a good predictor of allergy risk alone and that pollen sources are unreliable and spotty and cover only a small subset of species," Sulpovar explained. "Pollen levels are measured by humans in specific locations, but sometimes those measurements are few and far between, or not updated often. Our team found that using AI and weather data instead of just pollen data resulted in a 25-50% increase in making better decisions based on allergy symptoms."

Available on The Weather Channel app for iOS and Android, the tool can also be found online at www.weather.com. Users of the tool will be given an accurate forecast, be alerted to flare-ups, and be provided with practical tips to reduce seasonal allergies.

This story was updated on April 23, 2020 to correct the spelling of Misha Sulpovar's name.



Read the original here:
IBM's The Weather Channel app using machine learning to forecast allergy hotspots - TechRepublic

The industries that can’t rely on machine learning – The Urban Twist

Ever since we started relying on machines and automation, people have been worried about the future of work and, specifically, whether robots will take over their jobs. And it seems this worry is becoming increasingly justified, as an estimated 40% of jobs could be replaced by robots and automation by 2035. There is even a website dedicated to workers worried about whether they could eventually be replaced by robots.

While machines and artificial intelligence are becoming more complex and, therefore, more able to replace humans for menial tasks, that doesn't necessarily apply to a wide range of industries. Here, we'll go through the sectors that continue to require the human touch.

Despite scientists' best efforts, the language and translation industry cannot be replaced by machines. Currently, automatic translation programmes are being developed with deep learning, a form of artificial intelligence which allows the computer to identify and correct its own mistakes through prolonged use and understanding. However, this still isn't enough to guarantee a correct translation, as deep learning requires external factors, like language itself, to remain the same over time. As we know, language is constantly developing, often with changes so subtle you can't tell it's happening. For a machine to be able to accurately translate texts or speech, it would need to be constantly updated with every new modification, across all languages.

Machines are also less able to pick up on the nuances found in speech or text. Things like sarcasm, jokes, or pop culture references are not easily translated, as the new audience may not understand them. Translating idioms is a particularly common example of this, as these phrases are generally unique to their dialect. In the UK, for example, the phrase "it's raining cats and dogs" means it's raining heavily. You would not want this translated on a literal level. As London Translations state in an article on the importance of using professionals for financial text translation, literal translations are technically correct, but read awkwardly and can be difficult to comprehend due to poor knowledge of the source language. Needless to say, these issues would be totally unacceptable in a document as important as a financial report.

Translating with accuracy not only requires fluency in both languages, but also a complete understanding of cultural differences and how they can be compared. Machines are simply not able to naturally make these connections without having the information already inputted by a person.

Finding the perfect candidate for a role can get stressful, especially if you have a pool of excellent potential employees to choose from. However, there are now algorithms that recruiters can use to help speed the process up and, theoretically, pick the most suitable person for the job. The technology is being praised for its ability to remove discrimination, as it simply examines raw data, and thus omits any sense of natural prejudice. It can also work to speed up the hiring process, as a computer can quickly sift through applicants and present the most relevant ones, saving someone the job of having to manually read through every application before making a decision.

However, in practice, it's not that simple. Recruiting the right candidate should be based on more than qualifications and experience. Personality, attitude, and cultural fit should also be considered when recruiters are finding a candidate, none of which can be picked up on by machines.

One way of minimising this risk could be to introduce the algorithm at an earlier stage, through targeted ads or to help sift through initial applications. This allows recruiters to look at relevant candidates, rather than those that wouldn't have passed the initial screening anyway. However, this could conversely work to introduce bias to the recruitment process. The Harvard Business Review found that the algorithm effectively shapes the pool of candidates, giving a selection of applications that are all similar, fitting the mould that the computer is looking for. The study found that targeted ads on social media for a cashier role were shown to an audience that was 85% women, while cab driver ads were shown to an audience that was around 75% black. This happened because the algorithm reproduced bias from the real world, without human intervention. Having people physically check the applications can serve to prevent this bias, introducing a more conscious effort to carefully screen each candidate on their own merits.

More people than ever before are meeting their partners online, according to a study published by Stanford University. And while a matchmaking algorithm sounds like a dream for singletons, it doesn't mean that these apps are able to effectively set you up with your life partner. As these algorithms are actually the intellectual property of each app, Dr Samantha Joel, assistant professor at Western University in London, Canada, created her own app with colleagues. Volunteers were asked to complete a questionnaire about themselves and their ideal partners, much like typical dating websites would. After they answered over 100 questions, the data was analysed and volunteers were set up on four-minute-long speed dates with potential candidates. Joel then asked the volunteers about their feelings towards each of their dates.

These results then identified the three things needed to predict romantic interest: actor desire (how much people liked their dates), partner desire (how much people were liked by dates), and attractiveness. The researchers were able to subtract attractiveness from the scores of romantic interest, giving a measure of compatibility. However, while the algorithm could accurately predict actor and partner desire, it failed on compatibility. Instead, it may be worth sticking to the second most common way of meeting a partner: through a mutual friend. Your friends will be able to make educated decisions about relationships, as they have a deeper understanding of preferences and compatibility in a way that a machine simply can't replicate.
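One way to picture the "subtract attractiveness" step described above is to regress romantic-interest scores on attractiveness and treat what is left over as a rough compatibility signal. The numbers and the simple regression below are illustrative only; they are not Dr Joel's actual analysis.

# Illustrative reading of "subtracting attractiveness from romantic interest":
# fit interest as a linear function of attractiveness, then treat the residual
# as a rough compatibility signal. Toy numbers, not the study's real data.
import numpy as np

attractiveness = np.array([6.0, 3.5, 5.0, 2.0, 4.5])       # hypothetical 1-7 ratings
romantic_interest = np.array([6.5, 4.0, 4.5, 2.5, 5.5])    # one rater, five speed dates

# Ordinary least squares: interest ~ slope * attractiveness + intercept
A = np.column_stack([attractiveness, np.ones_like(attractiveness)])
(slope, intercept), *_ = np.linalg.lstsq(A, romantic_interest, rcond=None)

predicted_from_looks = slope * attractiveness + intercept
compatibility_residual = romantic_interest - predicted_from_looks

print("Residual 'compatibility' per date:", np.round(compatibility_residual, 2))

On this reading, the residual is precisely the part the questionnaire-based algorithm could not predict, which is why a mutual friend, who has seen how two people actually interact, may still beat the machine.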

Author Bio: Syna Smith is a chief editor of Business usa today. She also has good experience in digital marketing.

More:
The industries that can't rely on machine learning - The Urban Twist

Artificial Intelligence & Advanced Machine learning Market is expected to grow at a CAGR of 37.95% from 2020-2026 – Latest Herald

According to BlueWeave Consulting, the global Artificial Intelligence & Advanced Machine Learning market reached USD 29.8 billion in 2019 and is projected to reach USD 281.24 billion by 2026, growing at a CAGR of 37.95% during the forecast period from 2020 to 2026, owing to increasing overall global investment in Artificial Intelligence technology.
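As a quick sanity check, the quoted figures are roughly self-consistent under the standard CAGR formula, assuming the 2019 base value and seven years of growth to 2026:

# Sanity check of the quoted market figures with the standard CAGR formula:
#   CAGR = (end_value / start_value) ** (1 / years) - 1
start, end, years = 29.8, 281.24, 7   # USD billions, 2019 -> 2026

implied_cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {implied_cagr:.2%}")   # roughly 37.8%, close to the quoted 37.95%

projected_2026 = start * (1 + 0.3795) ** years
print(f"2026 value at 37.95% CAGR: {projected_2026:.1f} billion USD")   # about 283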

Request the report sample pages at: https://www.blueweaveconsulting.com/artificial-intelligence-and-advanced-machine-learning-market-bwc19415/report-sample

Artificial Intelligence (AI) is a computer-science, algorithm- and analytics-driven approach to replicating human intelligence in a machine, and Machine Learning (ML) is an enhanced application of artificial intelligence that allows software applications to predict outcomes more accurately. The development of powerful and affordable cloud computing infrastructure is having a substantial impact on the growth potential of the artificial intelligence and advanced machine learning market. In addition, the diversifying application areas of the technology, as well as growing customer satisfaction among users of AI & ML services and products, are further factors currently driving the Artificial Intelligence & Advanced Machine Learning market. Moreover, in the coming years, applications of machine learning across various industry verticals are expected to rise exponentially. The proliferation of data generation is another major driving factor for the AI & Advanced ML market. As natural learning develops, artificial intelligence and advanced machine learning technology are paving the way for effective marketing, content creation, and consumer interactions.

In the organization size segment, the large enterprises segment is estimated to hold the largest market share, while the SMEs segment is estimated to grow at the highest CAGR over the forecast period to 2026. Rapidly developing and highly active SMEs have increased the adoption of artificial intelligence and machine learning solutions globally, as a result of increasing digitization and rising cyber risks to critical business information and data. Large enterprises have been heavily adopting artificial intelligence and machine learning to extract the required information from large amounts of data and forecast the outcomes of various problems.

Predictive analytics and machine learning are increasingly used in retail, finance, and healthcare. This trend is estimated to continue as major technology companies invest resources in the development of AI and ML. Owing to the large cost savings, effort savings, and reliability benefits of AI automation, machine learning is anticipated to drive the global artificial intelligence and advanced machine learning market during the forecast period to 2026.

Digitalization has become a vital driver of the artificial intelligence and advanced machine learning market across regions. Digitalization is increasingly propelling everything from hotel bookings and transport to healthcare in many economies around the globe, and it has led to a rise in the volume of data generated by business processes. Moreover, business developers and key executives are opting for solutions that let them act as data modelers and provide them with an adaptive semantic model. With the help of artificial intelligence and advanced machine learning, business users are able to modify dashboards and reports, as well as filter or develop reports based on their key indicators.

Geographically, the Global Artificial Intelligence & Advanced Machine Learning market is segmented into North America, Asia Pacific, Europe, the Middle East, Africa, and Latin America. North America dominates the market: in the developed economies of the US and Canada there is a strong focus on innovation driven by R&D, and the region is a rapidly changing and highly competitive global market. The Asia-Pacific region is estimated to be the fastest-growing region in the global AI & Advanced ML market. Rising awareness of business productivity, supplemented by competently designed machine learning solutions offered by vendors in the region, has made Asia-Pacific a high-potential market.

Request the report description pages at: https://www.blueweaveconsulting.com/artificial-intelligence-and-advanced-machine-learning-market-bwc19415/

Artificial Intelligence & Advanced Machine Learning Market: Competitive Landscape

The major market players in the Artificial Intelligence & Advanced Machine Learning market include ICarbonX, TIBCO Software Inc., SAP SE, Fractal Analytics Inc., Next IT, Iflexion, Icreon, Prisma Labs, AIBrain, Oracle Corporation, Quadratyx, NVIDIA, Inbenta, Numenta, Intel, Domino Data Lab, Inc., Neoteric, UruIT, Waverley Software, and other prominent players, which are expanding their presence in the market by implementing various innovations and technologies.

Read more here:
Artificial Intelligence & Advanced Machine learning Market is expected to grow at a CAGR of 37.95% from 2020-2026 - Latest Herald