2020: The year of seeing clearly on AI and machine learning – ZDNet

Tom Foremski

Late last year, I complained to Richard Socher, chief scientist at Salesforce and head of its AI projects, about the term "artificial intelligence." I argued that we should use more accurate terms such as machine learning or smart machine systems, because "AI" creates unreasonably high expectations when the vast majority of applications are essentially extremely specialized machine learning systems that do specific tasks -- such as image analysis -- very well but do nothing else.

Socher said that the term rankled him, too, when he was a post-graduate, and that he preferred other descriptions such as statistical machine learning. He agrees that the "AI" systems we talk about today are very limited in scope and misidentified, but these days he thinks of AI as "Aspirational Intelligence." He likes the potential of the technology even if it isn't realized today.

I like Socher's designation of AI as Aspirational Intelligence, but I'd prefer not to further confuse the public, politicians and even philosophers about what AI is today: It is nothing more than software in a box -- a smart machine system that has no human qualities or understanding of what it does. It's a specialized machine that has nothing to do with the systems that these days are called Artificial General Intelligence (AGI).

Before ML systems co-opted it, the term AI was used to describe what AGI is used to describe today: computer systems that try to mimic humans, their rational and logical thinking, and their understanding of language and cultural meanings to eventually become some sort of digital superhuman, which is incredibly wise and always able to make the right decisions.

There has been a lot of progress in developing ML systems but very little progress on AGI. Yet the advances in ML are being attributed to advances in AGI. And that leads to confusion and misunderstanding of these technologies.

Machine learning systems, unlike AGI, do not try to mimic human thinking -- they use very different methods to train themselves on large amounts of specialist data and then apply their training to the task at hand. In many cases, ML systems make decisions without any explanation, and it's difficult to determine the value of their black-box decisions. But if those results are presented as artificial intelligence, they get far higher respect from people than they likely deserve.

For example, when ML systems used in applications such as recommending prison sentences are described as artificial intelligence systems, they gain higher regard from the people using them. It implies that the system is smarter than any judge. But if the term machine learning were used, it would underline that these are fallible machines and allow people to treat the results with some skepticism in key applications.

Even if we do develop future advanced AGI systems we should continue to encourage skepticism and we should lower our expectations for their abilities to augment human decision making. It is difficult enough to find and apply human intelligence effectively -- how will artificial intelligence be any easier to identify and apply? Dumb and dumber do not add up to a genius. You cannot aggregate IQ.

As things stand today, the mislabeled AI systems are being discussed as if they were well on their way to jumping from highly specialized non-human tasks to becoming full AGI systems that can mimic human thinking and logic. This has resulted in warnings from billionaires and philosophers that those future AI systems will likely kill us all -- as if a sentient AI would conclude that genocide is rational and logical. It certainly might appear to be a winning strategy if the AI system were trained on human behavior across recorded history, but that would never happen.

There is no rational logic for genocide. Future AI systems would be designed to love humanity and be programmed to protect and avoid human injury. They would likely operate very much in the vein of Richard Brautigan's 1967 poem All Watched Over By Machines Of Loving Grace -- the last stanza:

I like to think
(it has to be!)
of a cybernetic ecology
where we are free of our labors
and joined back to nature,
returned to our mammal
brothers and sisters,
and all watched over
by machines of loving grace.

Let us not fear AI systems and in 2020, let's be clear and call them machine learning systems -- because words matter.

Original post:
2020: The year of seeing clearly on AI and machine learning - ZDNet

Essential AI & Machine Learning Certification Training Bundle Is Available For A Limited Time 93% Discount Offer Avail Now – Wccftech

Machine learning and AI are the future of technology. If you wish to become part of the world of technology, this is the place to begin. The world is becoming more dependent on technology every day, and it wouldn't hurt to embrace it; resist it and you risk becoming obsolete. Wccftech is offering an amazing discount on the Essential AI & Machine Learning Certification Training Bundle. The offer will expire in less than a week, so grab it right away.

The bundle includes four extensive courses on NLP, computer vision, data visualization and machine learning. Each course will help you understand the technology world a bit more, and you will not regret investing your time and money in this. The courses have been created by experts, so you are in safe hands. Here are highlights of what the Essential AI & Machine Learning Certification Training Bundle has in store for you:

The bundle has been brought to you by GreyCampus. They are known for providing learning solutions to professionals in various fields, including project management, data science, big data, quality management and more. They offer different kinds of teaching platforms, including e-learning and live-online. All these courses have been specifically designed to meet the market's changing needs.

Original Price Essential AI & Machine Learning Certification Training Bundle: $656
Wccftech Discount Price Essential AI & Machine Learning Certification Training Bundle: $39.99


More:
Essential AI & Machine Learning Certification Training Bundle Is Available For A Limited Time 93% Discount Offer Avail Now - Wccftech

Leveraging AI and Machine Learning to Advance Interoperability in Healthcare – – HIT Consultant

(Left: Wilson To, Head of Worldwide Healthcare BD, Amazon Web Services (AWS); right: Patrick Combes, Worldwide Technical Leader, Healthcare and Life Sciences, AWS)

Navigating the healthcare system is often a complex journey involving multiple physicians from hospitals, clinics, and general practices. At each junction, healthcare providers collect data that serve as pieces in a patient's medical puzzle. When all of that data can be shared at each point, the puzzle is complete and practitioners can better diagnose, care for, and treat that patient. However, a lack of interoperability inhibits the sharing of data across providers, meaning pieces of the puzzle can go unseen and potentially impact patient health.

The Challenge of Achieving Interoperability

True interoperability requires two parts: syntactic and semantic. Syntactic interoperability requires a common structure so that data can be exchanged and interpreted between health information technology (IT) systems, while semantic interoperability requires a common language so that the meaning of data is transferred along with the data itself. This combination supports data fluidity. But for this to work, organizations must look to technologies like artificial intelligence (AI) and machine learning (ML) to apply across that data, shifting the industry from a fee-for-service model, in which government agencies reimburse healthcare providers based on the number of services they provide or procedures they order, to a value-based model that puts the focus back on the patient.

The industry has started to make significant strides toward reducing barriers to interoperability. For example, industry guidelines and resources like the Fast Healthcare Interoperability Resources (FHIR) have helped to set a standard, but there is still more work to be done. Among the biggest barriers in healthcare right now is the fact that there are significant variations in the way data is shared, read, and understood across healthcare systems, which can result in information being siloed and overlooked or misinterpreted.

For example, a doctor may know that a diagnosis of dropsy or edema may be indicative of congestive heart failure; a computer alone, however, may not be able to draw that parallel. Without syntactic and semantic interoperability, that diagnosis runs the risk of getting lost in translation when shared digitally with multiple health providers.
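To make the semantic side concrete, here is a minimal, hypothetical sketch in Python of what concept normalization looks like: free-text diagnoses are mapped to one canonical concept so every downstream system reads them the same way. The mapping table and labels are illustrative placeholders, not real FHIR or SNOMED codes.

    # Toy semantic normalization: map synonymous diagnosis strings to one
    # canonical concept. Real systems use curated terminologies, not a
    # hand-written dictionary like this.
    SYNONYMS = {
        "dropsy": "edema",
        "oedema": "edema",
        "edema": "edema",
    }

    def normalize(diagnosis: str) -> str:
        """Return the canonical concept for a free-text diagnosis."""
        return SYNONYMS.get(diagnosis.strip().lower(), "unmapped")

    print(normalize("Dropsy"))   # -> edema
    print(normalize("Oedema"))   # -> edema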

Employing AI, ML and Interoperability in Healthcare

Change Healthcare is one organization making strides to enable interoperability and help health organizations achieve these aims. Recently, Change Healthcare announced that it is providing free interoperability services that break down information silos to enhance patients' access to their medical records and support clinical decisions that influence patients' health and wellbeing.

While companies like Change Healthcare are creating services that better allow for interoperability, others like Fred Hutchinson Cancer Research Center and Beth Israel Deaconess Medical Center (BIDMC) are using AI and ML to further break down obstacles to quality care.

For example, Fred Hutch is using ML to help identify patients for clinical trials who may benefit from specific cancer therapies. By using ML to evaluate millions of clinical notes and extract and index medical conditions, medications, and choice of cancer therapeutic options, Fred Hutch reduced the time to process each document from hours to seconds, meaning they could connect more patients to more potentially life-saving clinical trials.

In addition, BIDMC is using AI and ML to ensure medical forms are completed when scheduling surgeries. By identifying incomplete forms or missing information, BIDMC can prevent delays in surgeries, ultimately enhancing the patient experience, improving hospital operations, and reducing costs.

An Opportunity to Transform The Industry

As technology creates more data across healthcare organizations, AI and ML will be essential to help take that data and create the shared structure and meaning necessary to achieve interoperability.

As an example, Cerner, a U.S. supplier of health information technology solutions, is deploying interoperability solutions that pull together anonymized patient data into longitudinal records that can be developed along with physician correlations. Coupled with other unstructured data, Cerner uses the data to power machine learning models and algorithms that help with earlier detection of congestive heart failure.

As healthcare organizations take the necessary steps toward syntactic and semantic interoperability, the industry will be able to use data to place a renewed focus on patient care. In practice, Philips' HealthSuite digital platform stores and analyzes 15 petabytes of patient data from 390 million imaging studies, medical records and patient inputs, adding as much as one petabyte of new data each month.

With machine learning applied to this data, the company can identify at-risk patients, deliver definitive diagnoses and develop evidence-based treatment plans to drive meaningful patient results. That orchestration and execution of data is the definition of valuable patient-focused care, and the future of what we see for interoperability driven by AI and ML in the United States. With access to the right information at the right time that informs the right care, health practitioners will have access to all pieces of a patient's medical puzzle, and that will bring meaningful improvement not only in care decisions, but in patients' lives.

About Wilson To, Global Healthcare Business Development lead at AWS & Patrick Combes, Global Healthcare IT Lead at AWS

Wilson To is the Head of Worldwide Healthcare Business Development at Amazon Web Services (AWS), where he currently leads business development efforts across the AWS worldwide healthcare practice. To has led teams across startup and corporate environments, receiving international recognition for his work in global health efforts. He joined Amazon Web Services in October 2016 to lead product management and strategic initiatives.

Patrick Combes is the Worldwide Technical Leader for Healthcare & Life Sciences at Amazon Web Services (AWS), where he is responsible for AWS's worldwide technical strategy in Healthcare and Life Sciences (HCLS). Patrick helps develop and implement the strategic plan to engage customers and partners in the industry and leads the community of technically focused HCLS specialists within AWS.

Read more:
Leveraging AI and Machine Learning to Advance Interoperability in Healthcare - - HIT Consultant

Seton Hall Announces New Courses in Text Mining and Machine Learning – Seton Hall University News & Events

Professor Manfred Minimair, Data Science, Seton Hall University

As part of its online M.S. in Data Science program, Seton Hall University in South Orange, New Jersey, has announced new courses in Text Mining and Machine Learning.

Seton Hall's master's program in Data Science is the first 100% online program of its kind in New Jersey and one of very few in the nation.

Quickly emerging as a critical field in a variety of industries, data science encompasses activities ranging from collecting raw data and processing and extracting knowledge from that data, to effectively communicating those findings to assist in decision making and implementing solutions. Data scientists have extensive knowledge in the overlapping realms of business needs, domain knowledge, analytics, and software and systems engineering.

"We're in the midst of a pivotal moment in history," said Professor Manfred Minimair, director of Seton Hall's Data Science program. "We've moved from being an agrarian society through to the industrial revolution and now squarely into the age of information," he noted. "The last decade has been witness to a veritable explosion in data informatics. Where once business could only look at dribs and drabs of customer and logistics dataas through a glass darklynow organizations can be easily blinded by the sheer volume of data available at any given moment. Data science gives students the tools necessary to collect and turn those oceans of data into clear and readily actionable information."

These tools will be provided by Seton Hall in new ways this spring, when Text Mining and Machine Learning make their debut.

Text Mining

Taught by Professor Nathan Kahl, the Text Mining course covers the process of extracting high-quality information from text, which is typically done by developing patterns and trends through means such as statistical pattern learning. Kahl is an Associate Professor in the Department of Mathematics and Computer Science. He has extensive experience in teaching data analytics at Seton Hall University. Some of his recent research lies in the area of network analysis, another important topic which is also taught in the M.S. program.

Professor Kahl notes, "The need for people with these skills in business, industry and government service has never been greater, and our curriculum is specifically designed to prepare our students for these careers." According to EAB (formerly known as the Education Advisory Board), the national growth in demand for data science practitioners over the last two years alone was 252%. According to Glassdoor, the median base salary for these jobs is $108,000.

Machine Learning

In many ways, machine learning represents the next wave in data science. It is the scientific study of algorithms and statistical models that computer systems use to perform a specific task without using explicit instructions, relying on patterns and inference instead. It is seen as a subset of artificial intelligence. The course will be taught by Sophine Clachar, a data engineer with more than 10 years of experience. Her past research has focused on aviation safety and large-scale and complex aviation data repositories at the University of North Dakota. She was also a recipient of the Airport Cooperative Research Program Graduate Research Award, which fostered the development of machine learning algorithms that identify anomalies in aircraft data.

"Machine learning is profoundly changing our society," Professor Clachar remarks. "Software enhanced with artificial intelligence capabilities will benefit humans in many ways, for example, by helping design more efficient treatments for complex diseases and improve flight training to make air travel more secure."

Active Relationships with Google, Facebook, Celgene, Comcast, Chase, B&N and Amazon

Students in the Data Science program, with its strong focus on computer science, statistics and applied mathematics, learn skills in cloud computing technology and Tableau, which allows them to pursue certification in Amazon Web Services and Tableau. The material is continuously updated to deliver the latest skills in artificial intelligence/machine learning for automating data science tasks. Their education is bolstered by real-world projects and internships, made possible through the program's active relationships with such leading companies as Google, Facebook, Celgene, Comcast, Chase, Barnes and Noble and Amazon. The program also fosters relationships with businesses and organizations through its advisory board, which includes members from WarnerMedia, Highstep Technologies, Snowflake Computing, Compass and Celgene. As a result, students are immersed in the knowledge and competencies required to become successful data science and analytics professionals.

"Among the members of our Advisory Board are Seton Hall graduates and leaders in the field," said Minimair. "Their expertise at the cutting edge of industry is reflected within our curriculum and coupled with the data science and academic expertise of our professors. That combination will allow our students to flourish in the world of data science and informatics."

Learn more about the M.S. in Data Science at Seton Hall

See the rest here:
Seton Hall Announces New Courses in Text Mining and Machine Learning - Seton Hall University News & Events

Christiana Care offers tips to ‘personalize the black box’ of machine learning – Healthcare IT News

For all the potential benefits of artificial intelligence and machine learning, one of the biggest, and increasingly most publicized, challenges with the technology is the potential for algorithmic bias.

But an even more basic challenge for hospitals and health systems looking to deploy AI and ML can be skepticism from frontline staff: a hesitance to use predictive models that, even if they aren't inherently biased, are certainly hard to understand.

At Delaware-based Christiana Care Health System, the past few years have seen efforts to "simplify the model without sacrificing precision," says Dr. Terri Steinberg, its chief health information officer and VP of population health informatics.

"The simpler the model, the more human beings will accept it," said Steinberg, who will talk more about this notion in a March 12 presentation at HIMSS20.

When it comes to pop health programs, the data sets used to drive the analytics matter, she explains. Whether it's EHR data, social determinants of health, claims data or even wearables information, it's key to select the most relevant data sources, use machine learning to segment the population and then, crucially, present those findings to care managers in a way that's understandable and fits their workflow.

At HIMSS20, Steinberg, alongside Health Catalyst Chief Data Scientist Jason Jones, will show how Christiana Care has been working to streamline its machine learning processes, to ensure they're more approachable and thus more likely to be embraced by its care teams.

Dr. Terri Steinberg, Christiana Care Health System

They'll explain how to assign relative value to pop health data and discuss some of the challenges associated with integrating them; they'll show how ML can segment populations and spotlight strategies for using new data sources that will boost the value and utility of predictive models.

"We've been doing this since 2012," said Steinberg. And now we have significant time under our belts, so we wanted to come back to HIMSS and talk about what we were doing in terms of programming for care management and, more important, how we're segmenting our population with machine learning."

"There are a couple of patterns that we've seen repeated across engagements that are a little bit counter to how people typically go about building these models today, which is to sort of throw everything at them and hope for the best," said Jones, of Health Catalyst, Christiana Care's vendor partner.

At Christiana Care, he said, the goal instead has been to "help people understand as much as they would like about how the models are working, so that they will trust and actually use them."

"We've found repeatedly that we can build technically fantastic models that people just don't trust and won't use," he added. "In that case, we might as well not bother in the first place. So we're going to go through and show how it is that we can build models in such a way that they're technically excellent but also well-trusted by the people who are going to use them."

In years past, "when we built the model and put it in front of our care managers and said, 'Here you go, now customize your treatment plans based on the risk score,' what we discovered is that they basically ignored the score and did what they wanted," Steinberg explained.

But by simplifying a given model to the smallest number of participants and data elements it can have, the team can develop something "small enough for people to understand the list of components, so that they think that they know why the model has made a specific prediction," she said.

That has more value than many population health professionals realize.

"The goal is to simplify the model as much as you can, so human beings understand the components," said Steinberg.

"People like understanding why a particular individual falls into a risk category," she said. "And then they sometimes would even like to know what the feature is that has resulted in the risk. The take home message is that the more human beings understand what the machine is doing, the more likely they are to trust the machine. We want to personalize the black box."

Steinberg and Jones will talk more about making machine learning meaningful at a HIMSS20 session titled "Machine Learning and Data Selection for Population Health." It's scheduled for Thursday, March 12, from 10-11 a.m. in room W414A.

Continue reading here:
Christiana Care offers tips to 'personalize the black box' of machine learning - Healthcare IT News

What is the role of machine learning in industry? – Engineer Live

In 1950, Alan Turing developed the Turing test to answer the question, "Can machines think?" Since then, machine learning has gone from being just a concept to a process relied on by some of the world's biggest companies. Here, Sophie Hand, UK country manager at industrial parts supplier EU Automation, discusses the applications of the different types of machine learning that exist today.

Machine learning is a subset of artificial intelligence (AI) in which computers independently learn to do something they were not explicitly programmed to do. They do this by learning from experience, leveraging algorithms and discovering patterns and insights from data. This means machines don't need to be programmed to perform exact tasks on a repetitive basis.

Machine learning is rapidly being adopted across several industries: according to Research and Markets, the market is predicted to grow to US$8.81 billion by 2022, at a compound annual growth rate of 44.1 per cent. One of the main reasons for its growing use is that businesses are collecting Big Data, from which they need to obtain valuable insights. Machine learning is an efficient way of making sense of this data -- for example, the data that sensors collect on the condition of machines on the factory floor.

As the market develops and grows, new types of machine learning will emerge and allow new applications to be explored. However, many examples of current machine learning applications fall into two categories: supervised learning and unsupervised learning.

A popular type of machine learning is supervised learning, which is typically used in applications where historical data is used to train models that predict future events, such as fraudulent credit card transactions. This is a form of machine learning that identifies inputs and outputs and trains algorithms using labelled examples. Supervised learning uses methods like classification, regression, prediction and gradient boosting for pattern recognition. It then uses these patterns to predict the values of the labels on unlabelled data.
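A minimal sketch of that workflow in Python, using scikit-learn and synthetic stand-in data; a real fraud model would train on labelled historical transactions rather than generated points.

    # Supervised learning in miniature: fit a gradient-boosting classifier on
    # labelled examples, then score unseen ones. Data here is synthetic.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    model = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)
    print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")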

This form of machine learning is currently being used in drug discovery and development with applications including target validation, identification of biomarkers and the analysis of digital pathology data in clinical trials. Using machine learning in this way promotes data-driven decision making and can speed up the drug discovery and development process while improving success rates.

Unlike supervised learning, unsupervised learning works with datasets that have no historical labels. Instead, it explores collected data to find a structure and identify patterns. Unsupervised machine learning is now being used in factories for predictive maintenance purposes. Machines can learn the data patterns associated with faults in the system and use this information to identify problems before they arise.
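One common unsupervised approach to predictive maintenance is anomaly detection. A hedged sketch with made-up sensor readings; a real deployment would stream data from machine sensors:

    # Unsupervised anomaly detection: learn what "normal" sensor readings look
    # like, then flag departures from it. Readings here are made up.
    import numpy as np
    from sklearn.ensemble import IsolationForest

    rng = np.random.default_rng(0)
    # Columns: temperature, vibration. Fit only on readings from healthy runs.
    normal_readings = rng.normal(loc=[70.0, 0.5], scale=[2.0, 0.05], size=(500, 2))
    detector = IsolationForest(random_state=0).fit(normal_readings)

    new_readings = np.array([[70.4, 0.52],    # looks normal
                             [85.0, 1.90]])   # likely a developing fault
    print(detector.predict(new_readings))     # 1 = normal, -1 = anomaly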

Using machine learning in this way leads to a decrease in unplanned downtime, as manufacturers are able to order replacement parts from an automation equipment supplier before a breakdown occurs, saving time and money. According to a survey by Deloitte, using machine learning technologies in the manufacturing sector reduces unplanned machine downtime by 15 to 30 per cent and cuts maintenance costs by 30 per cent.

It's no longer just humans that can think for themselves: machines, such as Google's Duplex, are now able to pass the Turing test. Manufacturers can make use of machine learning to improve maintenance processes and enable them to make real-time, intelligent decisions based on data.

The rest is here:
What is the role of machine learning in industry? - Engineer Live

Break into the field of AI and Machine Learning with the help of this training – Boing Boing

It seems like AI is everywhere these days, from the voice recognition software in our personal assistants to the ads that pop up seemingly at just the right time. But believe it or not, the field is still in its infancy.

That means there's no better time to get in on the ground floor. The Essential AI & Machine Learning Certification Training Bundle is a four-course package that can give you a broad overview of AI's many uses in the modern marketplace and how to implement them.

The best place to dive into this four-course master class is with the Artificial Intelligence (AI) & Machine Learning (ML) Foundation Course. This walkthrough gives you all the terms and concepts that underpin the entire science of AI.

Later courses let you get your hands dirty with some coding, as in the data visualization class that focuses on the role of Python in the interpretive side of data analytics. There are also separate courses on computer vision (the programming that lets machines "see" their surroundings) and natural language processing (the science of getting computers to understand speech).

The entire package is now available for Boing Boing readers at 93% off the MSRP.


Read the original:
Break into the field of AI and Machine Learning with the help of this training - Boing Boing

BlackBerry combines AI and machine learning to create connected fleet security solution – Fleet Owner

Fleet Owner returned to CES, the annual mega technology show in Las Vegas, in search of potential transportation technology that could help fleets of the future. Here are some news and notes from the more than a million square feet of exhibit space. You can read our coverage of other news out of CES here: (DOT on autonomous vehicles, Peterbilt, Kenworth and Dana, Bosch and ZF, BlackBerry, and more).

Plus.ai announced at CES 2020 it will expand testing of its self-driving trucks to cover all permissible continental states in the U.S. by the end of 2020. This will include closed-course testing and public road testing, with a safety driver and operations specialist onboard to assume manual control if needed.

"We want to build a technology solution that is applicable across different weather, terrains, and driving scenarios, said Shawn Kerrigan, COO and co-founder ofPlus.ai. Testing our trucks readiness means we need to put them through stringent safety tests, on every highway in the country.

Plus.ai has conducted autonomous truck testing in 17 states: Arizona, California, Colorado, Illinois, Indiana, Kansas, Minnesota, Missouri, Nevada, New Mexico, Ohio, Pennsylvania, South Dakota, Texas, Utah, West Virginia and Wyoming.

"Thesmart mobility ecosystem weve established in Ohiois a premier testing ground for autonomous vehicles," said Patrick Smith, interim executive director ofDriveOhio. Ohio is excited to welcome leading autonomous trucking companies like Plus.ai to test at our state-of-the-art facilities and infrastructure.

Plus.ai expects that the new testing sites and states will be selected by the spring and implementation will take place through the rest of the year.

Ryder's outdoor booth at CES 2020 featured a Nikola Two truck. (Photo: Josh Fisher/Fleet Owner)

Ryder System was among the trucking and logistics companies exhibiting at CES this year. And helping the company catch the eye of attendees was the Nikola Two tractor on display at its outdoor booth that focused on the future of transportation logistics and equipment.

Ryder is showing current and potential leasing customers what is available now and around the corner in electric and automated trucks and how they can help increase supply chain efficiency.

Bridgestone made its first appearance at CES, and highlighted its mobility solutions that look toward an autonomous future focused on extended mobility, improved safety and increased efficiency.

The company showed off its future airless tires, smart tire technology and its Webfleet Solutions platform. That platform uses data and analytics to move millions of vehicles as efficiently as possible.

"Bridgestone has a nearly 90-year history of using technology and research to develop advanced products, services and solutions for a world in motion," said TJ Higgins, global chief strategic officer of Bridgestone. As we look to the future, we are combining our core tire expertise with a wide range of digital solutions to deliver connected products and services that promote safe, sustainable mobility and continue contributing to society's advancement."

The company's CES showcase demonstrated how airless tires from Bridgestone combine a tire's tread and wheel into one durable, high-strength structure. This design eliminates the need for tires to be filled and maintained with air.

The company also showed how its digital twin and connected tire technology can be used to generate specific, actionable predictions that can enhance the precision of vehicle safety systems.

The Bosch Virtual Visor uses LCD and AI technology to keep a driver's eyes in the shade. (Photo: Bosch Global)

Bosch unveiled what it calls the most drastic improvement to the 100-year-old sun visor.

The Virtual Visor links an LCD panel with a driver- or occupant-monitoring camera to track the shadow the sun casts on the driver's face.

The system uses artificial intelligence to locate the driver within the image from the driver-facing camera. It also utilizes AI to determine the landmarks on the face, including where the eyes, nose and mouth are located, so it can identify shadows on the face.

The algorithm analyzes the driver's view, darkening only the section of the display through which light hits the driver's eyes. The rest of the display remains transparent, no longer obscuring a large section of the driver's field of vision.
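A heavily simplified geometric sketch of that selection step, modeling the visor as a flat grid of LCD cells; this illustrates the principle only, and all geometry, dimensions and values are invented, not Bosch's implementation:

    # Darken only the visor cells that sit on the line between the sun and the
    # driver's eyes. All geometry here is invented for illustration.
    import numpy as np

    def cells_to_darken(eye, sun_dir, n_cells=20, cell_size=0.02,
                        visor_y=0.5, radius=0.04):
        # Cast a ray from the eye toward the sun and intersect the visor plane.
        t = (visor_y - eye[1]) / sun_dir[1]
        hit = eye + t * sun_dir            # where sunlight crosses the visor
        dark = []
        for i in range(n_cells):
            for j in range(n_cells):
                center = np.array([(i + 0.5) * cell_size, visor_y,
                                   (j + 0.5) * cell_size])
                if np.linalg.norm(center[[0, 2]] - hit[[0, 2]]) < radius:
                    dark.append((i, j))    # this cell shades the eyes
        return dark

    eye = np.array([0.2, 0.0, 0.2])        # eye position from the camera
    sun = np.array([0.1, 1.0, 0.05])       # direction toward the sun
    print(cells_to_darken(eye, sun))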

"We discovered early in the development that users adjust their traditional sun visors to always cast a shadow on their own eyes," said Jason Zink, technical expert for Bosch in North America and one of the co-creators of the Virtual Visor. "This realization was profound in helping simplify the product concept and fuel the design of the technology."

This use of liquid crystal technology to block a specific light source decreases dangerous sun glare, driver discomfort and accident risk; it also increases driver visibility, comfort and safety.

The World Economic Forum and Deepen AI unveiled Safety Pool, a global incentive-based brokerage of shared driving scenarios and safety data for safe autonomous driving systems.

Aptiv was one of the first publicly announced members of the initiative.

"At Aptiv, we believe that our industry makes progress by sharing, especially when it comes to safety. We are proud to be part of the World Economic Forum's Safety Pool, and we are confident that with continued collaboration, we will deliver the safer and more accessible mobility solutions our communities deserve," said Karl Iagnemma, Aptivs president of autonomous mobility.

Safety Pool is gathering a vast and diverse set of driving scenarios and safety data from the major industry players developing ADAS systems and autonomous driving technology, it was announced at CES 2020. Data will be accessible by the members while an incentive scheme ensures the right value is taken and given by every Safety Pool participant, regardless of their size, level of funding, or years of operations.

According to Deepen, WEF and the first publicly announced Safety Pool pioneering members, sharing this data on such a large scale will generate tremendous positive externalities for the whole industry.

Each company developing ADAS systems and autonomous driving technology will have the chance to tap into a massive, common and shared database of driving scenarios on which to train and validate their machine learning models. In this way, the overall safety of operations will drastically increase, accelerating time to deployment.

See the original post:
BlackBerry combines AI and machine learning to create connected fleet security solution - Fleet Owner

Raleys Drive To Be Different Gets an Assist From Machine Learning – Winsight Grocery Business

Raley's has brought artificial intelligence to pricing, not necessarily to go toe-to-toe with competitors but to differentiate itself from them, President and CEO Keith Knopf said.

Speaking in a presentation at the National Retail Federation show in New York, Knopf described how the West Sacramento, Calif.-based food retailer is using machine learning algorithms from partner Eversight to help manage its price perception amid larger, and often cheaper, competitors, while optimizing revenue by driving unit share growth and margin dollars. That benefit is going toward what he described as a differentiated positioning behind health and wellness.

"This is not just about pricing for the sake of pricing. This is pricing within a business strategy to differentiate, and afford the investment in price in a way that is both financially sustainable and also relevant to the customer," Knopf said.

Raley's has been working with Eversight for about four years, and has since invested in the Palo Alto, Calif.-based provider of AI-led pricing and promotion management. Knopf described using insights and recommendations derived from Eversight's data crunching to support its merchants, helping to strategically manage the Rubik's Cube of pricing and promoting 40,000 items, each with varying elasticity, in stores with differing customer bases, price zones and competitive characteristics.

Raley's, Knopf said, is high-priced relative to its competitors, a reflection of its size and its ambitions. "We're a $3 billion to $4 billion retailer competing against companies much larger than us, with much greater purchasing power, and so for us, [AI pricing] is about optimization within our brand framework. We aspire to be a differentiated operator with a differentiated customer experience and a differentiated product assortment, which is guided more toward health and wellness. We have a strong position in fresh that is evolving through innovation. But we also understand that we are a high-priced, high-cost retailer."

David Moran, Eversight's co-founder, was careful to put his company's influence in perspective. Algorithms don't replace merchants or set a strategy, he said, but can support them by bringing new computing power that exceeds what a merchant could do alone, and that allows for experimentation with pricing strategies across categories. In an example he shared, a mix of price changes, some going up, others down, helped to drive overall unit growth and profits in the olive oil category.

"The merchants still own the art: They are still the connection between the brand positioning, the price value perception, and they also own the execution," Knopf said. "This technology gets us down that road much faster and with greater confidence."

Knopf said he believes that pricing science, in combination with customer relationship management, will eventually trigger big changes in the nature of promotional spending by vendors, with a shift toward so-called below-the-line programs, such as everyday pricing and personalized pricing, and fewer above-the-line mass promotions, which he believes are ultimately ineffective at driving long-term growth.

"Every time we promote above the line, and everybody sees what everybody else does, no more units are sold in totality in the marketplace; it's just a matter of who's going to sell this week at what price," Knopf said. "I believe that it's in the manufacturer's best interest, and the retailer's best interest, to make pricing personalized and relevant, and the dollars that are available today will shift from promotions into a more personalized, one-on-one, curated relationship that a vendor, the retailer and the customer will share."

Excerpt from:
Raleys Drive To Be Different Gets an Assist From Machine Learning - Winsight Grocery Business

Doctor’s Hospital focused on incorporation of AI and machine learning – EyeWitness News

NASSAU, BAHAMAS -- Doctors Hospital has deprioritized its medical tourism program and is now more keenly focused on incorporating artificial intelligence and machine learning in healthcare services.

Dr Charles Diggiss, Doctors Hospital Health System president, revealed the shift during a press conference to promote the 2020 Bahamas Business Outlook conference at Baha Mar next Thursday.

"When you look at what's happening around us globally with the advances in technology, it's no surprise that the way companies leverage data becomes a game changer if they are able to leverage the data using artificial intelligence or machine learning," Diggiss said.

"In healthcare, what makes it tremendously exciting for us is we are able to sensorize all of the devices in the healthcare space, get much more information, and use that information to tell us a lot more about what we should be doing and considering in your diagnosis."

He continued: "How can we get information in real time that would influence the way we manage your conditions? How can we have on the back end the assimilation of this information so that the best outcome occurs in our patient care environment?"

Diggiss noted that while the BISX-listed healthcare provider is still involved in medical tourism, it is no longer a primary focus.

"We still have a business line of medical tourism, but one of the things we do know how to do pretty quickly in Doctors Hospital is to deprioritize if it's apparent that that is not a successful way to go," he said.

"We have looked more at taking our specialities up a notch and investing in the technology support of the specialities with the leadership of some significant Bahamian specialists abroad, inviting them to come back home."

He added: "We have deprioritized medical tourism even though we still have a fairly robust programme going on at our Blake Road facility featuring two lines, a stem cell line and a fecal microbiotic line."

"They are both doing quite well, but we are not putting a lot of effort into that right now compared to the aforementioned."

Go here to read the rest:
Doctor's Hospital focused on incorporation of AI and machine learning - EyeWitness News

Being human in the age of Artificial Intelligence – Deccan Herald

After a while, everything is overhyped and underwhelming. Even Artificial Intelligence has not been able to escape the inevitable reduction that follows such excessive hype. AI is everything and everywhere now, and most of us won't even blink if we are told AI is powering someone's toothbrush. (It probably is.)

The phrase is undoubtedly being misused, but is the technology too? One thing is certain: whether we like it or not, whether we understand it or not, for good or bad, AI is playing a huge part in our everyday life today -- a bigger part than we imagine. AI is being employed in health, wellness and warfare; it is scrutinizing you, helping you take better photos, making music, books and even love. (No, really. The first fully robotic sex doll is being created even as you are reading this.)

However, there is a sore lack of understanding of what AI really is, how it is shaping our future and why it is likely to alter our very psyche sooner or later. There is misinformation galore, of course. Either media coverage of AI is exaggerated (as if androids will take over the world tomorrow) or too specific and technical, creating further confusion and fuelling sci-fi-inspired imaginations of computers smarter than human beings.

So what is AI? No, we are not talking dictionary definitions here -- those you can Google yourself. Neither are we promising to explain everything; that would need a book. We are only hoping to give you a glimpse into the "extraordinary promise and peril of this single transformative technology," as Prof Stuart Russell, one of the world's pre-eminent AI experts, puts it.

Prof Russell has spent decades on AI research and is the author of Artificial Intelligence: A Modern Approach, which is used as a textbook on AI in over 1,400 universities around the world.

Machine learning first

Other experts believe our understanding of artificial intelligence should begin with comprehending machine learning, the so-called sub-field of AI but one that actually encompasses pretty much everything that is happening in AI at present.

In its very simplest definition, machine learning is enabling machines to learn on their own. The advantages of this are easy to see. After a while, you need not tell it what to do -- it is your workhorse. All you need is to provide it data, and it will keep coming up with smarter ways of digesting that data, spotting patterns, creating opportunities -- in short, doing your work better than you perhaps ever could. This is the point where you need to scratch the surface. Scratch, and you will stare into a dissolving ethical conundrum about what machines might end up learning. Because, remember, they do not (cannot) explain their thinking process. Not yet, at least. Precisely why the professor has a cautionary take.

The concept of intelligence is central to who we are. After more than 2,000 years of self-examination, we have arrived at a characterization of intelligence that can be boiled down to this: Humans are intelligent to the extent that our actions can be expected to achieve our objectives. Intelligence in machines has been defined in the same way: Machines are intelligent to the extent that their actions can be expected to achieve their objectives.

Whose objectives?

The problem, writes the professor, is in this very definition of machine intelligence. We say that machines are intelligent to the extent that their actions can be expected to achieve their objectives, but we have no reliable way to make sure that their objectives are the same as our objectives. He believes what we should have done all along is to tweak this definition to: Machines are beneficial to the extent that their actions can be expected to achieve our objectives.

The difficulty here is, of course, that our objectives are in us, all eight billion of us, and not in the machines. Machines will be uncertain about our objectives; after all, we are uncertain about them ourselves. But this is a good thing; this is a feature, not a bug. Uncertainty about objectives implies that machines will necessarily defer to humans: they will ask permission, they will accept correction and they will allow themselves to be switched off.

Spilling out of the lab

This might mean a complete rethinking and rebuilding of the AI superstructure -- perhaps something that is indeed inevitable if we do not want this big event in human history to be the last, as the professor wryly puts it. As Kai-Fu Lee, another AI researcher, said in an interview a while ago, we are at a moment where the technology is spilling out of the lab and into the world. Time to strap in, then!

(With inputs from Human Compatible: AI and the Problem of Control by Stuart Russell, published by Penguin, UK. Extracted with permission.)

Visit link:
Being human in the age of Artificial Intelligence - Deccan Herald

The Quantum Computing Era Is Here. Why It MattersAnd How It May Change Our World. – Forbes

IBM Q System One

Hyper-accurate long-term weather forecasting. Life-saving drugs discovered through deep study of the behavior of complex molecules. New synthetic carbon-capturing materials to help reverse climate change caused by fossil fuels. Stable, long-lasting batteries to power electric vehicles and store green energy for the utility grid.

It may read like an ambitious wish list. But many scientists predict that the emerging era of quantum computing could lead to breakthroughs like these, while also tackling other major problems that are beyond the reach of our current computing regime.

Quantum computing is not a new idea. But it's only been in recent years that workable technology has begun to catch up to the theory.

IBM in 2016 made a quantum computer available to the public by connecting it to the cloud -- a true turning point in the development of this technology, enabling outside researchers and developers to explore its possibilities. And the industry took a major stride in September 2019 with the opening of IBM's Quantum Computation Center. That fleet of 15 systems includes the most advanced quantum computer yet available for external use.

Scientists are tantalized by the possibilities. One analyst predicted quantum will be as world altering in the 2020s as the smartphone was in the decade just ended.

The Quantum Computation Center offers about 100 IBM clients, academic institutions and more than 200,000 registered users access to this cutting-edge technology through a collaborative effort called the IBM Q Network and the rapidly growing community around Qiskit, IBM's open-source development platform for quantum computing. Through these efforts, IBM and others are exploring the ways quantum computing can address their most complicated problems, while training a workforce to use this technology.

Facilitating education and developing the next-generation workforce is a big focus for IBM. That includes spurring access to Qiskit and educational tools like the Coding With Qiskit video series, which has generated more than 1.5 million impressions and over 10,000 hours of content consumed by users. The company has also released an open-source textbook written by experts in the field, including several from IBM Research, as well as professors who have used some of the material in their own university courses.

Q Network partners include ExxonMobil, Daimler, JPMorgan Chase, Anthem, Delta Airlines, Los Alamos National Laboratory, Oak Ridge National Laboratory, Georgia Tech University, Keio University, Stanford University's Q-Farm program and Mitsubishi Chemical, among dozens of others.

Last year IBM announced partnerships with the University of Tokyo and the German research company Fraunhofer-Gesellschaft, which will greatly expand the company's already broad network of quantum researchers globally. The history of computing tells us that creative people around the world will find uses for these systems that no one could have predicted.

Katie Pizzolato

At this stage, it's difficult to predict what kind of impact quantum will have on employment or the economy. But the research firm Gartner projects that, by 2023, 20 percent of organizations will be budgeting for QC projects, up from less than 1 percent in 2018.*

"How do we get to the quantum future?" asks Katie Pizzolato, Director of Applications Research within the IBM Q Network. "By building the most advanced quantum systems and a developmental platform, and making it available to the world."

How does quantum differ from classical digital computing? Conventional computers use transistors that can only store information in two electrical states, On or Off, which binary computer code represents as 1 or 0. These are the binary digits, or bits, of classical computing.

Quantum computing is an altogether different beast. It derives its origins from the field of quantum physics, which emerged in the early 20th century when scientists began studying the behavior of subatomic particles.

What they discovered shocked many of them. Simply put, subatomic particles can exist in two places, or two states, at the same time, defying previously accepted laws of the physical world. The term for this is superposition. Researchers also discovered that particles separated by great distances can display instantaneously correlated behavior, as if they were sharing information faster than the speed of light. This is called entanglement.

If this sounds strange and implausible, that's because it is. Niels Bohr, one of the scientists who pioneered the field of quantum mechanics, quipped that anyone who is not shocked by quantum theory doesn't understand it.

This subatomic reality has profound implications for computing. The binary bits used by conventional computers, those 0s and 1s, limit the kind of task classical computers can perform, and the speed at which they can do those tasks.

Qubits are the basis for quantum computers. They transcend this 1-or-0 binary limitation. Unlike bits, qubits can exist in multiple states simultaneously. This gives them the potential to process exponentially larger amounts of information.
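These ideas can be tried directly in Qiskit, the open-source platform mentioned above. A minimal sketch, assuming Qiskit is installed: a two-qubit circuit that puts one qubit into superposition and entangles it with the other, the so-called Bell state.

    # Build a Bell state: a Hadamard gate puts qubit 0 into superposition,
    # then a controlled-NOT entangles it with qubit 1.
    from qiskit import QuantumCircuit

    qc = QuantumCircuit(2)
    qc.h(0)           # superposition: qubit 0 is now "both" 0 and 1
    qc.cx(0, 1)       # entanglement: qubit 1's state is tied to qubit 0's
    qc.measure_all()  # measuring collapses both to 00 or 11, never 01 or 10
    print(qc.draw())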

A quantum machine with just a couple of qubits can process only about as much information as a classical 512-bit computer. But because of the exponential nature of the platform, the dynamic changes very quickly. Assuming perfect stability, 300 qubits could represent more data values than there are atoms in the observable universe. This opens the opportunity to solve highly complex problems that are well beyond the reach of any classical computer.
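That last claim is easy to check with rough arithmetic: n qubits span 2^n basis states, and 2^300 is about 2 x 10^90, comfortably larger than the commonly cited estimate of roughly 10^80 atoms in the observable universe.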

Dario Gil

"A beauty of quantum computers is that they will offer a more subtle way of thinking about problems that goes beyond binary, that goes beyond simple 0 or 1, Yes or No, True or False," says Dario Gil, the Director of IBM Research. "That doesn't mean there won't be specific answers in the end. But quantum computing will make it possible to confront many of the world's most complex problems that are beyond the ability of classical binary computing to quickly solve."

What can quantum computing do for us?

Quantum computers will be orders of magnitude more powerful than anything we have today. But what problems will they solve? What are scientists doing with them now?

It's generally agreed that the most important quantum applications are years away. But researchers say some promising applications stand out:

Climate change

Quantum computing could lead to a novel yet ambitious plan to reverse the negative impacts of climate change, by helping find efficient ways to remove carbon from the atmosphere.

To do that, scientists require a better understanding of the carbon atom and how it interacts with other elements. Researchers need to be able to observe and model the way each carbon atom's six orbiting electrons might interact with the electrons of an almost infinite variety of other molecules, until researchers find the combination that can best bind the carbon.

Batteries to store more electricity for clean-energy uses

One fundamental building block of our clean-energy future will be batteries. Today's batteries lose power too quickly. They also can't hold enough charge to meet increasing demands. And at times, they're unstable. Today's most-used battery type, lithium-ion, is dependent on cobalt, a metal whose global supplies are dwindling.

We'll need better batteries for applications like powering electric vehicles. Utility companies will need them to store solar and wind energy, for example, for use when the sun isn't shining or the wind isn't blowing.

"We need to find a fundamentally different chemistry to create the batteries of the future," Pizzolato says. "Quantum computing could let us effectively peer inside the batteries' chemical reactions, to better understand the materials and reactions that will give the world those better batteries."

New insights into chemistry

Learning more about chemical reactions on the atomic level could also lead to breakthroughs in pharmaceuticals, or materials like energy-efficient fertilizer (currently a massively energy-intensive endeavor, and a major contributor to carbon emissions).

The catalysts that spark these sorts of discoveries are the essence of nearly all progress in chemistry. But because of the infinitely complex ways in which atoms interact with each other, almost all chemistry breakthroughs have come about through accident, intuition or exhausting numbers of experiments. Quantum computing could make this work faster and more methodical, leading to new discoveries in medicine, energy, materials and other fields.

Portfolio management

It's no surprise that financial institutions are exploring the use of quantum to balance portfolios and price options, the instruments used for hedging risk. Because of the complexity of processing a large number of continually changing variables, it often takes a full day to come to a correct price.

Quantum promises to make such calculations in a matter of minutes, meaning these derivatives could be bought and sold in near real time. Some banks, like JPMorgan Chase, are already testing quantum computing for this very purpose.
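For a sense of why classical repricing is slow, here is a toy Monte Carlo pricer for a single European call option, with illustrative parameters; the estimate's error shrinks only as one over the square root of the number of simulated paths, and a real book repeats this across thousands of instruments and scenarios.

    # Toy classical Monte Carlo pricing of one European call option under
    # geometric Brownian motion. All parameters are illustrative.
    import numpy as np

    def mc_call_price(s0=100.0, k=105.0, r=0.01, sigma=0.2, t=1.0, paths=1_000_000):
        rng = np.random.default_rng(0)
        z = rng.standard_normal(paths)
        s_t = s0 * np.exp((r - 0.5 * sigma**2) * t + sigma * np.sqrt(t) * z)
        payoff = np.maximum(s_t - k, 0.0)        # call pays max(S_T - K, 0)
        return np.exp(-r * t) * payoff.mean()    # discount the average payoff

    print(f"estimated price: {mc_call_price():.4f}")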

For consumers, whether saving for a home, nurturing a college-savings plan, or building assets for a secure retirement, the peace-of-mind benefits of lower-risk and higher-profit financial products could be significant.

Encryption

Cryptography is a field that has attracted considerable attention in the quantum conversation. So far, much of the discussion has involved the perceived perils of a new class of code breakers. But the counterargument, that quantum techniques could also enable new and more secure data-privacy systems, could prove just as compelling.

Either way, true breakthroughs are probably not coming soon.

The most sophisticated data security software now uses complex algorithms to generate keys that would take classical computers an impractically long time to break. Quantum threatens to completely overturn this paradigm, making current encryption effectively useless. A quantum algorithm created a quarter-century ago, called Shor's algorithm, could theoretically crack even the most powerful of today's forms of encryption. But Shor's algorithm would require fault-tolerant quantum computers that don't yet exist and might still be many years away.
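To see why factoring is the pressure point, here is a minimal classical sketch of the number-theoretic core of Shor's algorithm. The entire quantum speedup lies in finding the order r quickly; the brute-force order() below stands in for that quantum step, and the values are toy-sized for illustration.

    from math import gcd

    def order(a, n):
        # Smallest r > 0 with a**r % n == 1. This is the step a quantum
        # computer performs exponentially faster; here it is brute force.
        r, x = 1, a % n
        while x != 1:
            x = (x * a) % n
            r += 1
        return r

    def shor_factor(n, a):
        # Shor's classical post-processing: an even order r with
        # a**(r//2) != n-1 (mod n) yields nontrivial factors via gcd.
        g = gcd(a, n)
        if g != 1:
            return g, n // g          # lucky guess: a shares a factor with n
        r = order(a, n)
        if r % 2 == 1:
            return None               # unlucky choice of a; retry
        y = pow(a, r // 2, n)
        if y == n - 1:
            return None               # another unlucky case; retry
        return gcd(y - 1, n), gcd(y + 1, n)

    print(shor_factor(15, 7))  # -> (3, 5)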

Still, the possibility that current cybersecurity standards could be made obsolete has drawn the attention of governments. The National Institute of Standards and Technology, for example, has a competition to develop new encryption tools resistant to the potential danger.

What Must Happen to Fulfill Quantum's Promise?

Despite the flurry of activity and rapidly growing interest in quantum computing, major breakthroughs with real-world applications are probably years away.

One reason is the fickleness of subatomic matter. Qubits are extremely delicate, and even a small disturbance knocks particles out of their quantum state. That's why quantum computers are kept at temperatures slightly above absolute zero, colder than outer space, since matter becomes effectively more stable the colder it gets. Even at that temperature, qubit particles typically remain in superposition for only fractions of a second.

Figuring out how to keep qubits in a prolonged state of superposition is a major challenge that scientists still need to overcome.

A next major benchmark, Pizzolato says, will be the successful implementation of logical qubits that can maintain a quantum state longer than is now technologically possible. Logical qubits are necessary for fault tolerance, the true test of quantum computing's utility. Like others at IBM, Pizzolato is reluctant to predict a timeline but says the logical qubit is likely to arrive sometime in the next decade.

Another open question is economic: How will the arrival of the Quantum Age affect the number, categories and quality of jobs in the decades to come? It's difficult to say right now how big an industry quantum computing will eventually be. But currently, a major skills gap has left nearly every quantum organization struggling to find qualified recruits.

The National Quantum Initiative, signed into law in early 2019, is meant to provide federal funds to bridge this skills gap. But practical training of the sort made possible by the IBM Q Network will be crucial to a long-term solution.

While the quantum era may develop slowly, it's worth remembering that the Internet, or an early version of it, was around for decades before it was established as the truly revolutionary force it would become. Like the Internet, the work researchers are doing now on quantum computing may lead to a world we can't yet imagine.

"Only by doing the hard work on quantum computing that we and our partners around the world are doing now," says Pizzolato, "can we hope to solve the big global problems that we'll be facing together in the years ahead."

*Gartner, Top 10 Strategic Technology Trends for 2019: Quantum Computing, March 2019

Go here to read the rest:
The Quantum Computing Era Is Here. Why It MattersAnd How It May Change Our World. - Forbes

AlphaZero beat humans at Chess and StarCraft, now it’s working with quantum computers – The Next Web

A team of researchers from Aarhus University in Denmark let DeepMind's AlphaZero algorithm loose on a few quantum computing optimization problems and, much to everyone's surprise, the AI was able to solve the problems without any outside expert knowledge. Not bad for a machine learning paradigm designed to win at games like Chess and StarCraft.

You've probably heard of DeepMind and its AI systems. The UK-based Google sister company is responsible for both AlphaZero and AlphaGo, the systems that beat the world's most skilled humans at the games of Chess and Go. In essence, what both systems do is try to figure out the optimal next set of moves. Where humans can only think so many moves ahead, the AI can look a bit further using optimized search and planning methods.


When the Aarhus team applied AlphaZero's optimization abilities to a trio of problems associated with optimizing quantum functions (an open problem for the quantum computing world), they learned that its ability to learn new parameters unsupervised transferred over from games to applications quite well.

Per the study:

AlphaZero employs a deep neural network in conjunction with deep lookahead in a guided tree search, which allows for predictive hidden-variable approximation of the quantum parameter landscape. To emphasize transferability, we apply and benchmark the algorithm on three classes of control problems using only a single common set of algorithmic hyperparameters.
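For readers curious what "a deep neural network in conjunction with deep lookahead in a guided tree search" means mechanically, below is a toy sketch of the PUCT rule that AlphaZero-style searches use to decide which branch to explore next. The stub priors and random rollout values stand in for the real policy/value network; every name and number here is illustrative.

    import math
    import random

    def puct_choice(children, c_puct=1.5):
        # children: list of dicts with prior p (from the policy network),
        # visit count n, and total value w accumulated by the search.
        total_n = sum(ch["n"] for ch in children)
        def score(ch):
            q = ch["w"] / ch["n"] if ch["n"] else 0.0                  # exploitation
            u = c_puct * ch["p"] * math.sqrt(total_n) / (1 + ch["n"])  # exploration
            return q + u
        return max(children, key=score)

    # Three candidate control actions with stub network priors.
    children = [{"p": p, "n": 0, "w": 0.0} for p in (0.5, 0.3, 0.2)]
    for _ in range(100):                 # simulate 100 guided search visits
        ch = puct_choice(children)
        ch["n"] += 1
        ch["w"] += random.random()       # stand-in for a rollout/value estimate
    print([ch["n"] for ch in children])  # visits concentrate where value looks high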

The implications of AlphaZero's mastery over the quantum universe could be huge. Controlling a quantum computer requires an AI solution because operations at the quantum level quickly become incalculable by humans. The AI can find optimal paths between data clusters in order to surface better solutions in tandem with computer processors. It works a lot like human heuristics, just scaled to the nth degree.

An example of this would be an algorithm that helps a quantum computer sort through near-infinite combinations of molecules to come up with chemical compounds that would be useful in the treatment of certain illnesses. The current paradigm would involve developing an algorithm that relies on human expertise and databases with previous findings to point it in the right direction.

But the kinds of problems we're looking to quantum computers to solve don't always have a good starting point. Some of these, optimization problems like the Traveling Salesman Problem, need an algorithm that's capable of figuring things out without the need for constant adjustment by developers.

DeepMind's algorithm and AI system may be the solution quantum computing's been waiting for. The researchers effectively employ AlphaZero as a tabula rasa for quantum optimization: it doesn't necessarily need human expertise to find the optimum solution to a problem at the quantum computing level.

Before we start getting too concerned about unsupervised AI accessing quantum computers, it's worth mentioning that so far AlphaZero's just solved a few problems in order to prove a concept. We know the algorithms can handle quantum optimization; now it's time to figure out what we can do with them.

The researchers have already received interest from big tech and other academic institutions with queries related to collaborating on future research. Not for nothing, but DeepMind's sister company Google has a little quantum computing program of its own. We're betting this isn't the last we've heard of AlphaZero's adventures in the quantum computing world.


Read the original here:
AlphaZero beat humans at Chess and StarCraft, now it's working with quantum computers - The Next Web

The dark side of IoT, AI and quantum computing: Hacking, data breaches and existential threat – ZDNet

Emerging technologies like the Internet of Things, artificial intelligence and quantum computing have the potential to transform human lives, but could also bring unintended consequences in the form of making society more vulnerable to cyberattacks, the World Economic Forum (WEF) has warned.

Now in its 15th year, the WEF Global Risks Report 2020, produced in collaboration with insurance broking and risk management firm Marsh, details the biggest threats facing the world over the course of the next year and beyond.

Data breaches and cyberattacks featured in the top five most likely global risks in both 2018 and 2019, but while both still pose significant risks, they're now ranked sixth and seventh respectively.

"I wouldn't underestimate the importance of technology risk, even though this year's report has a centre piece on climate," said John Drzik, chairman of Marsh & McLennan Insights.


The 2020 edition of the Global Risks Report puts the technological risks behind five different environmental challenges: extreme weather, climate change action failure, natural disasters, biodiversity loss, and human-made environmental disasters.

But that isn't to say cybersecurity threats don't pose risks; cyberattacks and data breaches are still in the top ten and have the potential to cause big problems for individuals, businesses and society as a whole, with threats ranging from data breaches and ransomware to hackers tampering with industrial and cyber-physical systems.

"The digital nature of 4IR [fourth industrial revolution] technologies makes them intrinsically vulnerable to cyberattacks that can take a multitude of formsfrom data theft and ransomware to the overtaking of systems with potentially large-scale harmful consequences," warns the report.

"Operational technologies are at increased risk because cyberattacks could cause more traditional, kinetic impacts as technology is being extended into the physical world, creating a cyber-physical system."

The report warns that, for many technology vendors, "security-by-design" is still a secondary concern compared with getting products out to the market.

Large numbers of Internet of Things product manufacturers have long had a reputation for putting selling the products ahead of ensuring they're secure, and the WEF warns that the IoT is "amplifying the potential cyberattack surface," as demonstrated by the rise in IoT-based attacks.

In many cases, IoT devices collect and share private data that's highly sensitive, like medical records, information about the insides of homes and workplaces, or data on day-to-day journeys.

This data could be dangerous if it isn't collected and stored appropriately and falls into the hands of cyber criminals, and the WEF also warns about the potential for IoT data to be abused by data brokers. In both cases, the report warns, the misuse of this data could create physical and psychological harm.

Artificial intelligence is also detailed as a technology that could bring benefits as well as problems, with the report describing AI as "the most impactful invention" and our "biggest existential threat". The WEF even goes so far as to claim we're still not able to comprehend AI's full potential or full risk.

The report notes that risks around issues such as generating disinformation and deepfakes are well known, but suggests that more investigation is needed into the risks AI poses in areas including brain-computer interfaces.

A warning is also issued about the unintended consequences of quantum computing, should it arrive at some point over the course of the next decade, as some suggest. While, like other innovations, it will bring benefits to society, it also creates a problem for encryption in its current state.


By dramatically reducing the time required to solve the mathematical problems that today's encryption relies on, potentially to just seconds, quantum computing would render cybersecurity as we know it obsolete. That could have grave consequences for re-securing almost every aspect of 21st-century life, the report warns, especially if cyber criminals or other malicious hackers gain access to quantum technology that they could use to attack personal data, critical infrastructure and power grids.

"These technologies are really reshaping industry, technology and society at large, but we don't have the protocols around these to make sure of a positive impact on society," said Mirek Dusek, deputy head of the centre for geopolitical and regional affairs at member of the executive committee at the World Economic Forum.

However, it isn't all doom and gloom; despite the challenges posed by cyberattacks, the World Economic Forum notes that efforts to address the security challenges of new technologies are "maturing", even if they're still sometimes fragmented.

"Numerous initiatives bring together businesses and governments to build trust, promote security in cyberspace, assess the impact of cyberattacks and assist victims," the report says.

See the article here:
The dark side of IoT, AI and quantum computing: Hacking, data breaches and existential threat - ZDNet

Xanadu Receives $4.4M Investment to Advance its Photonic Quantum Computing Technology – HPCwire

TORONTO, Jan. 16, 2020 – Xanadu, a Canadian quantum hardware and technology company, has received a $4.4 million investment from Sustainable Development Technology Canada (SDTC). The investment will expedite the development of Xanadu's photonic quantum computers and make them available over the cloud. This project will also further the company's overall progress towards the construction of energy-efficient universal quantum computers.

"Canadian cleantech entrepreneurs are tackling problems across Canada and in every sector. I have never been more positive about the future. The quantum hardware technology that Xanadu is building will develop quantum computers with the ability to solve extremely challenging computational problems, completing chemical calculations in minutes which would otherwise require a million CPUs in a data center," said Leah Lawrence, President and CEO, Sustainable Development Technology Canada.

Despite efforts to improve the power efficiency of traditional computing methods, the rapid growth of data centres and cloud computing presents a major source of new electricity consumption. In comparison to classical computing, quantum computing systems have the benefit of performing certain tasks and algorithms at an unprecedented rate. This will ultimately reduce the requirements for electrical power and the accompanying air and water emissions associated with electricity production.

Xanadu is developing a unique type of quantum computer, based on photonic technology, which is inherently more power-efficient than electronics. Xanadu's photonic approach uses laser light to carry information through optical chips, rather than the electrons or ions used by its competitors. By using photonic technology, Xanadu's quantum computers will one day have the ability to perform calculations at room temperature, and eliminate the bulky and power-hungry cooling systems required by most other types of quantum computers.

The project will be undertaken by Xanadu's team of in-house scientists, in collaboration with the University of Toronto and Swiftride. The project will be carried out over three years and will encompass the development of Xanadu's architecture, hardware, software and client interfaces, with the overall goal of expediting the development of the company's technology and demonstrating the practical benefits of quantum computing for users and customers by the end of 2022.

"We are thrilled by the recognition and support that we are receiving from SDTC for the development of our technology. We firmly believe that our unique, photonic-based approach to quantum computing will deliver both valuable insights and tangible environmental benefits for our customers and partners," said Christian Weedbrook, CEO of Xanadu.

About Xanadu

Xanadu is a photonic quantum hardware company. We build integrated photonic chips that can be used in quantum computing, communication and sensing systems. The company's mission is to build quantum computers that are useful and available to people everywhere. Visit http://www.xanadu.ai or follow us on Twitter @XanaduAI.

About SDTC

Sustainable Development Technology Canada (SDTC) is a foundation created by the Government of Canada to advance clean technology innovation in Canada by funding and supporting small and medium-sized enterprises developing and demonstrating clean technology solutions. Follow Sustainable Development Technology Canada on Twitter: @SDTC

Source: Xanadu

Read more here:
Xanadu Receives $4.4M Investment to Advance its Photonic Quantum Computing Technology - HPCwire

Quantum Computing and Israel’s Growing Tech Role | Sam Bocetta – The Times of Israel

It's time to adjust to a world that is changing from the digital landscape we have grown accustomed to. Traditional computing is evolving as quantum computing takes center stage.

Traditional computing uses the binary system, a digital language made up of strings of 1s and 0s. Quantum computing is a nonbinary system that uses the qubit, which can exist as both 1 and 0 simultaneously, giving it a near-infinite number of positions and combinations. This computational ability far exceeds that of any other similar technology on the market today.

This new technology threatens to outpace our efforts in cyber defense and poses an interesting challenge to VPN companies, web hosts, and other similar industries that rely on traditional methods of standard encryption.

While leading tech giants all over the globe continue to pour vast sums into their R&D programs for quantum computing, Israel has been quick to recognize the importance of the emerging industry. The Startup Nation's engineers can be found toiling away in the fight to be at the frontier of the world's next big technological innovations.

Quantum computing provides unmatched efficiency at analyzing data. To understand the scope of it, consider the aforementioned classical computing style that encodes information in binary. Picture a string of 1s and 0s about 30 digits long. This string alone has more than one billion different combinations. A classical computer can only analyze each possibility one at a time. A quantum computer, however, thanks to a phenomenon known as superposition, can exist in each one of those billion states simultaneously. To match this unparalleled computing power, our classical computer would need a billion processors.
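A quick back-of-the-envelope check of those numbers in Python (illustrative only):

    n_bits = 30
    combos = 2 ** n_bits
    print(f"{combos:,}")   # 1,073,741,824: just over a billion bit strings

    # Simulating an n-qubit register classically means tracking one complex
    # amplitude per bit string, which is why even ~50 qubits overwhelm
    # classical memory (16 bytes per complex128 amplitude).
    print(f"{combos * 16 / 2**30:.0f} GiB to store a 30-qubit state")  # 16 GiB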

Consider how much time we spend using applications on the internet. Our data is constantly being stored, usually in large data centers far from us, thanks to cloud computing, which allows information to be stored and analyzed at a great distance from the user.

Tech ventures such as Microsoft Azure and Amazon AWS compete for the newest developments in this technology, knowing the positive effects it has on the web user's experience, such as access to the fastest response times, speedy data transfer, and the most powerful processing capabilities for AI.

Quantum computing has future applications in almost every facet of civilian life imaginable, including pharmaceuticals, energy, space, and more. Quantum computers could offer scientists the ability to work up close with virtual models unlike any they've had before, with the ability to analyze anything from complex chemical reactions to quantum systems. AI, the technology claiming to rival electricity in importance and implementation, is the ideal candidate for quantum computing because it often requires computations too demanding for current systems.

Really, the world is quantum computing's oyster.

The next Silicon Valley happens to be on the other side of the world from California. Israel has gained the attention of major players in the tech sector, including giants such as Intel, Amazon, Google, and Nvidia. The Startup Nation got its nickname from its large number of startups relative to its population, approximately one startup for every 1,400 residents. In a list of the top 50 global cities for the growing tech industry, Tel Aviv comes in at #15. Israel is wrapping up 2019 with an astonishing 102% jump in the number of tech mergers and acquisitions compared to the previous year, with no signs of slowing down.

Habana Labs and Annapurna Labs, both created by entrepreneur Avigdor Willenz, were recently acquired by Intel and Amazon, respectively, to further their development of quantum computing and more powerful processors. Google, Nvidia, Marvell, Huawei, Broadcom, and Cisco have also invested billions in capital into Israeli prospects.

One of Googles R&D centers located in Tel Aviv is actively heading the research on quantum computing. Just this year Google announced a major breakthrough that made other tech giants pick up the pace. They hinted at a computer chip that, with the power of quantum computing, was able to manage and analyze in one second the amount of data that would take a full day for any supercomputer.

While Israel is reaping the benefits of its current exposure thanks to big tech firms, an anonymous source is skeptical about the long-term success of Israel's foray into the tech world without increased education and government support to keep up with demand. Like other parts of the world, Israel has a shortage of the engineers necessary to drive development.

Recognizing the need to act fast, in 2017 Professor Uri Sivan of the Technion - Israel Institute of Technology led a committee dedicated to documenting the strengths and weaknesses of the current state of Israel's investment in quantum technology research and development. The committee found a lag in educational efforts and a need for more funding to keep pace with the fast growth of the industry.

In response to this need for funding, in 2018 Israel's Defense Ministry and the Israel Science Foundation announced a multi-year fund that would dedicate a total of $100 million to the research of quantum technologies, in the hope of securing Israel's global position as a top contributor to new technologies.

Classic cryptography relies on the relationship between a public key, a private key, and a classical computer's inability to reverse-engineer the private key from public information in order to decrypt sensitive data. While the underlying mathematical problems have proved too hard for classical computing, they are no match for a quantum computer.
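A toy RSA walkthrough makes that dependence concrete. The primes below are tiny textbook values for illustration; real keys use primes hundreds of digits long, which is exactly what Shor's algorithm would factor. (The modular inverses via pow require Python 3.8+.)

    # Toy RSA: security rests entirely on the difficulty of factoring n.
    p, q = 61, 53
    n, phi = p * q, (p - 1) * (q - 1)   # n = 3233 is public; p and q are secret
    e = 17                               # public exponent, coprime with phi
    d = pow(e, -1, phi)                  # private exponent; deriving it needs p and q

    msg = 42
    cipher = pow(msg, e, n)              # anyone can encrypt with the public (e, n)
    assert pow(cipher, d, n) == msg      # only the private key decrypts

    # An attacker who can factor n recovers the private key and reads everything.
    factor = next(c for c in range(2, n) if n % c == 0)
    d_attacker = pow(e, -1, (factor - 1) * (n // factor - 1))
    assert pow(cipher, d_attacker, n) == msg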

Organizations are recognizing this potential crisis and jumping to find a solution. The National Institute of Standards and Technology requested candidate post-quantum algorithms in 2016. IBM recently announced its own suite of quantum-safe encryption methods, known as CRYSTALS.

Current encryption methods are the walls in place that guard our personal information, from bank records and personal documents stored online to any data sent via the web, such as emails.

Just about any user with regular access to the web can benefit from the security that a VPN offers. A VPN not only protects the identity of your IP address but also secures the sensitive data that we are wont to throw into the world wide web. To understand how this works, consider the concept of a tunnel. Your data is shifted through this virtual VPN tunnel, which acts as a barrier to unwanted attacks and hackers. Now, this tunnel relies on standard encryption to hide your data. Quantum computing abilities, as they become more accessible and widespread, are going to essentially destroy the effectiveness of the standard encryption that such industries rely on.

Outside of the usual surfing and data-exposing that we do on the web, lots of us are also taking advantage of opportunities to create our own websites. However, even the best web hosts leave us high and dry in the new age of quantum computing abilities and the influx of spyware and malware. WordPress, one of the most popular web platforms, can easily fall vulnerable to SQL injections, cross-site scripting attacks, and cookie hijacking. The encryption measures that can be used to prevent such attacks are, you guessed it, hopeless in the face of quantum technologies.

The current state of modern technology is unsurprisingly complex and requires cybersecurity professionals with strong problem-solving skills and creativity to abate the threats we'll be facing within the next decade. In order to stay ahead of the game and guarantee an effective solution for web users, top VPN companies and web hosts need to invest in the research necessary to find alternatives to standard encryption. ExpressVPN, for example, has gone a step further with a kill switch that cuts traffic if the VPN disconnects unexpectedly, alongside its VPN tunneling.

The capacity for constant advancement in any field related to science and technology is what makes our world interesting. Decades ago, the abilities afforded by quantum computing would have sounded like an idea confined to an Isaac Asimov novel.

The reality of it is that quantum computing has arrived and science waits for no one. Professionals across digital industries need to shift their paradigms in order to account for this young technology that promises to remap the world as we know it.

Israel is full to the brim with potential and now is the time to invest resources and encourage education to bridge the gap and continue to be a major player in the global economy of quantum computing.

Read the rest here:
Quantum Computing and Israel's Growing Tech Role | Sam Bocetta - The Times of Israel

‘How can we compete with Google?’: the battle to train quantum coders – The Guardian

There is a laboratory deep within University College London (UCL) that looks like a cross between a rebel base in Star Wars and a scene imagined by Jules Verne. Hidden within the miles of cables, blinking electronic equipment and screens is a gold-coloured contraption known as a dilution refrigerator. Its job is to chill the highly sensitive equipment needed to build a quantum computer to close to absolute zero, the coldest temperature possible.

Standing around the refrigerator are students from Germany, Spain and China, who are studying to become members of an elite profession that has never existed before: quantum engineering. These scientists take the developments in quantum mechanics over the past century and turn them into revolutionary real-world applications in, for example, artificial intelligence, self-driving vehicles, cryptography and medicine.

The problem is that there is now what analysts call a quantum bottleneck. Owing to the fast growth of the industry, not enough quantum engineers are being trained in the UK or globally to meet expected demand. This skills shortage has been identified as a crucial challenge and will, if unaddressed, threaten Britain's position as one of the world's top centres for quantum technologies.

"The lack of access to a pipeline of talent will pose an existential threat to our company, and others like it," says James Palles-Dimmock, commercial director of London- and Oxford-based startup Quantum Motion. "You are not going to make a quantum computer with 1,000 average people; you need 10 to 100 incredibly good people, and that'll be the case for everybody worldwide, so access to the best talent is going to define which companies succeed and which fail."

This doesn't just matter to niche companies; it affects everyone. "If the UK is to remain at the leading edge of the world economy then it has to compete with the leading technological and scientific developments," warns Professor Paul Warburton, director of the CDT in Delivering Quantum Technologies. "This is the only way we can maintain our standard of living."

This quantum bottleneck is only going to grow more acute. Data is scarce, but according to research by the Quantum Computing Report and the University of Wisconsin-Madison, on one day in June 2016 there were just 35 vacancies advertised worldwide at commercial quantum companies. By December, that figure had leapt to 283.

In the UK, Quantum Motion estimates that the industry will need another 150-200 quantum engineers over the next 18 months. In contrast, Bristol University's centre for doctoral training produces about 10 qualified engineers each year.

In the recent past, quantum engineers would have studied for their PhDs in small groups inside much larger physics departments. Now there are interdisciplinary centres for doctoral training at UCL and Bristol University, where graduates in such subjects as maths, engineering and computer science, as well as physics, work together. As many of the students come with limited experience of quantum technologies, the first year of their four-year course is a compulsory introduction to the subject.

"Rather than work with three or four people inside a large physics department, it's really great to be working with lots of people all on quantum, whether they are computer scientists or engineers. They have a high level of knowledge of the same problems, but a different way of thinking about them because of their different backgrounds," says Bristol student Naomi Solomons.

While Solomons is fortunate to study on an interdisciplinary course, these are few and far between in the UK. "We are still overwhelmingly recruiting physicists," says Paul Warburton. "We really need to massively increase the number of PhD students from outside the physics domain to really transform this sector."

The second problem, according to Warburton, is competition with the US. "Anyone who graduates with a PhD in quantum technologies in this country is well sought after in the USA." The risk of lucrative US companies poaching UK talent is considerable. "How can we compete with Google or D-Wave if it does get into an arms race?" says Palles-Dimmock. "They can chuck $300,000-$400,000 at people to make sure they have the engineers they want."

There are parallels with the fast growth of AI. In 2015, Uber's move to gut Carnegie Mellon University's world-leading robotics lab of nearly all its staff (about 50 in total) to help it build autonomous cars showed what can happen when a shortage of engineers causes a bottleneck.

Worryingly, Doug Finke, managing editor at Quantum Computing Report, has spotted a similar pattern emerging in the quantum industry today. "The large expansion of quantum computing in the commercial space has encouraged a number of academics to leave academia and join a company, and this may create some shortages of professors to teach the next generation of students," he says.

More needs to be done to significantly increase the flow of engineers. One way is through diversity: Bristol has just held its first women in quantum event with a view to increasing its number of female students above the current 20%.

Another option is to create different levels of quantum engineers. "A master's degree or a four-year dedicated undergraduate degree could be the way to mass-produce engineers, because industry players often don't need a PhD-trained individual," says Turner. "But I think you would be training more a kind of foot soldier than an industry leader."

One potential roadblock could be growing threats to the free movement of ideas and people. "Nations seem to be starting to get a bit protective about what they're doing," says Prof John Morton, founding director of Quantum Motion. "[They] are often using concocted reasons of national security to justify retaining a commercial advantage for their own companies."

Warburton says he has especially seen this in the US. This reinforces the need for the UK to train its own quantum engineers. "We can't rely on getting our technology from other nations. We need to have our own quantum technology capability."

Read more here:
'How can we compete with Google?': the battle to train quantum coders - The Guardian

Why India is falling behind in the Y2Q race – Livemint

Now, the world faces a new scare that some scientists are calling the Y2Q ("years to quantum") moment. Y2Q, say experts, could be the next major cyber disruption. When this moment will come is not certain; most estimates range from 10 to 20 years. But one thing is certain: as things stand, India has not woken up to the implications (both positive and negative) of quantum computing.

What is quantum computing? Simply put, it is a future technology that will solve certain problems exponentially faster than classical computers, answering in a few seconds questions that today's fastest supercomputers can't.

Most importantly, a quantum computer would be able to factor the product of two big prime numbers. And that means the underlying assumptions powering modern encryption won't hold once a practical quantum computer becomes a reality. Encryption forms the backbone of a secure cyberspace. It helps to protect the data we send, receive or store.

So, a quantum computer could translate into a complete breakdown of current encryption infrastructure. Cybersecurity experts have been warning about this nightmarish scenario since the late 1990s.

In October, Google announced a major breakthrough, claiming its quantum computer can solve in 200 seconds a problem that would take even the fastest classical computer 10,000 years. That meant their computer had achieved "quantum supremacy", claimed the company's scientists. IBM, its chief rival in the field, responded that the claims should be taken "with a large dose of skepticism". Clearly, Google's news suggests a quantum future is not a question of if, but when.

India lags behind

As the US and China lead the global race in quantum technology, and other developed nations follow by investing significant intellectual and fiscal resources, India lags far behind. "The Indian government is late, but efforts have begun in the last two years," said Debajyoti Bera, a professor at Indraprastha Institute of Information Technology (IIIT) Delhi, who researches quantum computing.

Mint's interviews with academic researchers, private sector executives and government officials paint a bleak picture of India's ability to be a competent participant. For one, the ecosystem is ill-equipped: just a few hundred researchers living in the country work in this domain, and they largely work in discrete silos.

There are legacy reasons: India's weakness in building hardware and manufacturing technology impedes efforts to turn theoretical ideas into real products. Whatever little is moving is primarily through the government: private sector participation, and investment, remains lacklustre. And, of course, there's a funding crunch.

All this has left India's top security officials concerned. Lieutenant General (retd) Rajesh Pant, national cybersecurity coordinator, who reports to the Prime Minister's Office, identified many gaps in the Indian quantum ecosystem. "There is an absence of a quantum road map. There is no visibility in the quantum efforts and successes, and there is a lack of required skill power," Pant said at an event in December, while highlighting the advances China has made in the field. "As the national cybersecurity coordinator, this is a cause of concern for me."

The task at hand

In a traditional computer (for instance, your phone or laptop) every piece of information, be it text or video, is ultimately a long string of "bits": each bit can be either zero or one. No other value is possible. In a quantum computer, "bits" are replaced by "qubits", where each unit can exist in both states, zero and one, at the same time. That makes the processing superfast: qubits can encode and process more information than bits.
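A minimal numpy sketch, purely for illustration, of what "both states at the same time" means formally: a qubit is a unit vector of two complex amplitudes, and a measurement returns 0 or 1 with probabilities given by the squared amplitudes (the Born rule).

    import numpy as np

    # A qubit state is a pair of amplitudes (a, b) with |a|^2 + |b|^2 = 1.
    plus = np.array([1, 1]) / np.sqrt(2)     # equal superposition of 0 and 1

    probs = np.abs(plus) ** 2                # Born rule: P(0) and P(1)
    rng = np.random.default_rng(0)
    samples = rng.choice([0, 1], size=1000, p=probs)
    print(probs, samples.mean())             # [0.5 0.5], roughly 0.5

    # Two qubits need 4 amplitudes and n qubits need 2**n; this exponential
    # state space is what "encode more information than bits" refers to.
    print(np.kron(plus, plus))               # [0.5 0.5 0.5 0.5]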

What's most vulnerable is information generated today that has long-term value: diplomatic and military secrets or sensitive financial and healthcare data. "The information circulating on the internet that is protected with classical encryption can be harvested by an adversary. Whenever the decryption technology becomes available with the advent of quantum computers, today's secrets will break apart," explains Vadim Makarov, the chief scientist running Russia's quantum hacking lab.

From a national security perspective, there are two threads in global efforts. One is to build a quantum computer: whoever gets there first will have the capability to decrypt the secrets of the rest. Two, every country is trying to make its own communications hack-proof and secure.

The Indian game plan

"There are individual programmes operating across government departments in India. The ministry of electronics and information technology is interested in computing aspects; DRDO in encryption products and Isro in satellite communication," said a senior official at the department of science and technology (DST) who is directly involved in formulating India's quantum policy initiatives, on condition of anonymity. DRDO is the Defence Research and Development Organisation, and Isro is the Indian Space Research Organisation. The mandate of DST, which works under the aegis of the central ministry of science and technology, revolves around advancing scientific research.

To that end, in 2019, DST launched Quantum Information Science and Technology (QuEST), a programme wherein the government will invest ₹80 crore over the next three years to fund research directed at building quantum computers, channels for quantum communication and cryptography, among other things. Some 51 projects were selected for funding under QuEST. A quarter of the money has been released, said the DST official.

K. VijayRaghavan, principal scientific adviser, declined to be interviewed for this story. However, in a recent interview with The Print, he said: "It [QuEST] will ensure that the nation reaches, within a span of 10 years, the goal of achieving the technical capacity to build quantum computers and communications systems comparable with the best in the world, and hence earn a leadership role."

Not everyone agrees. "While QuEST is a good initiative and has helped build some momentum in academia, it is too small to make any meaningful difference to the country," said Sunil Gupta, co-founder and chief executive of QNu Labs, a Bengaluru-based startup building quantum-safe encryption products. "India needs to show their confidence and trust in startups." He added that the country needs to "up the ante by committing at least $1 billion in this field for the next three years" if India wants to make any impact at the global level.

More recently, DRDO announced a new initiative: five DRDO Young Scientists Laboratories, launched by Prime Minister Narendra Modi in January with the aim of researching and developing futuristic defence technologies. One lab, set up at the Indian Institute of Technology Bombay, is dedicated to quantum technology.

The DST official said that the government is planning to launch a national mission on quantum technology. "It will be a multi-departmental initiative to enable different agencies to work together and focus on the adoption of research into technology," the official said, adding that the mission will have "clearly defined deliverables for the next 5 to 10 years." While the details are still in the works, the official said equipping India to build quantum-secure systems is on the cards.

The flaws in the plan

Why is India lagging behind? First, India doesn't have enough people working on quantum technology: estimates differ, but they fall in the range of 100-200 researchers. "That is not enough to compete with IBM," said Anirban Pathak, a professor at Jaypee Institute of Information Technology and a recipient of DST's QuEST funding.

Contrast that with China. "One of my former students is now a faculty member at a Chinese university. She joined a group that started just two years ago and they already have 50 faculty members on the staff," added Pathak. "In India, at no place will you find more than three faculty members working in quantum."

IIIT Delhi's Bera noted: "A lot of Indians in quantum are working abroad. Many are working at IBM to build a quantum computer. India needs to figure out a way to get those people back here."

Secondly, there's the lack of a coordinated effort. "There are many isolated communities in India working on various aspects: quantum hardware, quantum key distribution, information theory and other fields," said Bera. "But there is not much communication across the various groups. We cross each other mostly at conferences."

Jaypee's Pathak added: "In Delhi, there are eight researchers working in six different institutes. Quantum requires many kinds of expertise, and that is needed under one roof. We need an equivalent of Isro (for space) and Barc (for atomic research) for quantum."

Third is India's legacy problem: strong on theory, but weak in hardware. That has a direct impact on the country's ability to advance in building quantum technology. The lack of research is not the impediment to preparing for a quantum future, say experts; implementation is the challenge, the real bottleneck. The DST official quoted earlier acknowledged that some Indian researchers he works with are frustrated.

"They need infrastructure to implement their research. For that, we need to procure equipment, install it and then set it up. That requires money and time," said the official. "The Indian government has recognized the gap and is working towards it."

Bera said that India should start building a quantum computer. But the problem is that the country doesn't even have good fabrication labs. "If we want to design chips, Indians have to outsource," he said. "Hardware has never been India's strong point." QNu Labs is trying to fill that gap. The technology it is developing is based on research done over a decade ago: the effort is to build hardware and make it usable.

Finally, India's private sector and investors have not stepped up their game. "If India wants something bigger, Indian tech giants like Wipro and Infosys need to step in. They have many engineers on the bench who can be involved. Academia alone, or DST-funded projects, can't compete with IBM," said Pathak.

The DST official agreed. "R&D is good for building prototypes. But industry partnership is crucial for implementing it in the real world," he said. One aim of the national quantum mission in the works would be to spin off startup companies and feed innovation into the ecosystem. "We plan to bring venture capitalists (VCs) under one umbrella."

In conclusion

Pant, the national cybersecurity chief, minced no words at the event in December 2019 on quantum technology.

"In 1993, there was an earthquake in Latur and we created the National Disaster Management Authority, which now has a presence across the country." He added: "Are we waiting for a cybersecurity earthquake to strike before we get our act together?"

Samarth Bansal is a freelance journalist based in Delhi. He writes about technology, politics and policy

Here is the original post:
Why India is falling behind in the Y2Q race - Livemint

2020s — the Decade of AI and Quantum – Inside Higher Ed

Too often, we look ahead assuming that the technologies and structures of today will be in place for years to come. Yet a look back confirms that change has moved at a dramatic pace in higher education.

Reviewing the incredible progress each decade brings makes me wonder, if I knew at the beginning of the decade what was coming, how might I have better prepared?

Make no mistake, we have crossed the threshold into the fourth industrial revolution, which will advance most markedly this decade through maturing artificial intelligence, ultimately driven by quantum computing. The changes will come at an ever-increasing rate as the technologies and societal demands accelerate. Digital computers advanced over the past half century at approximately the rate described by Moore's Law, with processing power doubling every two years. Now we are entering the era of Neven's Law, which predicts progress in quantum computing at a doubly exponential rate. This means change at a dizzyingly rapid pace that will leave many of us unable to comprehend the why and barely able to digest the daily advances that will describe reality. New platforms, products and processes will proliferate in this new decade.
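To make the exponential-versus-doubly-exponential contrast concrete, a toy calculation in Python (nothing more than arithmetic):

    # Moore's Law: capability doubles each step, i.e. 2**k.
    # Neven's Law: doubly exponential growth, i.e. 2**(2**k).
    for k in range(1, 6):
        print(k, 2 ** k, 2 ** (2 ** k))
    # By k = 5 the exponential path has reached 32; the doubly
    # exponential path has reached 4,294,967,296.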

That includes higher education. The centuries-old model of the faculty member at a podium addressing a class of students who are inconsistently and inaccurately taking notes on paper or laptop will seem so quaint, inefficient and impractical that it will be laughable. Observers in 2030 will wonder how any significant learning even took place in that environment.

Semesters and seat time will not survive the coming decade. Rooted in 19th- and 20th-century societal needs, they are long overdue to pass away. The logical and efficient structure of outcomes-based adaptive learning will quickly overtake the older methods, doing away with redundancy for advanced students and providing developmental learning for those in need. Each student will be at the center of their learning experience, with AI algorithms, fed by rich data about each student, mapping progress and adjusting the pathway for each learner. This will lead to personalized learning, where courses and curriculum are custom-made to meet the needs of the individual learner. Yet it will also serve to enhance the social experience for learners meeting face-to-face. In a report from Brookings on the topic, researchers stated that technology can help education leapfrog in a number of ways, by providing individualized learning that tracks progress and personalizes activities to serve heterogeneous classrooms.
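One classic ingredient of such adaptive-learning engines is Bayesian Knowledge Tracing, which updates an estimate of skill mastery after every answer. A minimal sketch follows; the parameter values and threshold logic are illustrative, not drawn from any cited system.

    def bkt_update(p_know, correct, slip=0.1, guess=0.2, learn=0.15):
        # One Bayesian Knowledge Tracing step: revise the probability that
        # a student has mastered a skill, given one correct/incorrect answer.
        if correct:
            evidence = p_know * (1 - slip)
            posterior = evidence / (evidence + (1 - p_know) * guess)
        else:
            evidence = p_know * slip
            posterior = evidence / (evidence + (1 - p_know) * (1 - guess))
        # Allow for the chance the student learned the skill on this step.
        return posterior + (1 - posterior) * learn

    p = 0.3                            # prior estimate of mastery
    for answer in (True, True, False, True):
        p = bkt_update(p, answer)
        print(round(p, 3))             # the pathway adapts as p crosses thresholds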

Early implementations of adaptive learning in the college setting have shown that this AI-driven process can result in greater equity and success for students. In addition, faculty members find that their role becomes even more important as they directly interact with individual students to enable and facilitate their learning.

Increasingly we are gathering data about our students as they enter and progress through learning at our institutions. That big data is the "food" upon which artificial intelligence thrives. Sorting through volumes and varieties of data that in prior decades we could not efficiently process, AI can now uncover cause and effect pairs and webs. It can lead us to enhancements and solutions that previously were beyond our reach. As the pool of data grows and becomes more and more diverse -- not just numbers, but also videos and anecdotes -- the role of quantum computing comes into play.

While it is unlikely we will see quantum computers physically on the desks of university faculty and staff in the coming decade, we certainly will see cloud use of quantum computers to solve increasingly complex problems. Quantum computers will interact with digital computers to apply deep learning at an as yet unseen scale. We will be able to pose challenges such as "what learning will researchers need to best prepare for the next generation of genetic advancement?" Faster than the blink of an eye, the quantum computers will respond.

It turns out that major developments are occurring every day in the advancement of quantum computing. Johns Hopkins University researchers recently discovered a superconducting material that may more effectively host qubits in the future. And Oxford University researchers just uncovered ways in which strontium ions can be much more efficiently entangled for scaling quantum computers. Advancements such as these will pave the path to ever more powerful computers that will enable ever more effective adaptive, individualized and personalized learning.

We know that change is coming. We know the direction of that change. We know some of the actual tools that will be instrumental in that change. Armed with that knowledge, what can we do today to prepare for the decade of the 2020s? Rather than merely reacting to changes after the fact, can we take steps to anticipate and prepare for that change? Can our institutions be better configured to adapt to the changes that are on the horizon? And who will lead that preparation at your institution?

Read the rest here:
2020s -- the Decade of AI and Quantum - Inside Higher Ed

Alibaba’s 10 Tech Trends to Watch in… – Alizila

The Alibaba DAMO Academy, Alibaba Group's global program for tackling ambitious, high-impact technology research, has made some predictions about the trends that will shape the industry in the year ahead. From more-advanced artificial intelligence to large-scale blockchain applications, here's what you can expect in 2020.

1. Artificial Intelligence Gets More Human

2020 is set to be a breakthrough year for AI, according to DAMO. Researchers will take inspiration from a host of new areas to upgrade the technology, namely cognitive psychology and neuroscience combined with insights into human behavior and history. They'll also adopt new machine-learning techniques, such as continual learning, which allows machines to remember what they've learned in order to learn new things more quickly, something humans take for granted. With these advances in cognitive intelligence, machines will be able to better understand and make use of knowledge rather than merely perceive and express information.

2. The Next Generation of Computation

Computers these days send information back and forth between the processor and the memory in order to complete tasks. The problem? Computing demands have grown to such an extent in the digital age that our computers can't keep up. Enter processing-in-memory architecture, which integrates the processor and memory into a single chip for faster processing. PIM innovations will play a critical role in spurring next-generation AI, DAMO said.

3. Hyper-Connected Manufacturing

The rapid deployment of 5G, Internet of Things and cloud- and edge-computing applications will help manufacturers go digital, including everything from automating equipment, logistics and production scheduling to integrating their factory, IT and communications systems. In turn, DAMO predicts, they'll be faster to react to changes in demand and able to coordinate with suppliers in real time to improve productivity and profitability.


4. Machines Talking to Machines at Scale

More-advanced IoT and 5G will enable larger-scale deployments of connected devices, which bring with them a range of benefits for governments, companies and consumers. For example, traffic-signal systems could be optimized in real time to keep drivers moving (and happy), while driverless cars could access roadside sensors to better navigate their surroundings. These technologies would also allow warehouse robots to maneuver around obstacles and sort parcels, and fleets of drones to make last-mile deliveries efficiently and securely.

5. Chip Design Gets Easier

Have you heard? Moore's Law is dying. It is now becoming too expensive to build faster and smaller semiconductors. In its place, chipmakers are piecing together smaller chiplets into single wafers to handle more-demanding tasks. Think Legos. Another advantage of chiplets is that they often use already-inspected silicon, speeding up time to market. Barriers to entry in chipmaking are dropping, too, as open-source communities provide alternatives to traditional, proprietary design. And as more companies design their own custom chips, they are increasingly contributing to a growing ecosystem of development tools, product information and related software that will enable still easier and faster chip design in the future.

6. Blockchain Moves Toward Mainstream

The nascent blockchain industry is about to see some changes of its own. For one, expect the rise of the blockchain-as-a-service model to make these applications more accessible to businesses. Also, there will be a rise in specialized hardware chips for cloud and edge computing, powered by core algorithms used in blockchain technologies. Scientists at DAMO forecast that the number of new blockchain applications will grow significantly this year, as well, while blockchain-related collaborations across industries will become more common. Lastly, the academy expects large-scale blockchain applications to see wide adoption.

7. A Turning Point for Quantum Computing

Recent advancements in this field have stirred up hopes of making large-scale quantum computers a reality, which will prompt more investment in quantum R&D, according to DAMO. That will result in increased competition and ecosystem growth around quantum technologies, as well as more attempts to commercialize the technology. DAMO predicts that after a difficult but critical period of intensive research in the coming years, quantum information science will deliver breakthroughs such as computers that can correct computation errors in real time.

8. More Revolution in Semiconductors

Demand is surging for computing power and storage, but major chipmakers still haven't developed a better solution than 3-nanometer-node silicon-based transistors. Experiments in design have led to the discovery of other materials that might boost performance. Topological insulators and two-dimensional superconducting materials, for example, may become connective materials, as their properties allow electrical currents to flow without resistance. New magnetic and resistive switching materials might also be used to create next-generation magnetic memory technology, which can run on less power than its predecessors.

9. Data Protection Powered by AI

As businesses face a growing number of data-protection regulations, and rising compliance costs to meet them, interest is growing in new solutions that support data security. AI algorithms can do that. They help organizations manage and filter through information, protect user information shared across multiple parties and make regulatory compliance easier, or even automatic. These technologies can help companies promote trust in the reuse and sharing of analytics, as well as overcome problems such as data silos, where certain information is not accessible to an entire organization and causes inefficiencies as a result.

10. Innovation Starts on the Cloud

Cloud computing has evolved far beyond its intended purpose as technological infrastructure to take on a defining role in IT innovation. Today, the cloud's computing power is the backbone of the digital economy, transforming the newest, most-advanced innovations into accessible services. From semiconductor chips, databases and blockchain to IoT and quantum computing, nearly all technologies are now tied to cloud computing. It has also given rise to new technologies, such as serverless computing architecture and cloud-powered robotic automation.

Originally posted here:
Alibaba's 10 Tech Trends to Watch in... - Alizila