Tech News: Neuromorphic computing and the brain-on-a-chip in your pocket

By Louis Fourie Jul 10, 2020


JOHANNESBURG - The human brain is relatively small, uses about 20 watts of power and can accomplish an astonishing number of complex tasks. In contrast, the machine learning algorithms that are growing in popularity need large, powerful computers and data centres that consume megawatts of electricity.

Artificial Intelligence (AI) produces astounding achievements: recognising images with greater accuracy than humans, holding natural conversations, beating humans at sophisticated games, and driving vehicles in heavy traffic.

AI is indeed a disruptive power of the Fourth Industrial Revolution, currently driving advances in numerous fields, from medicine to weather prediction. However, all of these advances require enormous amounts of computing power and electricity to develop, train and run the algorithms.

According to Elon Musk, the computing power and electricity consumption of AI machines double every three to four months, which is becoming a major concern for environmentalists.

But it seems that we can learn something from nature in our endeavour to address the high electricity consumption of AI and powerful machines, and their resultant contribution to the climate crisis.

A branch of computer chip design focuses on mimicking the biological brain to create super-efficient neuromorphic chips that will bring AI from powerful, energy-hungry machines right into our pockets.

Neuromorphic computing

Neuromorphic computing is the next generation of AI and entails very-large-scale integration (VLSI) systems containing electronic analog circuits to mimic neuro-biological architectures present in the biological nervous system.

This form of AI has more in common with human cognition than with conventional computer logic.

In November 2017, Intel Labs introduced Loihi, a fifth-generation self-learning neuromorphic research test chip containing some 130 000 neurons, to provide a functional system for researchers to implement Spiking Neural Networks (SNNs) that emulate the natural neural networks in biological brains.

Each neuron in the SNN can fire or spike independently and send pulsed signals with encoded information to other neurons, thereby simulating the natural learning process by dynamically remapping the synapses between the artificial neurons in response to stimuli.
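To give a concrete sense of what "spiking" means, here is a short Python sketch of a single leaky integrate-and-fire neuron, a common textbook model of the behaviour described above. It is purely illustrative: the threshold, leak and input values are invented and it does not represent Loihi's actual circuitry.

```python
# Minimal leaky integrate-and-fire neuron, a common software model of a
# spiking neuron. Illustrative only; parameter values are made up.
def simulate_lif(input_currents, threshold=1.0, leak=0.9, reset=0.0):
    """Return a list of 0/1 spikes for a sequence of input currents."""
    potential = 0.0
    spikes = []
    for current in input_currents:
        potential = leak * potential + current  # integrate the input, with leakage
        if potential >= threshold:              # fire once the threshold is crossed
            spikes.append(1)
            potential = reset                   # reset the membrane potential after a spike
        else:
            spikes.append(0)
    return spikes

# A steady weak input drives the neuron to spike every few time steps.
print(simulate_lif([0.3] * 20))
```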

MIT & memristors

About a month ago, engineers at the Massachusetts Institute of Technology (MIT) published a paper in the prestigious journal Nature Nanotechnology announcing that they had designed a brain-on-a-chip consisting of thousands of artificial brain synapses known as memristors.

A memristor is a silicon-based electronic memory device that mimics the information-transmitting synapses in the human brain in order to carry out complex computational tasks. The neuromorphic chip, smaller than a piece of confetti, is so powerful that a small portable device could now easily handle the convoluted computational tasks currently carried out by today's supercomputers.

Artificial neural networks are nothing new. Until now, however, synapse networks have existed only as software. MIT has built real neural-network hardware that makes small, portable AI systems possible, cutting the power consumption of AI networks by about 95 percent.

Just imagine connecting a small neuromorphic device to a camera in your car and having it recognise lights and objects and make a decision immediately, without having to connect to the Internet. This is exactly what this new energy-efficient MIT chip will make possible, on-site and in real time.

Memristors, or memory resistors, are an essential component of neuromorphic computing. In a neuromorphic device, a memristor plays the role that a transistor plays in a conventional circuit; however, its behaviour more closely resembles that of a brain synapse (the junction between two neurons), which receives signals from one neuron in the form of ions and sends an equivalent signal to the next neuron.

The computers in our phones and laptops currently use separate digital components for processing and memory, so information is continuously transferred between the components. The new MIT chip computes all the inputs in parallel within the memory itself, using analog circuits in a similar way to the human brain, which significantly reduces the amount of data that needs to be transferred and saves a great deal of electricity.

Since memristors are not binary like the transistors in a conventional circuit, but can take on many values, they can carry out a far wider range of operations. This means that memristors could enable smaller portable devices that do not rely on supercomputers, or even on connections to the Internet and cloud processing.
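To illustrate the compute-in-memory idea in the two paragraphs above, here is a small NumPy sketch in which network weights are stored as multi-level (non-binary) conductances and a whole matrix-vector product is formed by summing "currents" along each row. The array sizes, the number of conductance levels and the variable names are all hypothetical; they are not parameters of the MIT chip.

```python
import numpy as np

# Toy compute-in-memory sketch: weights live on the chip as memristor
# conductances with a limited number of analog levels (not just 0/1),
# and an entire matrix-vector product is formed by summing contributions
# along each row. All sizes and values below are made up for illustration.
rng = np.random.default_rng(0)

weights = rng.uniform(0.0, 1.0, size=(4, 8))    # target weights of a tiny layer
levels = 16                                     # assumed number of programmable conductance levels
conductances = np.round(weights * (levels - 1)) / (levels - 1)  # quantise weights onto those levels

input_voltages = rng.uniform(0.0, 1.0, size=8)  # input signal applied across the columns

# Each output is the sum of voltage * conductance along one row; in hardware
# every row's multiply-accumulate would happen at once in the analog domain.
output_currents = conductances @ input_voltages
print(output_currents)
```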

To overcome the challenges of reliability and scalability, the MIT researchers used a new kind of silicon-based, alloyed memristor. Until now, the ions flowing in memristors made from unalloyed material scattered easily as the components got smaller, leading to inferior fidelity and computational reliability: stored images were often of poorer quality.

However, an alloy of the conventional silver with copper, a metal that readily forms silicides (compounds of silicon with more electropositive elements), stabilises the flow of ions between the electrodes, allowing the number of memristors on a small chip to be scaled up without sacrificing quality or functionality. After a visual task was stored and reproduced numerous times, the resulting images were much crisper and clearer than those produced by existing memristor designs using unalloyed elements.

The MIT researchers are not the first to create chips to carry out processing in memory to reduce power consumption of neural nets.

However, it is the first time the approach has been used to run the powerful convolutional neural networks popular in image-based AI applications. This will certainly open up the possibility of using more complex convolutional neural networks for image and video classification in the Internet of Things in the future. Although much work still needs to be done, the new MIT chip also opens up opportunities to build more AI into devices such as smartphones, household appliances, Internet of Things devices, and self-driving cars, where powerful low-power AI chips are needed.
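For readers unfamiliar with the convolutional networks mentioned above, the minimal NumPy sketch below shows the core operation they repeat many times: sliding a small kernel over an image and summing elementwise products. It is a generic illustration, not code for the MIT chip; the 6x6 "image" and the edge-detection kernel are made-up examples.

```python
import numpy as np

# A single 2D convolution, the operation that convolutional neural networks
# repeat many times over an image. Generic NumPy illustration only.
def conv2d(image, kernel):
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # Multiply the kernel against one image patch and sum the products.
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.arange(36, dtype=float).reshape(6, 6)  # toy 6x6 "image"
edge_kernel = np.array([[1.0, 0.0, -1.0]] * 3)    # simple vertical edge detector
print(conv2d(image, edge_kernel))
```

Each of those patch-times-kernel sums is, in principle, exactly the kind of multiply-accumulate that an analog memristor crossbar can perform in place, which is why compute-in-memory hardware is a natural fit for convolutional networks.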

Companies & chips

MIT is not the only institution working on making AI more suitable for smaller devices. Apple has already integrated its Neural Engine into the iPhone X to power its facial recognition technology. Amazon is developing its own custom AI chips for the next generation of its Echo digital assistant.

The big chip companies are also working on the energy efficiency of their chips as they increasingly build advanced capabilities like machine learning into them. At the beginning of this year, ARM unveiled new chips capable of AI tasks such as translation and the detection of faces in images. Even Qualcomm's new Snapdragon mobile chips are heavily focused on AI.

Going even further, IBM and Intel are developing neuromorphic chips. IBM's TrueNorth and Intel's Loihi can run powerful machine learning tasks on a fraction of the power of conventional chips.

The cost of AI and machine learning is also declining dramatically. The cost of training an image recognition algorithm decreased from around R17 000 in 2017 to about R170 in 2019.

The cost of running such an algorithm has decreased even more: classifying a billion images cost R17 000 in 2017, but just R0.51 in 2019.

There is little doubt that as neuromorphic chips advance further in the years to come, the trends of miniaturisation, increased performance, lower power consumption, and much lower AI costs will continue.

It may not be too long before we carry some serious AI, or artificial brains, in our pockets, able to outperform current supercomputers, just as our cellphones are already more powerful than the supercomputers of years gone by. AI will be in our pockets, as well as in numerous other devices. It will increasingly be part of our lives, making decisions on our behalf, guiding us, and automating many current tasks.

The Fourth Industrial Revolution is fundamentally changing engineering and making things possible that we could only dream of before.

Professor Louis C H Fourie is a futurist and technology strategist.

BUSINESS REPORT
