Intel Corp. recently delivered 50 million artificial neurons to Sandia National Laboratories, roughly equivalent to the brain of a small mammal. The shipment is the first in a three-year series, by the end of which the number of neurons in the final experimental model is expected to reach one billion or more.
The collaboration aims to push neuromorphic computing to new heights by prototyping the software, algorithms, and architectures it will require. “With a neuromorphic computer of this scale, we have a new tool to understand how brain-based computers can do impressive feats that we cannot currently do with ordinary computers,” said Craig Vineyard, project leader at Sandia.
Researchers believe that improved algorithms and computer circuitry can broaden the applications of neuromorphic computers. They also hope to determine whether brain-inspired processors can handle information with something approaching the processing power of the human brain. With these developments in mind, let us explore neuromorphic computers and how they aim to revolutionise AI application areas.
Neuromorphic Computing Promises An AI Revolution
While current CPUs and GPUs power supercomputers to exceptional levels and have driven tremendous growth in AI applications, researchers still face shortfalls: machine reasoning, transfer learning, physical dimensions, and excessive energy consumption, to name a few. The AI developed today is narrow and learns only from the data provided to it.
Current machine learning algorithms and deep neural networks contain multiple layers of processing. The accuracy of these networks improves only when they train on more data, which demands massive computing power. Enter neuromorphic AI, which aims to bring a new wave of AI applications.
Neuromorphic computers aim to deliver the highest computing speeds while reducing the need for bulky devices and dedicated buildings. It is interesting to note that current supercomputers need power in megawatts, whereas the human brain consumes about 20 watts, an efficiency researchers are keen to replicate in computers.
How Do They Work?
Neuromorphic computing essentially involves assembling artificial neurons that function on the principles of the human brain. Its artificial components pass information in a manner similar to living neurons, pulsing electrically only when a synapse in a complex circuit has absorbed enough charge to produce an electrical spike. The approach mimics the human brain, which has over 100 billion neurons along with neuromodulators that reshape its circuits according to the task being performed.
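The charge-accumulation behaviour described above can be sketched with a toy leaky integrate-and-fire (LIF) neuron. This is a minimal illustration, not code for any real neuromorphic chip; the threshold, leak factor, and input values are arbitrary assumptions chosen to show the mechanism.

```python
# Toy leaky integrate-and-fire (LIF) neuron: charge accumulates with a
# leak each step, and the neuron fires a spike only once the membrane
# potential crosses a threshold. All parameter values are illustrative.

def simulate_lif(inputs, threshold=1.0, leak=0.9):
    """Integrate incoming charge each time step; emit a spike (1) and
    reset when the accumulated potential crosses the threshold."""
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = potential * leak + current  # leaky integration
        if potential >= threshold:
            spikes.append(1)   # enough charge absorbed: fire a pulse
            potential = 0.0    # reset after spiking
        else:
            spikes.append(0)   # stay silent
    return spikes

# A steady sub-threshold input makes the neuron fire only periodically.
print(simulate_lif([0.3] * 10))  # → [0, 0, 0, 1, 0, 0, 0, 1, 0, 0]
```

Note that the neuron produces output only at spike times; between spikes it merely integrates, which is the behaviour neuromorphic hardware exploits to save energy.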
Neuromorphic computing works on Spiking Neural Networks (SNNs), in which each “neuron” sends independent signals to other neurons, emulating the natural neural networks found in biological brains. According to a blogpost by Intel, each “neuron” in an SNN can fire independently of the others and, in doing so, sends pulsed signals to other neurons in the network that directly change their electrical states. By encoding information in the signals themselves and in their timing, SNNs simulate natural learning processes by dynamically remapping the synapses between artificial neurons in response to stimuli.
The responses to spikes can be modulated to represent a continuum of values rather than just ‘0’ or ‘1’, giving the system an analogue flavour closer to the way the brain works. Moreover, because neurons compute only when spiked, they are not continually consuming energy. A neuromorphic system therefore uses far less electrical power, and weighs much less, than today’s personal computers.
While this looks like an exciting area to explore, researchers have had limited success so far, because human brains operate very differently from the mainstream computer architectures on which today’s algorithms run. The critical challenge in neuromorphic research is matching the human brain’s flexibility, and its ability to learn from unstructured stimuli, with the brain’s energy efficiency.
“But brains operate differently, more like the interconnected graphs of airline routes, than any yes-no electronic circuit typically used in computing. So, a lot of brain-like algorithms struggle because current computers aren’t designed to execute them,” said Vineyard.
In terms of neural-inspired computing uses, we are still in the infancy stages. While the Intel–Sandia collaboration aims to explore AI for commercial and defence areas, not many use cases can be listed yet. However, the technology has been explored in areas such as self-driving cars, classifying vapours, picking an individual’s face out of a set of random images, and more.
Researchers are hopeful that neuromorphic computers will improve machine learning in more complex fields, such as remote sensing and intelligence analysis. The technology has also been explored for computational physics simulations and other numerical algorithms.
The Way Forward
Many companies and projects are leading applications in this space. For instance, as part of its Loihi project, Intel has created the Loihi chip, with 130,000 neurons and 130 million synapses, which excels at self-learning. As the blogpost notes, “… because the hardware is optimised specifically for SNNs, it supports dramatically accelerated learning in unstructured environments for systems that require autonomous operation and continuous learning, with extremely low power consumption, plus high performance and capacity.”
IBM’s TrueNorth aims to revolutionise brain-inspired computing. Built under the DARPA SyNAPSE programme, the one-million-neuron brain-inspired processor consumes merely 70 milliwatts while performing 46 billion synaptic operations per second per watt. Other companies, including HPE, Qualcomm, and Samsung Electronics, are also exploring neuromorphic computing. In fact, according to one study, the global market for neuromorphic chips, estimated at $2.3 billion in 2020, is projected to reach $10.4 billion by 2027. These numbers suggest that neuromorphic computers are the way ahead in AI-based research and development.
Srishti currently works as Associate Editor at Analytics India Magazine. When not covering the analytics news, editing and writing articles, she could be found reading or capturing thoughts into pictures.