Intel Corp. recently delivered fifty million artificial neurons to Sandia National Laboratories, roughly the equivalent of a small mammal's brain. The shipment is the first in a three-year series, by the end of which researchers expect the final experimental model to contain 1 billion neurons or more. The collaboration aims to push neuromorphic computing to new heights while prototyping the software, algorithms, and architectures it will need.

"With a neuromorphic computer of this scale, we have a new tool to understand how brain-based computers can do impressive feats that we cannot currently do with ordinary computers," said Craig Vineyard, project leader at Sandia.

Researchers believe that improved algorithms and computer circuitry can open up broader applications for neuromorphic computers. They also hope to determine whether brain-inspired processors can handle information with something approaching the processing efficiency of the human brain. With these developments in mind, let us explore neuromorphic computers and how they aim to revolutionise AI application areas.

Neuromorphic Computing Promises An AI Revolution

While current CPUs and GPUs power supercomputers to exceptional levels and have driven tremendous growth in AI applications, researchers still face shortfalls: machine reasoning, transfer learning, physical size, and excessive energy consumption, to name a few. The AI developed today is narrow and learns only from the data provided to it.

Current machine learning algorithms, or deep neural networks, contain multiple layers of processing. The accuracy of these networks increases only if they are trained on more data, which demands massive computing power. Enter neuromorphic AI, which aims to bring a new wave of AI applications. Neuromorphic computers promise very high computing speeds while reducing the need for bulky devices and dedicated buildings.
It is interesting to note that current supercomputers need power in megawatts, whereas the human brain consumes about 20 watts, an efficiency researchers are obsessed with replicating in computers.

How Do They Work?

Neuromorphic computing essentially involves assembling artificial neurons that function on the principles of the human brain. Its artificial components pass information in a manner similar to living neurons, pulsing electrically only when a synapse in a complex circuit has absorbed enough charge to produce an electrical spike. It tries to mimic the way the human brain works, with its more than 100 billion neurons and the neuromodulators that reshape its circuits according to the task at hand.

Neuromorphic hardware runs Spiking Neural Networks, or SNNs, in which each "neuron" sends independent signals to other neurons, emulating the natural neural networks found in biological brains. According to a blogpost by Intel, each "neuron" in an SNN can fire independently of the others, and in doing so it sends pulsed signals to other neurons in the network that directly change the electrical states of those neurons. By encoding information within the signals themselves and their timing, SNNs simulate natural learning processes by dynamically remapping the synapses between artificial neurons in response to stimuli.

The responses to spikes can be modulated to represent a continuum of values rather than just '0' or '1', giving the hardware an analogue flavour that is closer to the way the brain works. Also, because neurons do work only when they spike, they are not continually consuming energy, which saves power. A neuromorphic computer therefore uses far less electrical power and weighs much less than today's personal computers.
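The charge-accumulate-and-spike behaviour described above can be illustrated with a toy leaky integrate-and-fire neuron, the classic building block of SNN simulations. This is a minimal sketch for intuition only; the class name and all parameter values (threshold, leak, input current) are illustrative assumptions, not taken from Intel's or anyone else's hardware.

```python
class LIFNeuron:
    """Toy leaky integrate-and-fire neuron: charge accumulates over
    time, leaks gradually, and a spike is emitted only when the
    membrane potential crosses a threshold."""

    def __init__(self, threshold=1.0, leak=0.9):
        self.threshold = threshold  # potential needed to fire a spike
        self.leak = leak            # fraction of charge retained each step
        self.potential = 0.0        # current membrane potential

    def step(self, input_current):
        # Leak some stored charge, then add the incoming current.
        self.potential = self.potential * self.leak + input_current
        if self.potential >= self.threshold:
            self.potential = 0.0    # reset after firing
            return 1                # spike
        return 0                    # silent: no work, no energy spent

neuron = LIFNeuron()
spikes = [neuron.step(0.4) for _ in range(6)]
print(spikes)  # spikes only when enough charge has accumulated
```

Note that between spikes the neuron does nothing but passively accumulate charge, which is the intuition behind the power savings claimed for spiking hardware: computation (and energy use) happens only at spike events.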
While this looks like an exciting area to explore, researchers have had only limited success so far, because human brains operate very differently from the mainstream computer architectures on which today's algorithms were built. The critical challenge in neuromorphic research is matching a human's flexibility, and ability to learn from unstructured stimuli, with the human brain's energy efficiency.

"But brains operate differently, more like the interconnected graphs of airline routes, than any yes-no electronic circuit typically used in computing. So, a lot of brain-like algorithms struggle because current computers aren't designed to execute them," said Vineyard.

Application Areas

In terms of neural-inspired computing uses, we are still in the infancy stage. While Intel and Sandia's collaboration aims to explore AI in commercial and defence areas, there are not many use cases to list yet. However, the technology has been explored in areas such as self-driving cars, classifying vapours, picking out an individual's face from a set of random images, and more. Researchers are hopeful that neuromorphic computers will improve machine learning in more complex fields, such as remote sensing and intelligence analysis. It has also been explored for computational physics simulations and other numerical algorithms.

The Way Forward

Many companies and projects are leading applications in this space. For instance, as part of its Loihi project, Intel has created a Loihi chip with 130,000 neurons and 130 million synapses that excels at self-learning.
As the blogpost notes, "… because the hardware is optimised specifically for SNNs, it supports dramatically accelerated learning in unstructured environments for systems that require autonomous operation and continuous learning, with extremely low power consumption, plus high performance and capacity."

IBM's TrueNorth, a brain-inspired processor developed under the DARPA SyNAPSE programme, aims to revolutionise brain-inspired computing: it packs one million neurons yet consumes merely 70 milliwatts, performing 46 billion synaptic operations per second per watt. Other companies, such as HPE, Qualcomm, and Samsung Electronics, among others, are also exploring neuromorphic computing. In fact, according to one study, the global market for neuromorphic chips, estimated at $2.3 billion in 2020, is projected to reach a revised size of $10.4 billion by 2027. These numbers suggest that neuromorphic computers are the way ahead in AI-based research and development.