Brain-Inspired Cognitive Architecture Is Now Solving Computational Challenges Faced By AI

With the development of artificial intelligence intensifying across the globe, IT companies are looking for ways to revamp their architecture to make it more robust. Increasingly, researchers are turning to brain-inspired architecture with co-located memory and processing, resulting in systems that, on some tasks, run 200 times faster than conventional computers. Such is the excitement around AI hardware that this phase has been dubbed a “renaissance of hardware”, with vendors rushing to build domain-specific or workload-specific architectures that can significantly scale and improve computational efficiency.

And as we nudge forward in the mobile era, workloads are going to look very different because the requirements of computing are changing. Businesses will have to rely on different architectures, each meant for a particular workload. This is why vendors are shifting away from the Von Neumann computing architecture, after years of trying to improve computing performance with multi-core CPU architectures.


Now, rapid gains in neuroscience have also spurred researchers across the globe to propose brain-inspired computing architectures for developing highly advanced cognitive systems. IBM researchers are working on a new computer architecture that can process data efficiently for AI workloads. What is remarkable is that this new architecture is inspired by the brain and features co-located processing and memory units.



The IBM team notes that traditional computers are built on the Von Neumann architecture, developed in the 1940s, which features a central processor that executes logic and arithmetic, a memory unit, storage, and input and output devices. Current industry requirements, however, have necessitated a move from homogeneous to heterogeneous computing architectures, which has spurred increased research in applied materials and neuroscience.

Why Does AI Workload Require New Computing Architecture?

So what kind of changes does the AI workload require in computing architecture? According to researchers, sustained breakthroughs in materials science and neuroscience are needed to advance AI processing. Technologists emphasise that multi-core CPUs have reached their performance and efficiency limits and are adding to the architectural challenges. Let’s look at the two key requirements of AI workloads that have aroused interest in brain-inspired computer architecture.

  1. Memory requirement: AI workloads depend on huge amounts of data, so they require faster access to memory. Traditional CPUs have a multi-level cache architecture that is not well suited to AI.
  2. Parallel processing requirement: Parallel computing is on the rise, and architectures have to be designed to execute AI workloads with parallelism at scale.
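The parallelism requirement above comes from the structure of the workload itself: the core operations of neural networks, such as matrix-vector products, decompose into many independent computations. A minimal, illustrative sketch (not any vendor's implementation) of that decomposition:

```python
# Each output element of a matrix-vector product depends only on one row of
# the matrix, so all rows can be computed in parallel -- the property that
# AI-oriented architectures are built to exploit at scale.
from concurrent.futures import ThreadPoolExecutor

def dot(row, vec):
    # One independent output element.
    return sum(r * v for r, v in zip(row, vec))

def matvec_parallel(matrix, vec, workers=4):
    # Dispatch each row's dot product to a worker thread.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda row: dot(row, vec), matrix))

matrix = [[1, 2], [3, 4], [5, 6]]
vec = [10, 1]
print(matvec_parallel(matrix, vec))  # [12, 34, 56]
```

On real AI hardware this parallelism is realised by thousands of execution units rather than a thread pool, but the independence of the per-row work is the same.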

Brain-Inspired Computing Architecture

There has been a rise in the development of cognitive architectures, which researchers assert emulate the brain through architecture modelling, cognitive architecture design, cognitive architecture fostering (by making systems learn in a given environment), and application to products. IBM’s new paper discusses three layers of inspiration from the human brain.

First, the team took inspiration from the brain’s co-location of memory and processing, using memory devices to perform computational tasks in the memory itself. Second, inspired by the brain’s synaptic network structures, they developed arrays of phase-change memory (PCM) devices to speed up the training of deep neural networks. Finally, the researchers drew on the stochastic nature of neurons and synapses to develop a powerful computational substrate for spiking neural networks.
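The first two ideas can be pictured with a toy model of a PCM crossbar (a hedged sketch of the general in-memory computing principle, not IBM's hardware): each cell stores a weight as a conductance, and applying voltages to the columns produces row currents that equal a matrix-vector product by Ohm's and Kirchhoff's laws, so the computation happens where the data lives.

```python
# Toy crossbar model: G[i][j] is the conductance of the PCM cell at row i,
# column j; V[j] is the voltage applied to column j. The current summed on
# row wire i is I[i] = sum_j G[i][j] * V[j], i.e. the array computes a
# matrix-vector product in place, with no data shuttled between a separate
# memory and processor.

def crossbar_matvec(conductances, voltages):
    """Current read out on each row wire of the crossbar."""
    return [sum(g * v for g, v in zip(row, voltages))
            for row in conductances]

G = [[0.5, 1.0],
     [2.0, 0.0]]
V = [1.0, 2.0]
print(crossbar_matvec(G, V))  # [2.5, 2.0]
```

In real devices the conductances are analog and noisy, which is why the approach pairs naturally with the stochastic, error-tolerant style of computation the third layer of inspiration describes.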

Speaking about testing the systems, Abu Sebastian from IBM said, “These systems are expected to be better than conventional computing systems in some tasks, and they also surpass traditional systems in terms of efficiency.” In an experiment where the researchers ran an unsupervised machine learning algorithm on the computational memory platform, they found the brain-inspired memory platform to be 200x faster than conventional computing systems.


Given the pace at which knowledge of neuroscience is increasing, thanks to frenetic research, there have been attempts to correlate AI with the brain’s cognitive architecture. Researchers note that the construction of AI systems rests on the hypothesis that it is possible to build a general-purpose intelligent machine that can replicate human-level intelligence.

A key aspect of the brain actively researched by neuroscientists is how deep learning appears, in a way, to replicate the cerebral neocortex, which plays an important role in cognition. There is also research on the connectome, which forms the cognitive architecture of the brain and has helped drive several breakthroughs in neuroscience. Experts note that the neuron model now widely used in artificial neural networks supports many functions despite its simple internal structure.
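That simplicity is easy to see in code. A minimal sketch of the standard artificial neuron (weighted sum plus a nonlinearity; the specific inputs, weights, and sigmoid activation here are illustrative choices, not from the IBM work):

```python
import math

def neuron(inputs, weights, bias):
    # Weighted sum of inputs plus bias, squashed by a sigmoid activation
    # into the range (0, 1).
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-z))

# Two inputs, two weights, one bias: the entire internal structure.
print(round(neuron([1.0, 0.5], [0.4, -0.2], 0.1), 3))  # 0.599
```

Networks of such units, stacked in layers, are what give deep learning its expressive power despite the simplicity of each individual unit.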


Richa Bhatia
Richa Bhatia is a seasoned journalist with six years’ experience in reportage and news coverage, with stints at the Times of India and The Indian Express. She is an avid reader, mum to a feisty two-year-old and loves writing about the next-gen technology that is shaping our world.

