
Silicon Brains: Designing Self Organising Neural Networks


A healthy child’s developing brain is capable of adding nearly 250,000 neurons every minute! At birth, a brain already has almost all the neurons it will ever have, yet it continues to grow for a few years afterwards, reaching most of its adult size by the age of two.

This continued growth is thanks to the glial cells. Glia continue to divide and multiply, carrying out many important functions, including insulating nerve cells with myelin.

The phenomenon that is intelligence emerges across many species in various ecosystems and is linked to the nervous system.

Though we have figured out which part of our body is responsible for intelligence, the ‘HOW’ part is still ambiguous. And since the world today is driven by artificial logical machines that try to mimic human intelligence, it is quite necessary to crack the enigma that is intelligence.

In an attempt to investigate if we could replicate this kind of self-organised growth in algorithms, researchers from Caltech propose a biologically inspired developmental algorithm that can ‘grow’ a functional, layered neural network from a single initial cell. 

The objective here is a neural network that grows autonomously from a single computational “cell” and then self-organizes its architecture.

Overview Of The Model

The process begins with the growth of a layered neural network followed by self-organization of its inner connections to form defined ‘pools’ or receptive fields.

The algorithm learns to organize inter-layer connections so as to construct a convolutional pooling layer.

The act of self-organization typically consists of the following two key elements:

  • A spatiotemporal wave generator in the first layer and 
  • A local learning rule in the second layer to learn the “underlying” pattern of activity generated in the first layer.

In the illustration in their paper, the authors show how nodes form over time and training. Red nodes indicate active (firing) nodes, black nodes are silent, and the arrows denote the direction of time.

In the learning rule, the processing units are modelled as rectified linear units (ReLU), and the inter-layer weights are tuned with a modified Hebbian rule. The Hebbian learning rule specifies that the weight of the connection between two units should be increased or decreased in proportion to the product of their activations. The individual ReLU units then compete with one another in a winner-take-all fashion.
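As a rough sketch of this idea (not the paper’s exact update), a modified Hebbian rule with winner-take-all competition can be written as follows; the function name `hebbian_wta_step`, the learning rate, and the Oja-style decay term are illustrative assumptions:

```python
import numpy as np

def hebbian_wta_step(x, W, lr=0.1):
    """One step of a modified Hebbian rule with winner-take-all competition.

    x is the (non-negative) activity of the first layer; W holds the
    inter-layer weights, one row per second-layer ReLU unit."""
    y = np.maximum(0.0, W @ x)        # ReLU responses of the second layer
    winner = int(np.argmax(y))        # winner-take-all: only the most active unit learns
    # Hebbian update: change proportional to pre- and post-synaptic activity,
    # with a decay term that keeps the weights bounded (Oja-style).
    W[winner] += lr * y[winner] * (x - W[winner])
    return W, winner

# repeatedly present a localized "wave" of input activity
rng = np.random.default_rng(0)
W = rng.random((4, 9)) * 0.1
x = np.zeros(9)
x[2:5] = 1.0                          # nodes 2-4 firing, the rest silent
for _ in range(50):
    W, winner = hebbian_wta_step(x, W)
```

After repeated presentations, the winning unit’s weights concentrate on the co-active inputs and decay elsewhere, which is how a ‘pool’ over those inputs emerges.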

Initially, every processing unit in the second layer is connected to all input-sensor nodes in the first layer. 

When the spontaneous spatiotemporal wave generator is coupled with the local learning rule, an initially fully connected two-layer network becomes a pooling architecture.

Here the spatiotemporal wave generator is a dynamical system modelled by a set of coupled mathematical expressions detailed in the paper.
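The paper’s equations are not reproduced here; as a minimal illustrative stand-in, the wave generator can be pictured as a localized bump of firing activity sweeping across the first layer:

```python
import numpy as np

def spontaneous_wave(n_nodes=20, width=3):
    """Toy stand-in for a spatiotemporal wave generator: a localized bump
    of firing activity (1 = firing, 0 = silent) sweeps across the layer,
    yielding one binary activity vector per time step."""
    for t in range(n_nodes):
        a = np.zeros(n_nodes)
        a[t : min(t + width, n_nodes)] = 1.0   # only a few neighbouring nodes fire
        yield a

frames = list(spontaneous_wave())
```

The key property such waves provide is spatially correlated activity: neighbouring nodes fire together, which is exactly the statistical structure a local Hebbian rule can pick up.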

The process of growing a layered neural network involves two major sub-processes:

  1. Every ‘node’ can divide horizontally to produce nodes that populate the same layer; 
  2. Every node can divide vertically to produce processing units that migrate upwards to populate higher layers.
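These two division rules can be sketched in a toy simulation; the function name `grow_network`, the `p_vertical` probability, and the scheduling are assumptions for illustration, not the paper’s implementation:

```python
import random

def grow_network(target_width, n_layers, p_vertical=0.3, seed=0):
    """Toy sketch of growth by cell division, starting from a single cell:
    a cell divides horizontally (adding a node to its own layer) or
    vertically (a daughter migrates up to populate the next layer),
    until every layer reaches the target width."""
    rng = random.Random(seed)
    layers = [1] + [0] * (n_layers - 1)       # a single initial cell in layer 0
    while any(w < target_width for w in layers):
        for i in range(n_layers):
            if layers[i] == 0:
                continue                       # empty layers have no cells to divide
            if i + 1 < n_layers and layers[i + 1] < target_width and rng.random() < p_vertical:
                layers[i + 1] += 1             # vertical division: populate the layer above
            elif layers[i] < target_width:
                layers[i] += 1                 # horizontal division: widen this layer
    return layers

layers = grow_network(target_width=8, n_layers=3)
```

Note how upper layers can only be seeded by vertical divisions from below, so the layered structure emerges from purely local decisions.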

To test the functionality of these networks, the two-layer network is coupled with a linear classifier that is trained to classify handwritten digits from MNIST on the basis of the learned representation.
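This evaluation pipeline can be sketched on toy data; the synthetic inputs stand in for MNIST, and the fixed pooling structure and least-squares linear classifier are illustrative assumptions:

```python
import numpy as np

def pool_representation(x, pools):
    """Second-layer representation: each 'pool' (the connectivity the
    algorithm grows) averages the first-layer inputs it connects to."""
    return np.array([x[idx].mean() for idx in pools])

# hypothetical toy setup standing in for MNIST: 16 input sensors, 4 pools of 4
pools = [np.arange(i, i + 4) for i in range(0, 16, 4)]
rng = np.random.default_rng(1)
X = rng.random((200, 16))
# toy labels: is the first half of the sensors more active than the second?
y = (X[:, :8].mean(axis=1) > X[:, 8:].mean(axis=1)).astype(int)

# linear classifier (least squares with a bias column) on the pooled features
Z = np.array([pool_representation(x, pools) for x in X])
Zb = np.hstack([Z, np.ones((len(Z), 1))])
w, *_ = np.linalg.lstsq(Zb, 2.0 * y - 1.0, rcond=None)
acc = ((Zb @ w > 0).astype(int) == y).mean()
```

Because the toy labels depend only on pooled averages, a linear readout on top of the pooled representation classifies them well, mirroring the role the linear classifier plays on top of the grown network.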

The results show that the self-organized networks classify with a 90% test accuracy, perform statistically on par with hand-crafted pooling networks, and are statistically better than random networks.

Key Takeaways

  • The algorithm adapts to a wide range of input-layer geometries, is robust to malfunctioning units, and can successfully grow and self-organize pooling architectures of different pool sizes and shapes. 
  • It also provides a procedure for constructing layered neural networks through self-organization. 
  • Broadly, this work shows that biologically inspired developmental algorithms can be applied to autonomously grow functional ‘brains’ in silico. 

The authors state that their algorithm is robust and capable of countering the key challenges that accompany growth.

They also demonstrate that the ‘grown’ networks are functionally similar to hand-programmed pooling networks on conventional image classification tasks. And since CNNs represent a model class of deep networks, the authors believe this technique can be broadly applied to the self-organisation of intelligent systems.

Know more about this work here.


Ram Sagar

I have a master's degree in Robotics and I write about machine learning advancements.
