Are neural networks the greatest algorithm of all time?

Any conversation related to artificial intelligence will drum up the term ‘neural networks’. Pegged as one of the most disruptive technologies of all time, artificial neural networks, fashioned after the human brain, allow computers to learn from reams of data. One of the earliest pioneers of neural networks, Dr. Robert Hecht-Nielsen, defined a neural network as “a computing system made up of a number of simple, highly interconnected processing elements, which process information by their dynamic state response to external inputs.”

Yann LeCun, head of Facebook’s AI lab with Mark Zuckerberg

Analytics India Magazine delves into what’s behind the popularity of neural networks and why Artificial Neural Networks (ANNs), which can derive meaning from imprecise data, are used to extract patterns and detect trends. Facebook’s Yann LeCun, a pioneer of the convolutional neural network and a defining authority on the field, says neural networks at Facebook help translate text between languages and identify faces and objects in photos.

AI has become Google’s playground, and the search giant is known for plowing massive investment into neural networks. Last year, the Mountain View company launched a series of experiments, titled Google AI Experiments, to help people understand the nuances of neural networks first hand. Case in point: Handwriting with a Neural Net, which lets users play with a neural net that generates handwriting based on their style. TensorFlow, Google’s neural network framework, is designed to distribute networks across multiple machines and GPUs.

Image source: NVIDIA

Chipmaker NVIDIA, which has now put all its might behind autonomous vehicles, is also leading the pack when it comes to deep learning neural networks. The Santa Clara company is a leader in graphics visualization technology and has put it to use for object recognition via deep learning in driverless cars. Not just cars: researchers have deployed NVIDIA’s GPUs to train a deep convolutional neural network to recognize 48 species in 3.2 million images taken from camera traps in a national park.

Some of the most popular neural network algorithms

Neural networks are one of the learning algorithms used within machine learning. They consist of different layers for analyzing and learning from data. Neural networks learn by attributing weights to the connections between the different neurons each time the network processes data. This means the next time the network comes across a particular picture, it will have learned that a particular section of the picture is probably associated with an entity.
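The weight-adjustment idea above can be seen in miniature with a single artificial neuron. This is a hypothetical toy sketch (not from the article): one neuron learns the logical AND function by nudging its connection weights after each pass over the data.

```python
import numpy as np

# Toy example: a single neuron learns logical AND by adjusting the
# weights on its input connections after every training example.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1], dtype=float)

w = rng.normal(size=2)   # connection weights, randomly initialized
b = 0.0                  # bias term
lr = 0.1                 # learning rate

for _ in range(100):                       # repeated passes over the data
    for xi, target in zip(X, y):
        pred = 1.0 if xi @ w + b > 0 else 0.0
        err = target - pred
        w += lr * err * xi                 # strengthen or weaken connections
        b += lr * err

print([int(xi @ w + b > 0) for xi in X])   # → [0, 0, 0, 1]
```

After training, the learned weights reproduce the AND pattern; the same strengthen-or-weaken principle, scaled up across many layers and millions of connections, is what lets a network associate parts of a picture with an entity.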

We list some widely used algorithms:

Backpropagation algorithm: First introduced in the 1970s, this algorithm today forms the backbone of neural networks. It draws on gradient descent, a mathematical technique used while training a computer. Yann LeCun explains the popular algorithm in this video. According to LeCun, a computer performs several layers of processing to identify objects in an image. Backpropagation makes it possible to know how each layer of processing affects the outcome.
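The layer-by-layer idea LeCun describes can be sketched in a few lines. This is an illustrative example (not LeCun's exact formulation): a tiny two-layer network trained on XOR, where the backward pass applies the chain rule so each layer learns how its weights affected the final error.

```python
import numpy as np

# Minimal backpropagation sketch: 2 inputs -> 8 hidden units -> 1 output,
# trained by gradient descent to learn XOR.
rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)
sigmoid = lambda z: 1 / (1 + np.exp(-z))
lr = 1.0

for _ in range(5000):
    # forward pass: each layer processes the previous layer's output
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # backward pass: propagate the error gradient layer by layer
    d_out = out - y                        # cross-entropy gradient at output
    d_h = (d_out @ W2.T) * h * (1 - h)     # chain rule through hidden layer
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

print(out.round().ravel())                 # trained predictions for XOR
```

The `d_h` line is the key step: it tells the hidden layer how much each of its weights contributed to the output error, which is exactly the "how each layer affects the outcome" knowledge backpropagation provides.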

Boltzmann machine algorithms: This algorithm, associated with Geoffrey Hinton, is used for collaborative filtering, regression, classification, feature learning and topic modelling. It is mainly used for solving two different types of problems: search problems and learning problems. For learning problems, Boltzmann machines make many small updates to their weights, and each update requires solving many different search problems. For search problems, the weights on the connections are fixed and are used to represent the cost function of an optimization problem.
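The "many small weight updates" of the learning problem can be sketched with the restricted Boltzmann machine, the tractable variant Hinton popularized, trained here with one step of contrastive divergence. This is a simplified sketch (biases omitted), not a full implementation.

```python
import numpy as np

# Restricted Boltzmann machine sketch: each small weight update nudges
# the model toward reconstructing the training patterns more faithfully.
rng = np.random.default_rng(2)
data = np.array([[1, 1, 0, 0], [0, 0, 1, 1]] * 10, dtype=float)  # two patterns

n_vis, n_hid = 4, 3
W = rng.normal(scale=0.1, size=(n_vis, n_hid))
sigmoid = lambda z: 1 / (1 + np.exp(-z))
lr = 0.1

def recon_error(v):
    """Mean squared error of a one-step reconstruction."""
    h = sigmoid(v @ W)
    return ((v - sigmoid(h @ W.T)) ** 2).mean()

before = recon_error(data)
for _ in range(500):
    v0 = data
    h0 = sigmoid(v0 @ W)                         # hidden activation probabilities
    h_sample = (h0 > rng.random(h0.shape)).astype(float)
    v1 = sigmoid(h_sample @ W.T)                 # one reconstruction step
    h1 = sigmoid(v1 @ W)
    W += lr * (v0.T @ h0 - v1.T @ h1) / len(data)  # small contrastive update
after = recon_error(data)

print(after < before)   # reconstruction error drops as weights are learned
```

Each update compares statistics of the data against statistics of the model's own reconstructions, which is what makes every learning step itself a small inference (search) problem.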

Hopfield network: The Hopfield model, popularized by John Hopfield, is inspired by the associative memory properties of the human brain. According to the UCLA website, the main purpose of the Hopfield network is to store one or more patterns and to recall the full patterns based on partial input. All the nodes inside the network function as both inputs and outputs, and they are fully interconnected; that is, each node is an input to every other node in the network.
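The store-and-recall behaviour described above can be shown in a few lines. This is a minimal sketch: one pattern is stored with the Hebbian outer-product rule, and the fully interconnected network then recovers it from a corrupted partial input.

```python
import numpy as np

# Hopfield recall sketch: store one +1/-1 pattern, then recover it
# from a partially corrupted cue.
pattern = np.array([1, -1, 1, -1, 1, -1, 1, -1])

W = np.outer(pattern, pattern).astype(float)  # Hebbian weight matrix
np.fill_diagonal(W, 0)                        # no self-connections

cue = pattern.copy()
cue[:3] = -cue[:3]                            # corrupt 3 of the 8 units

state = cue.astype(float)
for _ in range(5):                            # update until the state settles
    state = np.sign(W @ state)

print(np.array_equal(state, pattern))         # → True: full pattern recalled
```

Because every node feeds every other node, the uncorrupted majority of the cue pulls the corrupted units back toward the stored memory, which is exactly the associative-recall property the model borrows from the brain.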

Kohonen networks and other self-organizing maps (SOM/SOFM): Unlike other learning techniques in neural networks, training a self-organizing map requires no target vector; an SOM learns to classify the training data without any external supervision. Its developer, Teuvo Kohonen, describes it thus: “The SOM is a new, effective software tool for the visualization of high-dimensional data. It converts complex, nonlinear statistical relationships between high-dimensional data items into simple geometric relationships on a low-dimensional display. As it thereby compresses information while preserving the most important topological and metric relationships of the primary data items on the display, it may also be thought to produce some kind of abstractions.”
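Kohonen's description can be made concrete with a small sketch (an illustrative simplification, not Kohonen's original software): a one-dimensional map of ten units organizes itself over two-dimensional data with no target vector, and nearby map units end up holding similar codebook vectors.

```python
import numpy as np

# 1-D self-organizing map sketch: unsupervised training pulls the
# winning unit and its map neighbours toward each input sample.
rng = np.random.default_rng(3)
data = rng.random((200, 2))                    # unlabelled 2-D inputs

n_units = 10
weights = rng.random((n_units, 2))             # codebook vectors
positions = np.arange(n_units)                 # unit positions on the 1-D map

steps = 1000
for t in range(steps):
    x = data[rng.integers(len(data))]
    bmu = np.argmin(((weights - x) ** 2).sum(axis=1))   # best-matching unit
    lr = 0.5 * (1 - t / steps)                          # decaying learning rate
    radius = max(1.0, (n_units / 2) * (1 - t / steps))  # shrinking neighbourhood
    influence = np.exp(-((positions - bmu) ** 2) / (2 * radius ** 2))
    weights += lr * influence[:, None] * (x - weights)  # pull neighbours too

# topology preserved: adjacent map units are closer in data space than distant ones
adjacent = np.linalg.norm(np.diff(weights, axis=0), axis=1).mean()
distant = np.linalg.norm(weights[5:] - weights[:-5], axis=1).mean()
print(adjacent < distant)
```

The shrinking neighbourhood is what converts the high-dimensional statistical structure into "simple geometric relationships on a low-dimensional display": units that are neighbours on the map are forced to represent similar inputs.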

Neural network applications

A rendering of Google’s Deep Dream generator

Some of the most popular use cases can be seen in object recognition, speech-to-text, named entity recognition and now even in films, and these applications have been rolled out to the masses.

Google’s Deep Dream platform uses powerful AI algorithms to transform images: This computer vision program created by Google uses a convolutional neural network to find and enhance patterns in images via algorithmic pareidolia; the result is dream-like, psychedelic images. According to Google, the software was invented to help scientists and engineers discover what a deep neural network is seeing when it looks at a given image.

DeepText at Facebook: This deep learning based text engine can decipher the meaning of several thousand posts per second, across more than 20 languages. According to Facebook, DeepText leverages several deep neural network architectures, including convolutional and recurrent neural nets, and can carry out word-level and character-level based learning. This engine draws on ideas earlier developed and presented in a paper by Ronan Collobert and Yann LeCun of Facebook AI Research.

Richa Bhatia
Richa Bhatia is a seasoned journalist with six years of experience in reportage and news coverage and has had stints at Times of India and The Indian Express. She is an avid reader, mum to a feisty two-year-old and loves writing about the next-gen technology that is shaping our world.
