Can We Use Batch Normalisation For NLP?

Batch normalisation (batch norm) is one of the most widely used deep learning techniques to stabilise and accelerate training in deep neural networks. This technique helps decrease the number of parameter updates required to achieve low training error. This reduction…
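
As a rough sketch (not taken from the article itself), the core of batch norm is to normalise each feature using the mean and variance computed over the mini-batch, then apply a learnable scale and shift. The NumPy example below assumes 2-D activations of shape (batch, features) and hypothetical `gamma` and `beta` parameters:

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Normalise each feature over the mini-batch, then scale and shift."""
    mean = x.mean(axis=0)                     # per-feature mean over the batch
    var = x.var(axis=0)                       # per-feature variance over the batch
    x_hat = (x - mean) / np.sqrt(var + eps)   # zero mean, unit variance per feature
    return gamma * x_hat + beta               # learnable scale (gamma) and shift (beta)

# Example: a mini-batch of 32 samples with 4 features
x = np.random.randn(32, 4)
out = batch_norm(x, gamma=np.ones(4), beta=np.zeros(4))
```

Because the statistics are computed across the batch dimension, the behaviour depends on batch size and composition, which is part of why its suitability for NLP workloads is debated.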

Is Google’s Claim To Patent Batch Normalization A Step Towards Monopolizing Algorithms? 

The year 2018 saw a meteoric rise in the number of papers released in the field of AI. Numerous tools and techniques were also open-sourced by tech giants to carry the baton of AI research forward. Google’s…

Understanding Normalisation Methods In Deep Learning

Deep learning models are achieving state-of-the-art results on a number of complex tasks, including speech recognition, computer vision and machine translation. However, training deep learning models such as deep neural networks is a complex task as, during the training…

7 Deep Learning Methods Every AI Enthusiast Must Know

Deep learning has seeped into almost every organisation and its day-to-day activities, from the health sector to the music industry. The market for this subset of machine learning is expected to reach $28.83 billion and expand at a CAGR of 48.4%…
