# What Are Normalising Flows And Why Should We Care

Machine learning researchers and developers are in constant pursuit of well-defined probabilistic models that correctly describe the processes producing data. A central need across machine learning is to develop the tools and theory for better-specified models that lead to deeper insights into data.

One such attempt has been made by Danilo Rezende in the form of normalising flows. Today, building probability distributions as normalising flows is an active area of ML research.

Normalizing flows operate by pushing an initial density through a series of transformations to produce a richer, more multimodal distribution — like a fluid flowing through a set of tubes. Flows can be used for joint generative and predictive modelling by using them as the core component of a hybrid model.

### Significance Of Normalising Flows

Normalizing flows provide a general way of constructing flexible probability distributions over continuous random variables.

Let x be a D-dimensional real vector, and suppose we would like to define a joint distribution over x. The main idea of flow-based modelling is to express x as a transformation T of a real vector u sampled from a base distribution; the transformation and the base distribution together define the flow-based model.

According to the Google Brain team, the key idea behind normalising flows can be summarised as follows:

• Take some distribution X whose density log p(x) we can compute easily.
• Learn some invertible function f(x), so that sampling is y = f(x).
• Learn its inverse f⁻¹(y) to transform points in Y back to the domain of X.
• Evaluate the density as log p(y) = log p(x) + log |det J(f⁻¹)(y)|, which can be optimised via stochastic gradient descent methods.
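The steps above can be sketched for the simplest possible flow: a one-dimensional affine transformation of a standard normal base. The parameters `a` and `b` here are illustrative (in practice they would be learned), but the change-of-variables arithmetic is exactly the formula in the last bullet:

```python
import numpy as np

# Element-wise affine flow y = f(x) = a * x + b over a standard normal base.
# Change of variables:
#   log p(y) = log p_base(f^{-1}(y)) + log |det J(f^{-1})(y)|
# For this flow, f^{-1}(y) = (y - b) / a and |det J(f^{-1})| = 1 / |a|.

a, b = 2.0, 1.0  # flow parameters (would be learned in practice)

def base_log_prob(x):
    # log density of a standard normal
    return -0.5 * (x ** 2 + np.log(2 * np.pi))

def flow_log_prob(y):
    x = (y - b) / a               # inverse transform
    log_det = -np.log(np.abs(a))  # log |det J(f^{-1})(y)|
    return base_log_prob(x) + log_det

# Sampling: push base samples forward through f
rng = np.random.default_rng(0)
samples = a * rng.standard_normal(10_000) + b

# Sanity check: the flow density must match the analytic N(b, a^2) density
y = 1.5
analytic = -0.5 * (((y - b) / a) ** 2 + np.log(2 * np.pi * a ** 2))
```

Here `flow_log_prob(1.5)` and `analytic` agree, because an affine push-forward of a Gaussian is again a Gaussian; with a learned, nonlinear f the same two-term formula still applies.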

The flow can be thought of as an architecture whose last layer is a (generalised) linear model operating on the features, and the distribution of those features can be viewed as a regulariser on the feature space. In turn, flows are effective in any application requiring a probabilistic model that can evaluate densities or draw samples.

Normalizing flows, due to their ability to be expressive while still allowing for exact likelihood calculations, are often used for probabilistic modelling of data. They have two primitive operations: density calculation and sampling.

For example, invertible ResNets have been explored for classification with residual flows and have seen a first big improvement. One significant benefit is a reduced memory footprint, since invertibility obviates the need to store activations for backpropagation.

This achievement may help one understand to what degree discarding information is crucial to deep learning’s success.

Normalizing flows allow us to control the complexity of the posterior at run-time by simply increasing the flow length of the sequence.
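A minimal sketch of this idea, using hypothetical affine layers: lengthening the flow just means composing more invertible layers, and each layer contributes its own log-determinant term to the density correction:

```python
import numpy as np

# Composing simple invertible layers to increase "flow length".
# Each layer returns its output and the log |det Jacobian| of its
# forward map; the total correction is the sum over layers.

class AffineLayer:
    def __init__(self, a, b):
        self.a, self.b = a, b

    def forward(self, x):
        # y = a * x + b; forward log-det is log |a|
        return self.a * x + self.b, np.log(np.abs(self.a))

def push_through(layers, x):
    total_log_det = 0.0
    for layer in layers:
        x, log_det = layer.forward(x)
        total_log_det += log_det
    return x, total_log_det

# A length-3 flow; appending layers at run-time makes the pushed-forward
# density richer without changing the bookkeeping.
flow = [AffineLayer(2.0, 0.5), AffineLayer(0.5, -1.0), AffineLayer(3.0, 0.0)]
y, log_det = push_through(flow, 1.0)
```

With nonlinear layers (planar, coupling, autoregressive) the loop is identical; only each layer's `forward` and log-det change.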

Rippel and Adams (2013) were the first to recognise that parameterizing flows with deep neural networks could result in quite general and expressive distribution classes.

As with deep neural networks, normalizing the intermediate representations is crucial for maintaining stable gradients throughout the flow.

Normalizing flows can also be integrated into traditional Markov chain Monte Carlo (MCMC) sampling by using the flow to reparameterize the target distribution. Since the efficiency of Monte Carlo methods depends strongly on the target distribution, reparameterizing it with a normalizing flow can make the target much easier to explore.

Normalizing flows can be thought of as implementing a ‘generalised reparameterization trick’, as they leverage a transformation of a fixed distribution to draw samples from a distribution of interest.
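A hedged sketch of this view, with an assumed Gaussian target N(mu, sigma²): samples from a fixed standard-normal base are pushed through a deterministic transform so that they follow the distribution of interest, and because the transform is deterministic in its parameters, gradients of expectations can flow through it:

```python
import numpy as np

# The reparameterization trick as a one-layer "flow": draw from a fixed
# base distribution, then transform the samples deterministically.

rng = np.random.default_rng(42)
mu, sigma = 3.0, 0.5

eps = rng.standard_normal(100_000)  # fixed base: standard normal
z = mu + sigma * eps                # transformed samples follow N(mu, sigma^2)

# Because z is a deterministic function of (mu, sigma) and base noise,
# expectations under the target can be differentiated with respect to
# (mu, sigma) -- the property flows generalise to richer transforms.
```

A general flow replaces the affine map `mu + sigma * eps` with a learned invertible transformation, which is why flows are described as a generalised reparameterization trick.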

For instance, generative modelling has been a popular application of flows in machine learning. Here are some other examples:

• Image generation has been given serious effort since the earliest work on flows. Dinh et al. (2017) increased the capacity of their model by including scale transformations (instead of just translations), being the first to demonstrate that flows could produce sharp, visually compelling full-colour images.
• In the case of text, the most direct way to apply normalizing flows to text data is to define a discrete flow over characters or a vocabulary.

### Future Direction

In a paper titled "Normalizing Flows for Probabilistic Modeling and Inference", researchers from DeepMind investigated the state of flow models in detail.

They have catalogued the kinds of flow models in use, their evolution, and their significance in domains like reinforcement learning, imitation learning, and image, audio, and text modelling, among many others.

The authors also speculate that many flow designs and specific implementations will inevitably become out-of-date as work on normalizing flows continues, but they have attempted to isolate the foundational ideas that will continue to guide the field well into the future.

The large-scale adoption of normalising flows in place of conventional probabilistic models is advantageous because, unlike other probabilistic models that require approximate inference as they scale, flows usually admit exact density calculations and exact sampling even in high dimensions.

However, the obstacles currently preventing wider application of normalizing flows are similar to those faced by any probabilistic model. With the pace at which research is accelerating, the team at DeepMind is optimistic about the future of flow models.
