
Why is AI pioneer Yoshua Bengio rooting for GFlowNets?

Developed in 2021, GFlowNets are a novel generative method for unnormalised probability distributions.

“I have rarely been as enthusiastic about a new research direction. We call them GFlowNets, for Generative Flow Networks,” wrote AI pioneer Yoshua Bengio in his latest blog. GFlowNets are based on flow networks, with the constraint that the flow coming into a state must match the flow going out of it. They also serve as a general framework for generative modelling of graphs and other discrete, composite objects.
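To make the flow-matching constraint concrete, here is a minimal toy sketch in Python. The graph, flow values and helper functions are illustrative inventions for this article, not from Bengio's paper:

```python
# Toy illustration of the flow-matching condition: at every
# intermediate state, total inflow must equal total outflow.
# The DAG and its flow values below are made up for illustration.

edge_flow = {
    ("s0", "a"): 3.0, ("s0", "b"): 2.0,   # source s0 emits total flow 5
    ("a", "x"): 3.0,                       # x and y are terminal states
    ("b", "x"): 1.0, ("b", "y"): 1.0,
}

def inflow(state):
    return sum(f for (u, v), f in edge_flow.items() if v == state)

def outflow(state):
    return sum(f for (u, v), f in edge_flow.items() if u == state)

# Intermediate states must satisfy inflow == outflow.
for s in ("a", "b"):
    assert abs(inflow(s) - outflow(s)) < 1e-9, s

# Flow into a terminal state plays the role of its reward:
# here R(x) = 4 and R(y) = 1, so x should be sampled 4x as often as y.
print(inflow("x"), inflow("y"))  # 4.0 1.0
```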

This concept lies at the intersection of reinforcement learning, deep generative models and energy-based probabilistic models. One of the major advantages of GFlowNets, according to Bengio, is that they make it easier to incorporate system 2 (slow, deliberate thinking) inductive biases, which in turn help with modelling causality and with out-of-distribution generalisation. Other applications include non-parametric Bayesian modelling, unsupervised/semi-supervised learning of abstract representations, and generative active learning.


GFlowNets: An alternative to MCMC

The objective behind a model like GFlowNets is to sample trajectories whose probability is proportional to a positive return or reward function, rather than to generate the single highest-reward sequence of actions. GFlowNets are well suited to modelling and sampling from distributions over sets and graphs, and they can be used to estimate free energies and marginal distributions. This makes them particularly useful for tasks where exploration is important.
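Once a valid flow is known, a sampling policy falls out of it: at each state, pick an outgoing edge with probability proportional to its flow, and terminal states are reached with probability proportional to their reward. A minimal, self-contained sketch, reusing the toy graph from above (all names and values are illustrative):

```python
import random

# Forward policy derived from the toy flows: P(s -> s') is the
# edge flow divided by the state's total outflow.
edge_flow = {
    ("s0", "a"): 3.0, ("s0", "b"): 2.0,
    ("a", "x"): 3.0,
    ("b", "x"): 1.0, ("b", "y"): 1.0,
}

def step(state):
    choices = [(v, f) for (u, v), f in edge_flow.items() if u == state]
    total = sum(f for _, f in choices)
    r = random.uniform(0, total)
    for v, f in choices:
        r -= f
        if r <= 0:
            return v
    return choices[-1][0]

def sample():
    s = "s0"
    while any(u == s for (u, _) in edge_flow):  # walk until terminal
        s = step(s)
    return s

counts = {"x": 0, "y": 0}
for _ in range(10000):
    counts[sample()] += 1
# Roughly 8000 x vs 2000 y, i.e. proportional to rewards 4 and 1.
print(counts)
```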


Reinforcement learning algorithms choose a single highest-reward sequence of actions in order to maximise the expected reward. For tasks like drug molecule synthesis, however, where exploration is important, the goal is to sample a diverse set of high-return solutions, and that is precisely what GFlowNets offer.

A flow network is a directed graph with a source, sinks and edges that carry the flow. The motivation for such a flow network is iterative black-box optimisation, where the agent has to compute the reward for a large batch of candidates in each round. A single source is defined, the sinks are treated as terminal states, and the task is then to compute a valid flow between the nodes.
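In practice the edge flows are the outputs of a neural network trained so that the flow-matching condition holds approximately at visited states, for instance by minimising a squared mismatch between each state's inflow and its outflow plus any terminal reward. Below is a hedged PyTorch-style sketch of such a loss; the function name, arguments and log-space formulation are assumptions for illustration, not the paper's exact objective:

```python
import torch

def flow_matching_loss(log_inflow_edges, log_outflow_edges, reward):
    """Squared flow-matching error at one state, computed in log space
    for numerical stability.

    log_inflow_edges:  log F(parent -> s) for each edge into the state
    log_outflow_edges: log F(s -> child) for each edge out of the state
    reward:            R(s) if the state can terminate there, else 0.0
    """
    inflow = torch.logsumexp(log_inflow_edges, dim=0)
    # A terminal reward counts as extra outflow into a sink.
    out_terms = torch.cat([log_outflow_edges,
                           torch.log(reward + 1e-8).reshape(1)])
    outflow = torch.logsumexp(out_terms, dim=0)
    return (inflow - outflow) ** 2

# Terminal state x from the toy graph: inflows 3 and 1, reward 4,
# no outgoing edges -- the flows already match, so the loss is ~0.
loss = flow_matching_loss(
    torch.log(torch.tensor([3.0, 1.0])),
    torch.empty(0),
    torch.tensor(4.0),
)
print(loss)
```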

Talking about the GFlowNets, Bengio wrote, “Interestingly, this makes it possible to generate a diverse set of samples without facing what I used to think was the intractable challenge of mixing between modes with MCMC methods. What is remarkable with this framework is that it tells us how to train a policy that will sample the constructed objects with the desired probability and how to estimate the corresponding normalising constants and conditional probabilities over any subset of abstract variables.”

Read the full paper here.

GFlowNets for Bayesian structure learning

Bayesian networks are a popular framework for a range of applications such as medical diagnosis, molecular biology and ecological modelling. The structure of a Bayesian network is represented as a directed acyclic graph (DAG), and the statistical dependencies between the variables of interest are typically encoded using domain expert knowledge. When the graph is unknown, however, the DAG structure can be learned from data alone, uncovering statistical relationships that may form the basis for novel scientific theories.

Recently, a group of researchers including Bengio proposed GFlowNets as an alternative to MCMC for approximating the posterior distribution over Bayesian network structures given a set of observations. Structure learning is treated as a sequential decision problem in which the graph is constructed one edge at a time, following learned transition probabilities. The researchers demonstrated that their approach, called DAG-GFlowNet, offers an alternative to MCMC or variational inference and provides an accurate approximation of the posterior over DAGs. The paper also introduced improvements over the original GFlowNet framework, including a novel flow-matching condition and corresponding loss function, a hierarchical probabilistic model for the forward transitions, and additional reinforcement learning-based tools.
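To illustrate the sequential view, here is a minimal Python sketch: start from a graph over the variables and add one directed edge per step, masking out any edge that would introduce a cycle. The helper functions are illustrative assumptions; in DAG-GFlowNet the transition probabilities themselves come from a learned network:

```python
import itertools

def would_create_cycle(edges, new_edge):
    """Check whether adding new_edge = (u, v) creates a directed
    cycle, via a reachability search from v back to u."""
    u, v = new_edge
    adj = {}
    for a, b in edges | {new_edge}:
        adj.setdefault(a, set()).add(b)
    stack, seen = [v], set()
    while stack:
        node = stack.pop()
        if node == u:
            return True
        if node in seen:
            continue
        seen.add(node)
        stack.extend(adj.get(node, ()))
    return False

def valid_actions(nodes, edges):
    """All edges that can still be added while keeping the graph a DAG."""
    return [
        (u, v) for u, v in itertools.permutations(nodes, 2)
        if (u, v) not in edges and not would_create_cycle(edges, (u, v))
    ]

# Example: after adding A->B and B->C, the edge C->A is masked out.
edges = {("A", "B"), ("B", "C")}
print(("C", "A") in valid_actions(["A", "B", "C"], edges))  # False
```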

The researchers predict that GFlowNets may eventually be adapted to work with alternative representations of statistical dependencies in Bayesian networks. They also aim to explore future applications in causal discovery, especially learning model structures with latent variables.

Read the full paper here.

