DeepMind Leverages Nash Equilibrium To Tackle Fundamental ML Problems

  • DeepMind has introduced an approach modeled on game theory to help solve fundamental machine learning problems.

The most common way to teach AI systems to perform tasks is to train them on examples, continuing until the system is thoroughly trained and mistakes are minimised. It is, however, a solitary endeavour.

Humans learn from interactions. Scientists have found the same applies to machines as well. AI research lab DeepMind has previously trained AI agents to play Capture the Flag and reach Grandmaster level at StarCraft II. Taking inspiration from these experiments, DeepMind has introduced an approach modeled on game theory to help solve fundamental machine learning problems.

Principal component analysis (PCA) is a dimensionality reduction technique that makes large data sets smaller without losing most of the original information. For this research, the DeepMind team reformulated PCA as a competitive multi-agent game called EigenGame.

Principal component analysis

PCA burst onto the scene in the early 1900s and has been a long-standing technique for processing high-dimensional data. Over the years, it has become a standard first step in the data processing pipeline, used to cluster and visualise data and to obtain low-dimensional representations that are helpful for regression and classification tasks.
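For contrast with the game-theoretic view introduced below, here is a minimal sketch (not from the paper; the data and number of components are arbitrary choices) of conventional single-machine PCA via eigendecomposition of the covariance matrix, using NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))          # 500 samples, 10 features
Xc = X - X.mean(axis=0)                 # centre the data

# Eigendecomposition of the sample covariance matrix
cov = Xc.T @ Xc / (len(Xc) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)  # returned in ascending order
order = np.argsort(eigvals)[::-1]       # re-sort descending by variance
components = eigvecs[:, order[:2]]      # keep the top-2 principal components

Z = Xc @ components                     # project down to 2 dimensions
print(Z.shape)
```

This direct eigendecomposition is exactly the kind of computation that becomes a bottleneck as data sets grow, which motivates the reformulation discussed in this article.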

Even a century later, the technique remains relevant and has emerged as an important field of research for two main reasons:

  • As the amount of available data has grown, PCA has become a computational bottleneck. To improve how PCA scales, researchers have used randomised algorithms to fully harness deep learning-centric advances in computation. However, research into better optimisation continues.
  • Since PCA shares common solutions with several machine learning and engineering problems, it has become an important field of research for developing insights and algorithms that apply broadly across the branches of the ML tree.


DeepMind recently presented a new multi-agent perspective on PCA (traditionally a single-agent problem) that offers a way to scale PCA to massive data sets that were previously too computationally demanding. Presented at ICLR 2021, this approach was outlined in a paper titled “EigenGame: PCA as a Nash Equilibrium”.

In this approach, the DeepMind team used eigenvectors to design the game. Eigenvectors capture the critical variance in the data and are orthogonal to each other. In EigenGame, each player controls an eigenvector. To gain points, a player needs to explain variance within the data; on the flip side, a player is penalised if it aligns too closely with other players. So while Player 1 simply maximises its variance, every other player must also minimise its alignment with the players above it in the hierarchy. The combination of rewards and penalties defines a player’s utility.
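In the paper’s notation (recalled here, so worth checking against the original), each player i controls a unit vector v_i, and with data matrix X and M = XᵀX, the utility combines the variance reward with the alignment penalties:

```latex
u_i(v_i \mid v_{j<i}) \;=\;
\underbrace{v_i^\top M v_i}_{\text{variance captured}}
\;-\;
\sum_{j<i} \underbrace{\frac{(v_i^\top M v_j)^2}{v_j^\top M v_j}}_{\text{alignment penalty}}
```

Player 1 has no penalty term and simply maximises variance; each later player is additionally penalised for aligning with the players above it in the hierarchy.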

With properly designed variance and alignment terms determined in EigenGame, the researchers were able to show:

  • If all players play optimally, together they achieve the Nash equilibrium of the game, which is the PCA solution. Nash equilibrium is a solution concept in game theory, named after mathematician John Forbes Nash Jr.: each player is assumed to know the equilibrium strategies of the other players, and no player gains anything by changing only their own strategy.
  • The PCA solution can be arrived at if each player uses gradient ascent to maximise their utility independently and simultaneously.
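As a toy illustration of this second result (not DeepMind’s implementation: the matrix sizes, learning rate, and step count below are arbitrary choices, and the updates are applied sequentially rather than in the distributed, simultaneous fashion the paper describes), each player can run projected gradient ascent on its own utility with NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
M = X.T @ X / len(X)                       # 5x5 covariance-style matrix

k, lr, steps = 3, 0.1, 2000                # 3 players; illustrative hyperparameters
V = rng.normal(size=(5, k))
V /= np.linalg.norm(V, axis=0)             # every player lives on the unit sphere

for _ in range(steps):
    for i in range(k):                     # each player ascends its own utility
        v = V[:, i]
        grad = 2 * M @ v                   # gradient of the variance reward
        for j in range(i):                 # alignment penalties w.r.t. parents
            w = V[:, j]
            grad -= 2 * (v @ M @ w) / (w @ M @ w) * (M @ w)
        grad -= (grad @ v) * v             # project the gradient onto the sphere
        v = v + lr * grad
        V[:, i] = v / np.linalg.norm(v)    # retract back to the unit sphere

# Each player's captured variance should approximate one top eigenvalue of M
recovered = np.array([V[:, i] @ M @ V[:, i] for i in range(k)])
print(np.round(recovered, 3))
```

Because each player needs only its own gradient, the per-player updates can in principle run on separate devices, which is the independence property the next paragraph relies on.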

The independence property of the gradient ascent is significant because it allows computation to be distributed across several Google Cloud TPUs. This enables both data and model parallelism, making it possible for the algorithm to adapt to truly large-scale data. With EigenGame, researchers were able to find principal components for hundred-terabyte datasets containing ‘millions of features or billions of rows’ in a matter of hours.

Copyright Analytics India Magazine Pvt Ltd
