How DeepMind Powers Google Play Store Apps



DeepMind has been powering data centres, revolutionising medical research and even saving battery life on phones. And the one thing that binds all these applications is the use of machine learning to optimise outcomes.

Ever since Google acquired DeepMind, it has been applying DeepMind's innovations to power its own products. Today, apps on the Play Store are also being customised using DeepMind's research.

Play Store Recommendations Unlike Any Other

Image via Google Play Store

Netflix will suggest movies based on the user's watch history. If a user watches more action movies, the algorithm will be more inclined to suggest action movies. YouTube will recommend music based on the genre one listens to the most.

Achieving real-time personalised content is the primary goal of all the platforms that use recommender systems. 

However, similar strategies might not work for the Play Store, which also happens to be one of the largest deployers of recommender systems.

For instance, when a user installs, say, a travel booking app, the above strategies would recommend more travel apps. However, a better recommendation would be an app that translates foreign languages, since the user is travelling.

To bring this level of intelligence to the Play Store, DeepMind worked with developers from Google to develop solutions that best suit users' needs.

The DeepMind team considered the following three approaches:

  • Use LSTMs: a recurrent neural network that performs well in real-world scenarios, owing to a powerful update equation and backpropagation dynamics.
  • Replace the LSTM with a Transformer model: to address the computational and serving delays of LSTMs, a Transformer model was used.
  • Use an additive attention model: though the Transformer improved model performance, its high training cost led the researchers towards a more efficient solution.

They call this the candidate generator model.
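DeepMind has not published the exact architecture of the candidate generator, but the core idea of additive attention over a user's install history can be sketched as below. Everything here (embedding size, weight matrices, the stand-in catalog) is an invented placeholder, not the production model:

```python
import numpy as np

rng = np.random.default_rng(0)

def additive_attention_pool(history, w, v):
    """Pool a user's app-embedding history into one user vector with
    additive attention: score_i = v . tanh(W h_i), softmax-weighted sum."""
    scores = np.tanh(history @ w.T) @ v        # one score per installed app
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                   # softmax over the history
    return weights @ history                   # attention-weighted user vector

dim = 8
history = rng.normal(size=(5, dim))   # embeddings of 5 installed apps
w = rng.normal(size=(dim, dim))
v = rng.normal(size=dim)

user_vec = additive_attention_pool(history, w, v)

# Retrieval step: rank catalog apps by dot product with the user vector
# (a 1,000-app stand-in for the million-plus apps the real model scans).
catalog = rng.normal(size=(1000, dim))
top5 = np.argsort(catalog @ user_vec)[::-1][:5]
print(top5)
```

The appeal of additive attention here is that the history can be pooled in a single pass, avoiding the sequential recurrence of an LSTM and the quadratic self-attention cost of a full Transformer.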

Importance Weighting And Refined Re-ranking

Image via DeepMind blog

Today, Google Play’s recommendation system contains three main models: a candidate generator, a re-ranker, and a model to optimise for multiple objectives. The candidate generator is a deep retrieval model that can analyse more than a million apps and retrieve the most suitable ones. 

To address the bias associated with the candidate generator model, DeepMind introduced ‘importance weighting’ in their model.

Importance weighting considers the following two metrics:

  • Impression-to-install rate of each individual app 
  • Median impression-to-install rate across the Play Store.

Each app is compared against these two metrics, and an app with a below-median install rate gets an importance weight of less than one. In this way, an app is either up-weighted or down-weighted, reducing bias in the recommendations.
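The blog does not give the exact weighting formula; a minimal sketch, assuming the weight is simply each app's impression-to-install rate divided by the store-wide median (which makes below-median apps land below one, as described above), with made-up apps and rates:

```python
# Hypothetical per-app impression-to-install rates (installs / impressions).
install_rates = {
    "travel_app": 0.12,
    "translator": 0.08,
    "puzzle_game": 0.02,
    "flashlight": 0.01,
}

# Median rate across the (toy) store.
rates = sorted(install_rates.values())
mid = len(rates) // 2
median_rate = (rates[mid] if len(rates) % 2
               else (rates[mid - 1] + rates[mid]) / 2)

# Assumed weighting: app rate / store-wide median. Below-median apps get a
# weight < 1 (down-weighted); above-median apps get a weight > 1 (up-weighted).
importance_weights = {app: r / median_rate for app, r in install_rates.items()}

for app, w in importance_weights.items():
    print(f"{app}: {w:.2f}")
```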

With the problem of bias handled, the ranking has to be refined. This ranking decides what appears at the top of the screen without the user having to scroll.

In conventional recommendation systems, ranking is treated as a binary classification problem. This pointwise ranking ignores the association between apps, like the one discussed above: a travel booking app followed by a translator app.

To take the relation between apps into account and refine the ranking, DeepMind built a re-ranker model. Unlike the conventional pointwise model, a pairwise model is deployed. All of this happens online, in real time.
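As a rough illustration of the pointwise-versus-pairwise distinction, here is a toy pairwise ranker trained with a RankNet-style logistic loss. The feature vectors, pair construction, and learning rate are all invented for the example; the point is only that the model learns from *pairs* (installed app vs app shown alongside but skipped) rather than from isolated install/no-install labels:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def score(features, weights):
    # Linear pointwise score of one app's feature vector.
    return features @ weights

def pairwise_step(f_pos, f_neg, weights, lr=0.1):
    """One SGD step on the pairwise logistic loss
    -log sigmoid(score(installed) - score(skipped))."""
    p = sigmoid(score(f_pos, weights) - score(f_neg, weights))
    weights -= lr * (p - 1.0) * (f_pos - f_neg)
    return weights

rng = np.random.default_rng(1)
dim = 4
weights = np.zeros(dim)

# Toy impressions: (features of the installed app, features of the
# app shown in the same impression but skipped).
pairs = [(rng.normal(1.0, 0.1, dim), rng.normal(0.0, 0.1, dim))
         for _ in range(200)]

for _ in range(5):                 # a few passes of SGD
    for f_pos, f_neg in pairs:
        weights = pairwise_step(f_pos, f_neg, weights)

f_pos, f_neg = pairs[0]
print(score(f_pos, weights) > score(f_neg, weights))  # True: pair ordered correctly
```

A pointwise classifier would score each app in isolation; the pairwise loss directly optimises the relative order of apps shown together, which is what the on-screen ranking actually needs.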

The Hand That Keeps Giving

Apart from its success with AlphaGo and its protein-folding research, DeepMind's work has been put directly to use by Google.

Here is a list of DeepMind's top contributions to Google:

  • Reduced the electricity needed for cooling Google's data centres by up to 30%.
  • Boosted the value of Google's wind energy by roughly 20%.
  • Created on-device learning systems to optimise Android battery performance.
  • Developed WaveNet, which is used in Google Assistant and Google Cloud Platform.
  • Helped Waymo improve the efficiency of training its neural networks.

“Looking at the pace of progress, I think we will have AI in a form in which it benefits a lot of users in the coming years, but I still think it’s early days, and there’s a long-term investment for us,” said Sundar Pichai when he was asked about DeepMind back in 2016.

Despite the news of DeepMind's losses doing the rounds, the team has proven time and again that it has been a great addition to Google.


Ram Sagar
I have a master's degree in Robotics and I write about machine learning advancements.
