
How DeepMind Powers Google Play Store Apps


DeepMind has been powering data centres, revolutionising medical research and even saving battery life on phones. And the one thing that binds all these applications is the use of machine learning to optimise outcomes.

Ever since Google acquired DeepMind, it has been drawing on the lab's innovations to power its own products. Today, apps on the Play Store are also being customised using DeepMind's research.

Play Store Recommendations Unlike Any Other


Netflix suggests movies based on the user's watch history: if a user watches more action movies, the algorithm will be more inclined to suggest action titles. Similarly, YouTube recommends music based on the genres one listens to the most.

Delivering real-time personalised content is the primary goal of every platform that uses a recommender system.

However, similar strategies might not work for the Play Store, which also happens to be one of the largest deployers of recommender systems.

For instance, when a user installs, say, a travel booking app, the above strategies would recommend more travel apps. However, a better recommendation might be an app that translates foreign languages, since the user is about to travel.

To bring this level of intelligence to the Play Store, DeepMind worked with developers at Google to build solutions that best suit users' needs.

The DeepMind team considered the following three solutions:

Use LSTMs: a type of recurrent neural network that performs well in real-world scenarios, owing to its powerful update equations and backpropagation dynamics.

Replace the LSTM with a Transformer model: to address the computational cost and serving latency of LSTMs, a Transformer model was used.

Use an additive attention model: though the Transformer improved model performance, its high training cost led the researchers towards this more efficient alternative.

They call this the candidate generator model.
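DeepMind has not published the architecture beyond this description, but the general shape of an additive-attention candidate generator is easy to sketch. The NumPy snippet below is a minimal, hypothetical illustration: it assumes a learned app-embedding table, additive (Bahdanau-style) attention over the user's install history, and dot-product retrieval over the catalogue. All names, sizes and the random parameters are placeholders, not the production model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: a small app catalogue and embedding dimensions.
NUM_APPS, EMB_DIM, ATTN_DIM = 1000, 32, 16

# In production these would be learned; random stand-ins here.
app_embeddings = rng.normal(size=(NUM_APPS, EMB_DIM))
W_attn = rng.normal(size=(EMB_DIM, ATTN_DIM))
v_attn = rng.normal(size=(ATTN_DIM,))


def user_vector(install_history):
    """Additive attention over the apps a user has installed.

    Each installed app gets a scalar score v . tanh(W e); the scores are
    softmax-normalised, and the user vector is the weighted sum of the
    corresponding app embeddings.
    """
    embs = app_embeddings[install_history]        # (T, EMB_DIM)
    scores = np.tanh(embs @ W_attn) @ v_attn      # (T,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ embs                         # (EMB_DIM,)


def top_candidates(install_history, k=5):
    """Retrieve the k apps whose embeddings best match the user vector."""
    scores = app_embeddings @ user_vector(install_history)
    return np.argsort(-scores)[:k]


# A user who has installed three (hypothetical) apps.
print(top_candidates([12, 87, 403]))
```

Compared with running an LSTM or a Transformer over the install history, the attention step here reduces to a single weighted sum per request, which is broadly where the training and serving savings come from.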

Importance Weighting And Refined Re-ranking


Today, Google Play’s recommendation system contains three main models: a candidate generator, a re-ranker, and a model to optimise for multiple objectives. The candidate generator is a deep retrieval model that can analyse more than a million apps and retrieve the most suitable ones. 

To address the bias associated with the candidate generator model, DeepMind introduced ‘importance weighting’ in their model.

Importance weighting considers the following two metrics:

  • Impression-to-install rate of each individual app 
  • Median impression-to-install rate across the Play Store.

Each app's install rate is compared against the store-wide median; an app with a below-median install rate gets an importance weight of less than one. In this way, every app is either up-weighted or down-weighted, which reduces the bias in the recommendations.
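The exact formula is not published, but a minimal sketch of the idea, assuming the importance weight is simply the app's impression-to-install rate divided by the store-wide median (with hypothetical clipping bounds to keep extreme ratios in check), could look like this:

```python
import numpy as np

def importance_weight(app_install_rate, median_install_rate,
                      low=0.1, high=10.0):
    """Weight a training example by how well its app converts.

    Apps with an impression-to-install rate below the store-wide median
    get a weight < 1 (down-weighted); apps above the median get a
    weight > 1 (up-weighted). The clipping bounds are hypothetical.
    """
    ratio = app_install_rate / median_install_rate
    return float(np.clip(ratio, low, high))

# An app that converts below the store-wide median is down-weighted,
# one that converts above it is up-weighted.
print(importance_weight(0.02, 0.05))   # below-median rate -> weight < 1
print(importance_weight(0.08, 0.05))   # above-median rate -> weight > 1
```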

With the problem of bias handled, the ranking has to be refined. This ranking decides what appears at the top of the screen, without the user having to scroll.

In conventional recommendation systems, ranking is treated as a binary classification problem. This pointwise ranking ignores the associations between apps, such as the travel booking app followed by a translator app discussed above.

To take the relations between apps into account and refine the ranking, DeepMind built a re-ranker model. Unlike the conventional pointwise approach, it is a pairwise model, and all of this happens online, in real time.
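DeepMind has not released the re-ranker itself, but the pairwise idea can be illustrated with a standard RankNet-style pairwise logistic loss: within the same impression, the app the user installed should score higher than an app that was shown but ignored. The linear scorer and feature vectors below are hypothetical stand-ins, not Google Play's actual model or features.

```python
import numpy as np

rng = np.random.default_rng(1)
FEATURE_DIM = 8
w = rng.normal(size=FEATURE_DIM)   # linear scoring weights (learned in practice)


def score(app_features):
    """Score one app's feature vector."""
    return app_features @ w


def pairwise_loss(features_installed, features_ignored):
    """RankNet-style pairwise logistic loss.

    Rather than classifying each app in isolation (pointwise), the model
    is trained on pairs from the same impression: the installed app should
    outrank the app that was shown but ignored.
    """
    margin = score(features_installed) - score(features_ignored)
    return np.log1p(np.exp(-margin))   # small when the installed app outranks the ignored one


# One hypothetical training pair: the travel booking app the user installed
# versus another app that was shown alongside it but ignored.
installed = rng.normal(size=FEATURE_DIM)
ignored = rng.normal(size=FEATURE_DIM)
print(pairwise_loss(installed, ignored))
```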

The Hand That Keeps Giving

Apart from the success of AlphaGo and its research on protein folding, DeepMind's work has been put directly to use by Google.

Here is a list of DeepMind's top contributions to Google:

  • Reduced the electricity needed for cooling Google's data centres by up to 30%.
  • Boosted the value of Google's wind energy by roughly 20%.
  • Created on-device learning systems to optimise Android battery performance.
  • Developed WaveNet, which is used in Google Assistant and Google Cloud Platform.
  • Helped Waymo improve the efficiency of training its neural networks.

“Looking at the pace of progress, I think we will have AI in a form in which it benefits a lot of users in the coming years, but I still think it’s early days, and there’s a long-term investment for us,” said Sundar Pichai when he was asked about DeepMind back in 2016.

Despite the news of DeepMind's losses doing the rounds, the team has proven time and again that it has been a great addition to Google.
