MUM: A Thousand Times More Powerful Than BERT

Natural language understanding has made tremendous strides over the past decade. At Google I/O 2021, Prabhakar Raghavan, Senior Vice President at Google, unveiled a new AI technology said to be 1,000 times more powerful than BERT: the Multitask Unified Model, or MUM.

“MUM is a thousand times more powerful than BERT. But what makes this technology groundbreaking is its ability to multitask in order to unlock information in new ways,” Raghavan said.

Sundar Pichai said in a blog post: “Translation, image recognition and voice recognition laid the foundation for complex models like LaMDA and multimodal models. LaMDA is a huge step forward in natural conversation, but it’s still only trained on text. When people communicate with each other, they do it across images, text, audio and video. So we need to build multimodal models (MUM) to allow people to naturally ask questions across different types of information. With MUM you could one day plan a road trip by asking Google to find a route with beautiful mountain views.”

Tech behind MUM

At present, search engines cannot handle complex, conversational and nuanced questions. With its language understanding capabilities, however, MUM is set to change that. In Search, MUM will surface valuable insights and suggest additional pointers for digging deeper into a topic.

Like the popular BERT model, MUM is built on a Transformer architecture. Unlike most language models, which are trained on a single language, MUM is trained across 75 different languages and many tasks at once. This allows the model to develop a more comprehensive understanding of information and knowledge than previous models.
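
MUM itself is not publicly available, so the sketch below is a rough illustration only: it uses mT5, an openly released multilingual, Transformer-based text-to-text model from Google Research, as a stand-in for the "one model, many languages" setup described above. The checkpoint name and prompts are examples, and the raw checkpoint would need fine-tuning to produce useful answers.

```python
# Illustrative sketch only -- MUM is not publicly released.
# mT5 is an openly available multilingual text-to-text Transformer,
# used here as a stand-in for a "one model, many languages" setup.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "google/mt5-small"  # small public checkpoint, not MUM
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# The same weights accept prompts written in different languages.
prompts = [
    "Hiking Mount Fuji in autumn requires <extra_id_0>.",              # English
    "Para escalar el monte Fuji en otoño se necesita <extra_id_0>.",   # Spanish
]
for prompt in prompts:
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=20)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```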


As the name suggests, MUM is multimodal. This means that the model understands information across different types of sources, such as text and images.
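
Google has not published MUM's multimodal internals, but the general idea of relating an image and text in a shared representation can be sketched with CLIP, an openly available image-text model. The checkpoint, image URL and captions below are purely illustrative.

```python
# Illustrative sketch only -- not MUM, whose internals are not public.
# CLIP embeds images and text in a shared space, one simple way a model
# can relate a photo to a textual query.
import requests
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

url = "http://images.cocodataset.org/val2017/000000039769.jpg"  # sample COCO photo
image = Image.open(requests.get(url, stream=True).raw)
texts = ["a pair of hiking boots", "two cats lying on a couch"]

inputs = processor(text=texts, images=image, return_tensors="pt", padding=True)
outputs = model(**inputs)
probs = outputs.logits_per_image.softmax(dim=1)  # image-to-text match probabilities
print(dict(zip(texts, probs[0].tolist())))
```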

Features of MUM

The AI algorithm not only understands language but also generates it. Some of the key features of MUM are listed below:

1| Helping with complex tasks: MUM can transform how Google helps you tackle complex tasks. Unlike previous Transformer-based machine learning models, MUM brings a more comprehensive understanding of information and world knowledge.

2| Removing the language barriers: Language can be a significant barrier to accessing information. MUM has the potential to break down these barriers by transferring knowledge across languages (see the sketch after this list).

3| Understanding information across types: MUM is multimodal, which means it can understand information from different formats, such as web pages and images, simultaneously.

4| Applying advanced AI to Search: In the coming months and years, Google will bring MUM-powered features and improvements to its products. Just as it did with BERT and other language models, Google is applying the same testing processes to look for patterns that may indicate bias in machine learning.
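
As a hedged illustration of the cross-language knowledge transfer mentioned in point 2 (again, not Google's implementation): an openly available multilingual sentence encoder maps the same meaning expressed in different languages to nearby vectors, so a question asked in English can be matched against useful content written in Japanese. The model name and sentences below are illustrative.

```python
# Illustrative sketch only -- not Google's implementation of MUM.
# A multilingual sentence encoder places semantically similar sentences
# from different languages close together in embedding space.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")

query_en = "What should I pack to hike Mount Fuji in autumn?"
doc_ja = "秋に富士山に登るときは防寒着と雨具を用意してください。"  # Japanese hiking advice
doc_unrelated = "The stock market closed higher today."

emb = model.encode([query_en, doc_ja, doc_unrelated])
print("English query vs Japanese advice:", util.cos_sim(emb[0], emb[1]).item())
print("English query vs unrelated text:", util.cos_sim(emb[0], emb[2]).item())
```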

Wrapping up

At present, MUM understands information only from sources like text and images. In the coming years, though, the model will expand to more modalities like video and audio. “While we are in the early days of exploring this new technology, we are excited about its potential to solve more complex questions, no matter how you ask them,” said Raghavan.

Ambika Choudhury
A Technical Journalist who loves writing about Machine Learning and Artificial Intelligence. A lover of music, writing and learning something out of the box.
