MUM: A Thousand Times More Powerful Than BERT


Natural language understanding has made tremendous strides over the past decade. At Google I/O 2021, Prabhakar Raghavan, Senior Vice President at Google, unveiled a new AI technology said to be 1,000 times more powerful than BERT: the Multitask Unified Model, or MUM.

“MUM is a thousand times more powerful than BERT. But what makes this technology groundbreaking is its ability to multitask in order to unlock information in new ways,” Raghavan said.

Sundar Pichai said in a blog post: “Translation, image recognition and voice recognition laid the foundation for complex models like LaMDA and multimodal models. LaMDA is a huge step forward in natural conversation, but it’s still only trained on text. When people communicate with each other, they do it across images, text, audio and video. So we need to build multimodal models (MUM) to allow people to naturally ask questions across different types of information. With MUM you could one day plan a road trip by asking Google to find a route with beautiful mountain views.”

Tech behind MUM

Today's search engines struggle with complex, conversational and nuanced questions. With its deeper language understanding, MUM is set to change that: in Search, it can surface relevant insights and suggest additional pointers for digging deeper into a topic.

Like the popular BERT model, MUM is built on a Transformer architecture. Unlike most language models, which are trained on a single language, MUM is trained across 75 different languages, allowing it to develop a more comprehensive understanding of information and knowledge than its predecessors (see the illustrative sketch below).
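To make the cross-lingual idea concrete, here is a minimal sketch. MUM itself is not publicly available, so the snippet uses XLM-RoBERTa, an openly available multilingual Transformer, purely as a stand-in: it encodes the same sentence in English and Japanese and compares the resulting representations.

```python
# Minimal sketch: a multilingual Transformer encoder (XLM-RoBERTa here, as an
# illustrative stand-in -- MUM itself is not publicly available) embedding the
# same sentence in two languages and comparing the representations.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModel.from_pretrained("xlm-roberta-base")

sentences = {
    "en": "I want to hike Mount Fuji next autumn.",
    "ja": "来年の秋に富士山に登りたいです。",
}

embeddings = {}
for lang, text in sentences.items():
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    # Mean-pool token embeddings into a single sentence vector.
    embeddings[lang] = outputs.last_hidden_state.mean(dim=1).squeeze(0)

similarity = torch.nn.functional.cosine_similarity(
    embeddings["en"], embeddings["ja"], dim=0
)
print(f"Cross-lingual cosine similarity: {similarity.item():.3f}")
```

Because the encoder shares one vocabulary and one set of weights across languages, semantically similar sentences land close together regardless of the language they are written in, which is the property that lets knowledge transfer across languages.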

As the name suggests, MUM is multimodal. This means that the model understands information across different types of sources, such as text and images.
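As a rough illustration of that multimodal idea, the sketch below uses CLIP, an openly available text-and-image model, purely as a stand-in (MUM's own multimodal model is not public). It scores how well a few hypothetical text queries describe a single example image.

```python
# Minimal sketch of multimodal (text + image) understanding, using CLIP as an
# illustrative stand-in -- MUM's own multimodal model is not publicly available.
import requests
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

# Example image (a standard sample from the COCO dataset); any image works.
image = Image.open(requests.get(
    "http://images.cocodataset.org/val2017/000000039769.jpg", stream=True
).raw)

queries = ["a pair of hiking boots", "a mountain trail", "two cats on a couch"]
inputs = processor(text=queries, images=image, return_tensors="pt", padding=True)
outputs = model(**inputs)

# Higher scores mean the text query better matches the image.
probs = outputs.logits_per_image.softmax(dim=1)
for query, p in zip(queries, probs[0]):
    print(f"{query}: {p.item():.2f}")
```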

Features of MUM

The AI algorithm not only understands language but also generates it. Some of MUM's key features are outlined below:

1| Helping with complex tasks: MUM can transform how Google helps you tackle complex tasks. Unlike previous Transformer-based machine learning models, it brings a more comprehensive understanding of information and world knowledge.

2| Removing the language barriers: Language can be a significant barrier to accessing information. MUM has the potential to break down these boundaries by transferring knowledge across languages. 

3| Understanding information across types: MUM is multimodal, meaning it can understand information from different formats, such as web pages and images, simultaneously.

4| Applying advanced AI to Search: Google plans to bring MUM-powered features and improvements to its products in the coming years. Its researchers are applying the same review processes they used for BERT and other language models to detect and remove any patterns that may indicate bias in machine learning.

Wrapping up

At present, MUM understands information only from sources like text and images, but in the coming years the model is expected to expand to more modalities such as video and audio. “While we are in the early days of exploring this new technology, we are excited about its potential to solve more complex questions, no matter how you ask them,” said Raghavan.
