CEO Sundar Pichai announces LaMDA 2 at Google I/O conference

Conversation and natural language processing are powerful ways to make computers more accessible to everyone.

Alphabet CEO Sundar Pichai has announced LaMDA 2 at the Google I/O conference. Last year, the tech giant launched LaMDA, built on Transformer, a neural network architecture open-sourced by Google Research in 2017.

“We are continuing to advance our conversational capabilities. Conversation and natural language processing (NLP) are powerful ways to make computers more accessible to everyone. Large language models are key to this,” said Pichai.

He said thousands of the company's employees have tested the second-generation LaMDA, and the team has been able to reduce inaccurate or offensive responses in the new iteration.

He said LaMDA 2 can talk about almost any topic, from Saturn's rings to a planet made of ice cream. LaMDA has also gained a number of enhancements since its launch.

“Staying on topic is a challenge for language models,” said Pichai. While demoing LaMDA 2 and the AI Test Kitchen app, Google’s senior director of product management Josh Woodward said the model can take a complex goal or topic, break it down into relevant sub-tasks, and offer tips, making the whole task feel a lot less daunting.

Explaining further, Pichai said you could input all kinds of goals, whether moving to a new city or learning an instrument. “These experiences show the potential of language models to one day help us with things like planning, learning about the world, and more,” said Pichai. 

LaMDA still has a few teething troubles. “The model might still generate inaccurate, inappropriate or offensive responses. That’s why we are inviting feedback in the app so people can help report problems. We will be doing all of this work in accordance with our AI principles,” said Pichai at Google I/O 2022. 

He said the company is looking to open up LaMDA 2 over the coming months and get feedback from a broad range of stakeholders, including AI researchers, social scientists, and human rights experts, to improve the model. 

PaLM 

“To explore other areas of natural language processing (NLP) and AI, we recently announced a new Pathways language model or PaLM,” said Pichai. The model has 540 billion parameters. He claimed it demonstrated breakthrough performance on many NLP tasks, including generating code from text, answering math word problems, and even explaining jokes.

He said that combining this scale with the chain-of-thought prompting technique allows multi-step problems to be described as a series of intermediate steps, which increases accuracy by a large margin. “This leads to SOTA performance across several reasoning benchmarks,” Pichai added. PaLM offers a new approach that holds enormous promise for making knowledge more accessible for everyone.
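For readers unfamiliar with the technique, the short Python sketch below illustrates the idea behind chain-of-thought prompting: the exemplar in the prompt spells out the intermediate reasoning steps instead of jumping straight to the final answer. The generate() function here is a hypothetical placeholder rather than a PaLM API (PaLM has no public endpoint), so this only shows how such a prompt is constructed.

def generate(prompt: str) -> str:
    # Hypothetical stand-in for a large-language-model call.
    # A real system would send `prompt` to a text-generation API and return its output.
    return "Roger started with 5 balls. 2 cans of 3 balls is 6 balls. 5 + 6 = 11. The answer is 11."

# Standard few-shot prompt: the exemplar gives only the final answer.
standard_prompt = (
    "Q: There are 3 cars in the parking lot and 2 more arrive. How many cars are in the lot?\n"
    "A: The answer is 5.\n\n"
    "Q: Roger has 5 tennis balls. He buys 2 cans of 3 tennis balls each. How many does he have now?\n"
    "A:"
)

# Chain-of-thought prompt: the exemplar walks through the intermediate steps,
# nudging the model to reason step by step before giving its answer.
cot_prompt = (
    "Q: There are 3 cars in the parking lot and 2 more arrive. How many cars are in the lot?\n"
    "A: There are 3 cars to start. 2 more arrive, so 3 + 2 = 5. The answer is 5.\n\n"
    "Q: Roger has 5 tennis balls. He buys 2 cans of 3 tennis balls each. How many does he have now?\n"
    "A:"
)

print(generate(cot_prompt))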
