Speaking at Google's ongoing annual developer conference, Google I/O 2021, Alphabet CEO Sundar Pichai said Google has been working with various non-profit organisations to address the COVID-19 crisis.
“I/O has always been a celebration of technology and its ability to improve lives and to remain optimistic that technology can help us address the challenges we face together,” said Pichai.
Pichai made many important announcements, including major releases and updates, in his keynote speech.
Analytics India Magazine rounds up the key highlights and announcements related to AI/ML from the conference:
The next-gen language model
Pichai discussed how natural language processing and transformer models have made several breakthroughs in the past few years, and how Google has been working to better organise and surface the heaps of information conveyed by the written and spoken word.
The tech giant unveiled LaMDA, a Language Model for Dialogue Applications. The model has been built on Transformer — a neural network architecture open-sourced by Google Research in 2017. “From concept all the way to design, we are making sure it is developed consistent with our AI principles. We believe LaMDA’s natural conversation capabilities have the potential to make information and computing radically more accessible and easier to use,” Pichai added.
AI in Google Maps
Google introduced two new features in Google Maps: eco-friendly routes, which surface the most fuel-efficient route to users, and safer routing, which provides upfront information on weather and traffic conditions.
AI-powered dermatology tool
In healthcare, Google AI has been working on problems such as breast cancer detection, DNA sequencing and diabetic retinopathy screening. Building on this work, the tech giant previewed an AI-powered dermatology tool. Using the same techniques that detect diabetic eye disease or lung cancer in CT scans, the tool helps identify dermatologic issues, such as a rash on the arm, using just the phone's camera.
Google Cloud’s new AI platform
Pichai introduced a new AI platform, Vertex AI. Vertex AI is Google Cloud's new unified ML platform, bringing AutoML and AI Platform together under a single API, client library and user interface. With it, one can build, deploy and scale ML models faster, using pre-trained and custom tooling within a single platform. Vertex AI's custom model tooling supports advanced ML coding with nearly 80% fewer lines of code.
The “magic mirror” using AI
Project Starline combines advances in hardware and software to enable friends, families and coworkers to connect, even when they’re cities (or countries) apart.
The project combines computer vision, machine learning, spatial audio and real-time compression. Google has developed a breakthrough light field display system that creates a sense of volume and depth, which can be experienced without additional glasses or headsets.
The next-gen TPU
Pichai unveiled the next generation of Tensor Processing Units (TPUs): TPUv4. TPUv4 is powered by the v4 chip, which is more than twice as fast as the TPUv3 chip. A single v4 pod contains 4,096 v4 chips, with 10x the interconnect bandwidth per chip at scale. Each pod delivers more than 1 exaflop of computing power, making it the fastest system ever deployed at Google.
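The pod figures quoted above imply a per-chip throughput that can be backed out with simple arithmetic. This is a rough sanity check derived only from the keynote numbers, not an official per-chip specification:

```python
# Back-of-the-envelope check of the TPUv4 pod figures quoted above.
# Assumption: "1 exaflop" is total pod throughput, spread evenly over 4,096 chips.
POD_FLOPS = 1e18       # 1 exaflop, as stated in the keynote
CHIPS_PER_POD = 4096   # v4 chips per pod, as stated in the keynote

per_chip_tflops = POD_FLOPS / CHIPS_PER_POD / 1e12
print(f"~{per_chip_tflops:.0f} TFLOPS per v4 chip")
```

This works out to roughly 244 teraflops per chip, consistent with the claim that a v4 chip is more than twice as fast as its predecessor.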
Quantum AI campus
The search giant unveiled its new Quantum AI campus in Santa Barbara, California. The campus houses Google's first quantum data centre, quantum hardware research laboratories, and in-house quantum processor chip fabrication facilities.
A new milestone in AI algorithms
The tech giant introduced a new AI algorithm, known as the Multitask Unified Model, or MUM. Like BERT, MUM is built on a Transformer architecture, but it’s 1,000 times more powerful. It is trained across 75 different languages and many different tasks at once. MUM is multimodal and understands information across text and images and, in the future, can expand to video and audio.
New innovations in Google Photos
The tech giant unveiled a new feature in Google Photos called Little Patterns. Little Patterns uses machine learning to translate photos into numbers, then compares how visually and conceptually similar those images are.
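The "translate photos into numbers" step describes an embedding comparison. A minimal sketch of the idea, using hypothetical hand-written vectors and cosine similarity as the comparison (not Google's actual pipeline or feature dimensions):

```python
import math

def cosine_similarity(a, b):
    """Compare two embedding vectors: close to 1.0 means very similar."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical 4-dimensional embeddings of three photos.
plaid_shirt_1 = [0.9, 0.1, 0.8, 0.2]
plaid_shirt_2 = [0.8, 0.2, 0.9, 0.1]
beach_sunset  = [0.1, 0.9, 0.1, 0.8]

print(cosine_similarity(plaid_shirt_1, plaid_shirt_2))  # high: candidate "pattern"
print(cosine_similarity(plaid_shirt_1, beach_sunset))   # low: unrelated photo
```

Photos whose embeddings score highly against each other can then be grouped into a "pattern", such as a set of shots featuring the same plaid shirt.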
The next evolution of collaboration
Fifteen years after launching Google Docs and Sheets, the tech giant introduced Smart Canvas, the next big step for collaboration as the evolving hybrid work model gives new urgency to existing collaboration challenges.
The tech giant also introduced new smart chips in Docs for recommended files and meetings. To insert a smart chip, simply type "@" to see a list of recommended people, files and meetings. Smart chips will come to Sheets in the coming months.