Google adds ‘near me’ to multisearch, drops hotword from voice assistant

Scene exploration taps into the richness of the web and Google's Knowledge Graph to surface the most helpful results.


At the Google I/O conference, the tech giant announced a new addition to its ‘multisearch’ feature, introduced last April, which allows users to search using text and images together. Prabhakar Raghavan, senior vice president of Search at Google, said users will soon be able to add ‘near me’ to a picture or screenshot of a product they are looking for, and Google will direct them to the nearest place where it is available. The ‘near me’ capability will roll out later this year in English and will expand to other languages over time.



Scene exploration is another key component of the multisearch feature, enabling users to pan their phone camera across a scene and learn about multiple objects within it. For example, Raghavan said, the feature could be used to scan a section stocked with all kinds of chocolates and narrow the search down to a specific type of nut-free chocolate.

“Scene exploration uses computer vision to instantly connect multiple threads that make up the scene and identify all the objects within it simultaneously. It taps into the richness of the web and Google’s Knowledge Graph to surface the most helpful results,” he said. According to Raghavan, scene exploration will be a breakthrough in helping our devices see the world as we do. 

“Looking further out, this technology could be used beyond everyday needs to help address societal challenges, like supporting conservationists in identifying plant species that need protection or helping disaster relief workers quickly sort through donations in times of need,” he added. 

Look and Talk

“Computers should be adapting to people, and not the other way around,” said Sundar Pichai. Google is pushing to make computing more natural and intuitive with Google Assistant. Sissie Hsiao, vice president at Google Assistant, announced a new update called ‘Look and Talk’. The feature allows users to simply look at the display and ask a question without having to preface it with a hotword like ‘Hey Google’. It will be available first on the Nest Hub Max in the US.

Hsiao said the company’s quick phrases feature will also come to the Nest Hub Max to help users make common requests with ease. Quick phrases are a set of commands for everyday tasks, such as setting an alarm, asking for the time or temperature, and dimming the lights.

Google’s voice assistant will also be able to speak more naturally and understand a natural manner of speaking, which often includes pauses or ‘umms’ in the middle of a sentence. The assistant will now be able to determine the context of the words and offer a prompt if the user cannot remember a word.



