At its recently held I/O conference, Google announced that it had added a ‘near me’ option to its ‘multisearch’ feature, which was introduced in April 2022. Multisearch lets users search with images and text together through Google Lens.
According to Prabhakar Raghavan, senior vice president of Google Search, users will soon be able to add ‘near me’ to an image or screenshot showing the products they’re looking for, and Google will direct them to the closest location where they may find them.
The multisearch feature will be available in English later this year and will gradually roll out to other languages.
Another important element of the multisearch feature is scene exploration, which allows users to pan their phone camera and learn about different items in the picture.
For instance, Raghavan explained that the functionality might be used to scan over a section of chocolates and narrow down the search to a certain type of nut-free chocolate.
Raghavan explained that scene exploration uses computer vision to instantly connect multiple threads that make up the scene and identify all the objects within it simultaneously.
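Google has not published implementation details, but the idea of detecting every object in a frame in a single pass and then narrowing the results by a text query can be sketched conceptually. The snippet below is a minimal illustration, not Google's implementation: `detect_all_objects` is a hypothetical stand-in for a real vision model, with hard-coded detections for the chocolate-shelf example above.

```python
# Illustrative sketch only: "scene exploration" conceptually detects all
# objects in a frame simultaneously, then filters them by the user's query.
# detect_all_objects is a stub standing in for an actual vision model.

from dataclasses import dataclass


@dataclass
class Detection:
    label: str        # object class, e.g. "chocolate_bar"
    attributes: set   # attributes the model inferred about the object
    box: tuple        # (x, y, w, h) bounding box within the frame


def detect_all_objects(frame) -> list:
    """Stand-in for a vision model that returns every object in the
    frame in one pass (hard-coded detections for illustration)."""
    return [
        Detection("chocolate_bar", {"dark", "nut-free"}, (10, 5, 40, 20)),
        Detection("chocolate_bar", {"milk", "hazelnut"}, (60, 5, 40, 20)),
        Detection("chocolate_bar", {"dark", "almond"}, (110, 5, 40, 20)),
    ]


def scene_search(frame, required_attributes: set) -> list:
    """Keep only the simultaneously detected objects whose attributes
    satisfy the query, e.g. narrowing a shelf to nut-free dark bars."""
    return [d for d in detect_all_objects(frame)
            if required_attributes <= d.attributes]


matches = scene_search(frame=None, required_attributes={"dark", "nut-free"})
print([d.box for d in matches])  # → [(10, 5, 40, 20)]
```

The key point the sketch captures is that filtering happens over one detection pass covering the whole scene, rather than pointing the camera at each item individually.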
According to Raghavan, scene exploration will be a breakthrough in allowing machines to experience the world as humans do.
He also said that this technology might be utilized to address societal concerns such as assisting conservationists in identifying plant species that need protection or assisting disaster relief workers in efficiently sorting through donations in times of need.
Google CEO Sundar Pichai said that computers should adapt to people, and not the other way around. He added that Google is pushing to make computing more natural and intuitive with the Google Assistant.