Take a photo of an item and find where to buy it nearby, Google’s new bet

Google used Search On, its conference dedicated to progress and new features in its search tools, to announce what is coming next. Thanks to more powerful artificial intelligence, our searches in tools like Search and Lens will soon be easier and more precise than ever.

Year after year, Google pushes the limits of its artificial intelligence and at the same time of its search engine.

Whether through keywords or images in Google Lens, people ask the American search engine literally billions of questions per month.

As the Québécois saying goes, "tie down your tuque with wire" — in other words, brace yourself, because its answers are about to become more specific than ever over the next year.

Only one photo needed to know where to shop

In 2017, Google introduced Lens, a feature that lets us search the web by taking photos.

Wondering what kind of plant that is, the breed of an animal, or the name of a bird? We just have to take out our phone, snap a picture and use Google Lens to get the answers to our questions.

Very soon, Google will add a layer of intelligence to Lens and equip it with a function called “Multisearch near me”.

An overview of the “Multisearch near me” function which will soon be launched in the United States.

Basically, if we are looking to buy a piece of clothing or a plant, or to find a restaurant that serves a particular dish, we will only have to take a picture of it and Google will tell us where to get it near where we are.

The feature will launch first in the United States, but it still gives us a good idea of the advances being prepared, which should soon reach us as well.

Green search results first

Google also took advantage of its conference to announce that certain more ecological and sustainable search results will now automatically be put forward.


Some sustainable and ecological choices will be put forward by Google during searches.

This mainly concerns two sectors that cause a lot of pollution and threaten the environment: fashion and food.

Basically, when we search Google for our next clothing purchase, it will highlight second-hand items first to encourage us to make more sustainable choices.

Then, when we search for a recipe, Google will highlight the ecological footprint related to the main ingredients of our meal.

Google thinks that having direct access to this information during a search will influence us to be more careful in our choices.

These new features will first be offered to those who search Google in English, but as mentioned above, this still gives us an excellent idea of what awaits us in the coming months.
