Google's AI-powered "multisearch," which combines text and images in a single query, is going global

Among other announcements at today's AI-focused press conference in Paris (the company shared so much interesting news that its shares fell by as much as 7% at one point), Google said that its new "multisearch" feature is now available to users worldwide on mobile devices, wherever Google Lens is already available.

Author: Isabel Green
10/01/23
For example, a user can pull up a photo of a shirt they like in Google Search, then ask Lens where they can find the same pattern on a different type of clothing, such as a skirt or socks. Or they can point their phone at a broken part on their bike and type a query like "how to fix it" into Google Search. This combination of words and images can help Google process and understand queries it could not handle before, or that would be harder to express using text alone.

This approach is most useful for shopping searches, such as finding clothes you like but in a different color or style. Or you can take a picture of a piece of furniture, such as a dining set, to find matching items, such as a coffee table. In multisearch, users can also narrow and refine their results by brand, color, and visual attributes, Google said in a statement.

The feature first became available to users in the US last October, then expanded to India in December. Now, Google says, multisearch is available to all users worldwide on mobile devices, in all languages and countries where Lens is available.

The "multisearch near me" variation will also expand soon, Google announced today.

Last May, Google announced that users would be able to direct multisearch queries at local businesses (aka "multisearch near me"), returning results for products that match inventory at nearby retail stores or other businesses. For example, in the case of the bicycle with a broken part, you could add the text "near me" to a query with a photo to find a local bicycle or hardware store that has the part you need.

As for new search products, the company teased an upcoming Google Lens feature: Android users will soon be able to search what they see in photos and videos across apps and websites on their phone, without leaving the app or website they are in. Google calls it "search your screen" and says it will also be available wherever Lens is offered.

Google also shared a new milestone for Google Lens, noting that people now use this technology more than 10 billion times a month.

I'll have to try the feature myself; after all, I have a Pixel.