Google is updating its Lens app with a new multisearch feature that lets people search using images and text at the same time.
It works like this: a new ‘+ Add to your search’ button at the top of Google Lens brings up a second search bar you can use to narrow down results. You can refine results by color, brand name, or any other visual attribute.
The multisearch feature works with both screenshots and photographs, and it can even tell you where to purchase the object in the picture. You may also see results for similar-looking objects or items related to the one you searched.
Powering the feature is Google’s Multitask Unified Model, or MUM for short, an AI system that enhances Google Search by combining images and text in a single query. Google has also previewed MUM-powered video search, though that capability is still in its early stages.
The feature is available today as part of a Google Lens update on iOS and Android. It's in beta, so it may not work perfectly at first: for example, a search for an oil filter might surface results for chocolate because the boxes look similar.
Multisearch will only be available to users in the United States and in English. Google has yet to say whether the feature will expand to other regions or languages, although that's probably an eventuality.