We've been honing our Google Search techniques for 20 years, whether that means quotation marks, image searches, or the asterisk wildcard. But sometimes you need a little more help explaining what you're looking for.
Enter multisearch. The feature, which relies on the Lens camera, uses text and pictures at the same time to "go beyond the search box and ask questions about what you see," according to Belinda Zeng, product manager for Google Search.
Love the wallpaper pattern at the coffee shop? Coveting your friend's outfit? Want to recreate a celebrity's nail art? Open the Google app on Android or iOS, tap the Lens camera icon, and search a screenshot or photo. To add more context, like a specific color or brand name, swipe up on visual matches and tap "Add to your search" to include text.
"Snap a photo of your dining set and add the query 'coffee table' to find a matching table," Zeng suggested. "Take a picture of your rosemary plant and add the query 'care instructions.'"
The beta feature—best used for shopping searches—is available now in English for US users.
"All this is made possible by our latest advancements in artificial intelligence, which is making it easier to understand the world around you in more natural and intuitive ways," the company said in a blog post. Google is also exploring how to enhance the feature using its AI Search model MUM to improve results for "all the questions you could imagine asking."