Google has unveiled a suite of AI-driven enhancements to its Search and Lens features, marking a significant leap in how users interact with and discover information online.
Liz Reid, VP and Head of Google Search, announced that the company's Gemini AI model is now powering advanced search experiences, including AI Overviews and improved Google Lens functionality. These updates are designed to expand the ways users can explore the world around them and satisfy their curiosity.
One of the most notable additions is video understanding in Google Lens. Users can now record video while asking questions about moving objects in real time. The AI system processes both the video and the question to generate an AI Overview with relevant information and web resources. This feature is currently available globally in the Google app for Search Labs users enrolled in the "AI Overviews and more" experiment, with support for English queries.
Voice input has also been integrated into Lens, allowing users to ask questions verbally while taking photos. The feature is now accessible worldwide in the Google app for both Android and iOS devices, with support for English queries.
Shopping experiences through Lens have been enhanced as well. When users photograph a product, Lens now provides a comprehensive results page with key information, including reviews, price comparisons across retailers, and purchase options. This update leverages Google's Shopping Graph, which contains information on more than 45 billion products.
Search result pages are also getting an AI-powered makeover. Starting in the U.S., Google is rolling out AI-organised search results for recipes and meal inspiration on mobile devices. This new full-page experience presents diverse content formats, including articles, videos, and forums, all in one place. Early testing has shown that users find these AI-organised results more helpful and diverse.
To strengthen connections to web content, Google has redesigned AI Overviews to display prominent links to supporting webpages directly within the text. According to the company, this change has increased traffic to those sites and made it easier for users to reach the content they are interested in.
The company is also expanding its Circle to Search feature, now available on over 150 million Android devices. This tool allows users to identify songs they hear across various apps without switching contexts.
In a move to enhance advertising relevance, Google has begun testing ads in AI Overviews for applicable queries in the U.S. This initiative aims to connect users more efficiently with relevant businesses, products, and services.
These updates reflect Google's commitment to reimagining search through AI. With Lens among the fastest-growing query types and nearly 20 billion visual searches performed monthly, the company continues to push the boundaries of how users can interact with Search, whether through text, voice, or images.