Google has introduced a new feature to its AI Mode search experience, enabling users to view financial data through interactive charts and graphs. The feature is currently available in the United States via Google Labs as part of a limited preview, and it is designed to make financial data easier to understand, particularly for queries about stocks and mutual funds.
AI Mode now supports smart data visualization
In a blog post, the Mountain View-based tech giant explained that the AI-enhanced feature can automatically convert data into interactive visual formats. By understanding context and data patterns, the AI generates charts when users ask stock-related or financial questions, offering a more intuitive way to analyze information.
For example, if a user searches “compare the stock performance of blue chip CPG companies in 2024,” AI Mode leverages Google’s Gemini model to produce a comparative chart that visualizes each company’s performance over time. The output also includes a descriptive analysis, making it easier for users to understand financial trends.
Currently available via Google Labs in the US
As of now, this AI-powered charting tool is accessible only through Google Labs, where users can activate it manually. Google says the goal is to provide intelligent responses with visual context, especially when analyzing complex data over time.
It remains unclear whether users can request specific chart types directly or if the AI auto-generates visuals based on relevance. The feature currently appears focused on financial topics, and it is unknown whether broader data categories will be supported in the future.
Google expanding AI Mode capabilities
This new charting function is part of Google’s broader push to make search more interactive and multimodal. A recent update also introduced Search Live, a feature similar to Gemini Live that allows users to interact with AI Mode hands-free. That feature is also being rolled out in phases to select users in the US.
Google’s AI Mode updates reflect an ongoing shift toward AI-driven, visual-first search experiences, transforming how users interact with complex datasets in real time.