Google's annual developer conference showcased the company's continued commitment to AI innovation, with Project Astra and AI Mode emerging as the stars of I/O 2025.
Project Astra, first introduced last year as a research prototype, has evolved into a powerful universal AI assistant capable of understanding the world around users. Gemini Live now incorporates Project Astra's camera and screen-sharing capabilities, allowing users to engage with AI through their devices in more intuitive ways. These features are rolling out this week to all users on iOS and Android, enabling near-real-time verbal conversations with Gemini while streaming video from smartphone cameras or screens. In the coming weeks, Gemini Live will integrate more deeply with Google's ecosystem, offering directions from Maps, creating Calendar events, and making to-do lists with Tasks.
AI Mode, Google's AI-powered search feature, is now rolling out to everyone in the U.S. directly in Search, with opt-in available via Labs for immediate access. For users seeking more comprehensive responses, Google is bringing deep research capabilities to AI Mode through Deep Search. This feature can issue hundreds of searches, reason across disparate pieces of information, and produce expert-level, fully cited reports in minutes, potentially saving users hours of research.
Another significant announcement is the integration of Project Astra's capabilities into AI Mode through Search Live, coming this summer. This feature will let users hold back-and-forth conversations with Search about what they see in real time through their camera. When using AI Mode or Lens, users can tap the "Live" button to ask questions about what they're seeing, with Project Astra streaming live video and audio into an AI model that responds with minimal latency.
Google is also bringing agentic capabilities from Project Mariner to AI Mode in Labs, starting with event tickets, restaurant reservations, and local appointments. In a demonstration, AI Mode swiftly found and reserved baseball game tickets, taking users directly to checkout within moments and showcasing how AI can handle complex, multi-step tasks on a user's behalf.
According to Google, these developments represent a new phase of the AI platform shift, where decades of research are now becoming reality for people, businesses, and communities worldwide. With Gemini 2.5 Pro and Flash models also receiving significant upgrades, Google's I/O 2025 announcements reflect the company's vision of creating more intuitive, capable, and personalized AI experiences across its product ecosystem.