Apple has significantly broadened its artificial intelligence capabilities with a new wave of Apple Intelligence features that change how users interact with iPhone, iPad, Apple Watch, and Mac.
At the heart of the update is visual intelligence, which extends AI capabilities to content displayed on the iPhone screen, letting users search and take action on anything they're viewing across apps. They can ask ChatGPT questions about on-screen content, search Google or Etsy for similar images and products, or highlight a specific object to find it online.
Visual intelligence also recognizes when users are looking at event information and can suggest adding it to their calendar, automatically extracting details like date, time, and location. Users reach these features by pressing the same buttons used to take a screenshot, then choose whether to save, share, or explore the image with visual intelligence.
For fitness enthusiasts, Apple Watch gains Workout Buddy, a first-of-its-kind experience powered by Apple Intelligence. The feature analyzes data from the current workout alongside fitness history, including heart rate, pace, distance, and Activity ring progress, to generate personalized, motivational insights in real time. A new text-to-speech model then delivers those insights through a dynamic voice built from Fitness+ trainer recordings. Workout Buddy processes all data privately and securely, and requires Bluetooth headphones and a nearby Apple Intelligence-supported iPhone.
Developers haven't been left out either. The new Foundation Models framework joins a suite of tools that let apps tap into on-device intelligence, while Xcode 26 lets developers bring large language models such as ChatGPT directly into their coding workflow. These additions complement Apple's set of more than 250,000 APIs for integrating with hardware and software features across machine learning, augmented reality, health and fitness, spatial computing, and high-performance graphics.
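To make the developer angle concrete, the sketch below shows how an app might call the on-device model through the Foundation Models framework. The session type and method names (LanguageModelSession, respond(to:)) follow Apple's developer preview and are assumptions here rather than API confirmed by this announcement.

```swift
import FoundationModels

// Minimal sketch of calling the on-device model via the Foundation Models
// framework. Assumes the session-based API from Apple's developer preview;
// names like LanguageModelSession and respond(to:) may differ in the final SDK.
func summarize(_ text: String) async throws -> String {
    // A session holds conversation state and system-level instructions.
    let session = LanguageModelSession(
        instructions: "Summarize the user's text in one sentence."
    )
    // The request is handled on device rather than by a cloud service.
    let response = try await session.respond(to: text)
    return response.content
}
```

Because the model runs on device, a call like this can work offline and keeps the prompt and response on the user's hardware.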
All these features are available for testing through the Apple Developer Program starting today, with a public beta coming next month through the Apple Beta Software Program. The full release is scheduled for this fall, supporting all iPhone 16 models, iPhone 15 Pro, iPhone 15 Pro Max, iPad mini (A17 Pro), and iPad and Mac models with M1 and later.
Initially, Apple Intelligence will support English, French, German, Italian, Portuguese (Brazil), Spanish, Japanese, Korean, and Chinese (simplified), with more languages coming by year-end: Danish, Dutch, Norwegian, Portuguese (Portugal), Swedish, Turkish, Chinese (traditional), and Vietnamese. Some features may have limited availability due to regional laws and regulations.
This expansion of AI capabilities across Apple's ecosystem represents the company's most ambitious push yet into artificial intelligence, bringing advanced features directly to hundreds of millions of users worldwide.