Apple's Latest AI Innovations Unveiled at WWDC 2025


At WWDC 2025, Apple announced several AI features, such as a workout coach, live translation, and more.

Last year, Apple’s WWDC keynote highlighted the company’s ambitious strides in AI. This year, the company toned down its emphasis on Apple Intelligence and concentrated on updates to its operating systems, services, and software, introducing a new aesthetic it calls “Liquid Glass” along with a new naming convention.

Nevertheless, Apple still attempted to appease the crowd with a few AI-related announcements, such as an image analysis tool, a workout coach, a live translation feature, and more.

Visual Intelligence

Visual Intelligence is Apple’s AI-powered image analysis technology that allows you to gather information about your surroundings. For example, it can identify a plant in a garden, tell you about a restaurant, or recognize a jacket someone is wearing.

Now, the feature will be able to interact with the information on your iPhone’s screen. For instance, if you come across a post on a social media app, Visual Intelligence can conduct an image search related to what you see while browsing. The tool performs the search using Google Search, ChatGPT, and similar apps.

To access Visual Intelligence, open the Control Center or press the same button combination typically used to take a screenshot. The feature arrives with iOS 26 when it launches later this year.

ChatGPT comes to Image Playground

Apple integrated ChatGPT into Image Playground, its AI-powered image generation tool. With ChatGPT, the app can now generate images in new styles, such as “anime,” “oil painting,” and “watercolor.” There will also be an option to send a prompt to ChatGPT to let it create additional images.

Workout Buddy

Apple’s latest AI-driven workout coach uses a text-to-speech model to deliver encouragement while you exercise, mimicking a personal trainer’s voice. The AI within the Workout app provides motivational talk during your run, highlighting key moments such as your fastest mile and average heart rate. After the workout, the AI summarizes your average pace, heart rate, and milestones.

Live Translation

Apple Intelligence powers a new live translation feature for Messages, FaceTime, and phone calls. This technology automatically translates text or spoken words into the user’s preferred language in real time. During FaceTime calls, users will see live captions, whereas for phone calls, Apple will translate the conversation aloud.

Apple also introduced two new AI-powered features for phone calls. The first is call screening, which automatically answers calls from unknown numbers in the background. The second, hold assist, automatically detects hold music while you wait for a call center agent, so you can set the phone down and be notified when a live agent picks up.

Poll suggestions in Messages

Apple introduced a new feature that allows users to create polls within the Messages app. This feature uses Apple Intelligence to suggest polls based on the context of your conversations.

AI-powered shortcuts

The Shortcuts app is becoming more useful with Apple Intelligence. When building a shortcut, users will be able to select an AI model and add steps powered by it, such as AI summarization.

Contextually aware Spotlight

Spotlight, the on-device search feature for Mac, is getting a minor update. It will now incorporate Apple Intelligence to improve its contextual awareness, suggesting actions based on what users typically do and tailoring those suggestions to their current task.

Foundation Models for developers

Apple is now giving developers access to its on-device AI models, which work even without an internet connection. The new Foundation Models framework lets developers build AI capabilities into their third-party apps on top of Apple's existing systems.
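For developers curious what this looks like in practice, the framework exposes a session-based Swift API. The following is a minimal sketch based on the publicly documented `FoundationModels` API; the instructions and prompt strings are illustrative, and the code assumes it runs in an async context on an OS 26-era Apple platform.

```swift
import FoundationModels

// Check that the on-device foundation model is available on this device.
let model = SystemLanguageModel.default

switch model.availability {
case .available:
    // Start a session with optional instructions that steer the model's behavior.
    let session = LanguageModelSession(
        instructions: "You are a concise assistant inside a notes app."
    )
    // The prompt is processed entirely on device — no network required.
    let response = try await session.respond(
        to: "Suggest three short titles for a note about trip planning."
    )
    print(response.content)
case .unavailable(let reason):
    // The model may be unavailable (e.g., unsupported hardware or Apple
    // Intelligence disabled), so apps should handle this case gracefully.
    print("Model unavailable: \(reason)")
}
```

Because inference happens locally, apps built this way can offer AI features offline and without per-request costs, which is the main draw Apple pitched to developers.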

The most disappointing news to emerge from the event was that the much-anticipated developments for Siri aren’t ready yet. Attendees were eager for a glimpse of the promised AI-powered features that were expected to debut. However, Craig Federighi, Apple’s SVP of Software Engineering, said they won’t have more to share until next year. This delay may raise questions about Apple’s strategy for the voice assistant in an increasingly competitive market.



Source: TechCrunch
