Tim Cook and Craig Federighi at Apple WWDC 2025.
After the WWDC 2025 keynote, it's official -- Apple is going in a different direction on AI than the rest of the industry.
There were no outlandish promises or predicted breakthroughs, unlike last year's moonshots on "personal context" and a reimagined version of Siri. And there was very little mention of chatting with bots at all.
CNET survey: Just 11% of people upgrade their phone for AI features. Here's what they want instead
Instead, Apple is taking the less speculative and more established strengths of large language models (LLMs) and integrating them piece by piece into the iPhone and other devices -- often without even mentioning the word AI.
First and foremost is Apple's Live Translation feature. Language translation is one of the things that LLMs do really well. In most cases, you have to copy and paste into a chatbot or use a language app like Google Translate to take advantage of those superpowers. In iOS 26, Apple is integrating its Live Translation features directly into the Messages, FaceTime, and Phone apps so that you can use the feature in the places where you're having conversations.
PCMag: Apple's WWDC Was All Glass, No Vision: What's on Tap to Counter OpenAI, Google?
Next, there's Visual Intelligence. Apple will now let you use it from any app or screen on your phone by integrating it directly into the screenshot interface. With iOS 26, Visual Intelligence can recognize what's on your screen, understand the context, and recommend actions. The example shown in the keynote was an event flyer: take a screenshot of it, and Visual Intelligence automatically creates a calendar event from it.
This is actually a step toward an AI agent, one of the most popular -- and sometimes overhyped -- tech trends of 2025. I'm looking forward to trying this feature and seeing what else it can do. I've had good luck with Circle to Search, the similar feature from Samsung and Google. Visual Intelligence in iOS 26 will also let you ask ChatGPT questions about what you've captured on your screen, and it can read the text in your screenshot aloud or summarize it.
Another LLM capability that's been enhanced this year shows up in Shortcuts, which can now tap into the Apple Intelligence models. For example, you can create a Shortcut that takes any file you save to the Desktop in MacOS 26, uses Apple Intelligence to examine the file's contents (while preserving privacy), and then categorizes it and moves it into one of several folders you've named for the kinds of work you do. You can even automate this to run every time you save a file to the desktop, which again makes this more like an AI agent.
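To make that concrete, here's a rough Swift sketch of the kind of on-device categorization step a Shortcut like that could invoke, using the Foundation Models framework Apple is opening to developers (covered below). The folder names and prompt are purely illustrative, and the exact API surface is an assumption based on what Apple has shown.

    import Foundation
    import FoundationModels

    // Hypothetical folder categories -- these names are illustrative, not from Apple.
    let categories = ["Invoices", "Screenshots", "Drafts", "Receipts"]

    // Ask the on-device model to pick the best-fitting category for a file's text.
    func categorize(fileAt url: URL) async throws -> String {
        let text = try String(contentsOf: url, encoding: .utf8)
        let session = LanguageModelSession()
        let prompt = """
        Pick the single best category for this document from: \(categories.joined(separator: ", ")).
        Reply with only the category name.

        \(text.prefix(2000))
        """
        let response = try await session.respond(to: prompt)
        return response.content.trimmingCharacters(in: .whitespacesAndNewlines)
    }

A Shortcut (or a small helper it calls) could then move the file into the matching folder, all without the file's contents leaving the device.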
Also: Apple's Goldilocks approach to AI at WWDC is a winner. Here's why
One more way that Apple is tapping into LLMs can be seen in the new functionality of the Share button in iOS 26. For example, you can take a list of items from a PDF or a web page in Safari, select the text, tap the Share button, and then select the Reminders app. Apple Intelligence will use its generative models to analyze the list and turn each entry into a to-do item in the category you choose in the Reminders app. If it's a long list, you can even have the AI break it into subcategories for you, again drawing on LLMs' natural language processing (NLP) capabilities.
(Disclosure: Ziff Davis, this publication's parent company, filed an April 2025 lawsuit against OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.)
Lastly, while most of the leading players in generative AI typically offer both a chatbot for the general public and a coding companion for software developers -- since those are two of the things that LLMs are best known for -- Apple didn't say much about building either at WWDC 2025. With thousands of developers on the campus of Apple Park, it might have seemed like the perfect time to talk about both.
Also: The 7 best AI features announced at Apple's WWDC that I can't wait to use
But all Apple would say about the next version of Siri -- which has long been in need of a re-think -- was that it's still working on it and that it won't release the next Siri until it meets the company's high standards for user experience.
And when it comes to programming companions, Apple did not unveil its own coding copilot for its Swift language and Xcode developer tools -- after promising Swift Assist at last year's WWDC. Instead, Apple made a couple of big moves to empower developers. It opened up its Foundation Models framework so developers can tap into the power of Apple Intelligence -- with as little as three lines of code in Swift, Apple claims. Plus, it all happens on-device and at no cost.
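For a sense of how small that is, the basic call Apple demoed looks roughly like the sketch below. Treat it as a hedged example based on the LanguageModelSession API shown at WWDC 2025; the prompt is illustrative.

    import FoundationModels

    // Minimal on-device Foundation Models call: create a session, send a prompt, read the reply.
    let session = LanguageModelSession()
    let response = try await session.respond(to: "Suggest three titles for a photo album from a trip to Tokyo.")
    print(response.content)

Because the model runs on-device, there are no API keys, no server round-trips, and no per-request costs for the developer.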
Also: How Apple just changed the developer world with this one AI announcement
And in Xcode 26, Apple will now allow developers to use the generative coding companion of their choice. ChatGPT is integrated by default, but developers can also bring in models from other providers using their own API keys. Apple sees this as a fast-moving space and wants developers to have access to the latest tools of their choice rather than limiting them to what Apple builds.
All in all, Apple is making a lot of pragmatic choices when it comes to AI, leaning into the things that LLMs do best, and simply using generative AI to make better features on its phones, laptops, and other devices.
Keep up with all the latest AI developments from Apple and the rest of the AI ecosystem by subscribing to our free Tech Today newsletter.