More from WWDC 2025: all the news from Apple’s annual developer conference
Apple is pushing visual intelligence features that build on Apple Intelligence, letting users go beyond using their device's camera for context and now also "search and take action on anything they're viewing across their apps." The feature can also recognize when a user is looking at an event they might want to attend and pre-populate a calendar entry with the time and place.
“Users can ask ChatGPT questions about what they’re looking at on their screen to learn more, as well as search Google, Etsy, or other supported apps to find similar images and products,” according to Apple. To access the feature, a user can act like they’re taking a screenshot — they will then be prompted to either save the screenshot or search using Apple Intelligence.
Apple is updating Genmoji and Image Playground with new styles, powered in part by ChatGPT. With Image Playground, users can now tap into ChatGPT to change a friend’s photo into the style of an oil painting, for instance.
Image Playground sends the description you write out, or your image, to ChatGPT to create the results, but “nothing is shared with ChatGPT without your permission,” according to Apple.
Apple announced at WWDC 2025 that it's debuting live-translation Apple Intelligence features that let you translate between languages in Messages, FaceTime, and phone calls.
For a phone call, that’ll mean live translation aloud as you talk, and with FaceTime, it means live captions displayed on the screen. It’s all powered via Apple-built models that run on-device.
There’s a bunch of good new stuff in group messaging for Apple Messages — polls, better notification management for unknown senders, and more — but the best quality-of-life thing here is definitely typing indicators for group chats. This is going to make chaotic family messaging so much less chaotic.
This year at WWDC, Apple announced it's giving developers direct access to the on-device large language model at the core of Apple Intelligence, so any app can tap into it. Apple says this will "ignite a whole new wave of intelligence experiences" in the apps users frequent, and because the model runs on-device, developers avoid cloud API costs.
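For developers, the pitch is a simple session-based API. A minimal sketch of what calling the on-device model could look like, based on the Foundation Models framework Apple described at WWDC 2025 (type and method names here follow Apple's announcement, but exact signatures may differ in the shipping SDK):

```swift
// Sketch: prompting the on-device Apple Intelligence model via the
// Foundation Models framework announced at WWDC 2025. Because the
// model runs locally, there is no API key, network call, or
// per-request cloud cost involved.
import FoundationModels

func summarize(_ text: String) async throws -> String {
    // A session wraps a conversation with the on-device model.
    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "Summarize in one sentence: \(text)"
    )
    return response.content
}
```

The on-device constraint is the design trade-off here: developers get free, private inference, but only from a model small enough to run on the user's hardware.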

Software redesigns, some new names, and probably more AI. Probably.
I feel like there are reports every year that Apple is going to introduce better iPad multitasking, and then we get something like Stage Manager or another update that's still not quite what I need to get work done. This year's rumor suggests more Mac-like features. I've been playing with a Surface Pro 12 I bought, and I keep thinking, ‘Man, it would be so nice to be able to do this on my M4 iPad Pro.’
Right before WWDC 2025, Apple researchers published a paper called The Illusion of Thinking (PDF) that made waves. The researchers wrote that popular and buzzy AI models “face a complete accuracy collapse beyond certain complexities,” especially with things they’ve never seen before.
They presented models from OpenAI, Anthropic, and DeepSeek with new and complex puzzle games and found their reasoning ability “increases with problem complexity up to a point, then declines.”