
Hayden Field
Apple’s ChatGPT integration makes it easier for users to search for more context on images and shop for things they see.

Apple is pushing visual intelligence features that build on Apple Intelligence, allowing users to go beyond searching for context using their device’s camera and now also “search and take action on anything they’re viewing across their apps.” The feature can also recognize when a user is looking at an event they may want to attend and pre-populate a calendar entry with its time and place.

“Users can ask ChatGPT questions about what they’re looking at on their screen to learn more, as well as search Google, Etsy, or other supported apps to find similar images and products,” according to Apple. To access the feature, a user can act like they’re taking a screenshot — they will then be prompted to either save the screenshot or search using Apple Intelligence.

Image: Apple
Hayden Field
Apple’s Image Playground now integrates with ChatGPT.

Apple is updating Genmoji and Image Playground with new styles, powered in part by ChatGPT. With Image Playground, users can now tap into ChatGPT to change a friend’s photo into the style of an oil painting, for instance.

Image Playground sends the description you write out, or your image, to ChatGPT to create the results, but “nothing is shared with ChatGPT without your permission,” according to Apple.

A simulated screenshot of Image Playground on a MacBook restyling a woman’s selfie to look like a drawing.
New styles in Image Playground
Image: Apple
Hayden Field
Apple Intelligence takes on language barriers in messages and phone calls.

Apple announced at WWDC 2025 that it’s debuting live translation Apple Intelligence features that allow you to translate between languages in Messages, FaceTime, and phone calls.

For a phone call, that’ll mean live translation aloud as you talk, and with FaceTime, it means live captions displayed on the screen. It’s all powered via Apple-built models that run on-device.

Simulated iPhone screenshots showing Live Translation for text messages, FaceTime chats, and voice calls.
Live Translation
Image: Apple
David Pierce
Typing indicators in group chats!

There’s a bunch of good new stuff in group messaging for Apple Messages — polls, better notification management for unknown senders, and more — but the best quality-of-life thing here is definitely typing indicators for group chats. This is going to make chaotic family messaging so much less chaotic.

Hayden Field
Apple gives developers access to its on-device Apple Intelligence model.

This year at WWDC, Apple announced it’s giving developers direct access to the on-device large language model at the core of Apple Intelligence, letting any app tap into it. Apple says this will “ignite a whole new wave of intelligence experiences” in the apps users frequent, and because the model runs on-device, developers avoid cloud API costs.

Simulated iPhone animation showing AllTrails AI chatbot features powered by Apple Intelligence.
AllTrails using Apple Intelligence
Image: Apple
Apple’s AirPods update adds camera controls and more
Emma Roth and Cameron Faulkner
Apple renames its operating systems
Dominic Preston
WWDC 2025 live blog: a fresh new look at iOS, macOS, and more

Software redesigns, some new names, and probably more AI. Probably.

Victoria Song, Alex Heath and 2 more
Todd Haselton
I really, really hope Apple announces way better iPad multitasking today.

I feel like there are reports every year that Apple is going to introduce better iPad multitasking, and then we get something like Stage Manager or a different update that’s still not quite what I need to work. This year’s rumor suggests more Mac-like features. I’ve been playing with a Surface Pro 12 I bought, and I keep thinking, “Man, it would be so nice to be able to do this on my M4 iPad Pro.”

Allison Johnson
Good morning from Apple Park.

The sun’s out and so are the iconic arches. Stay tuned for more soon.

Alex Heath
We’re at Apple Park!

We were just driven around the spaceship in a golf cart that may or may not have had party speakers. The keynote is scheduled to start in less than an hour. More to come soon.

Alex Heath / The Verge
Hayden Field
Apple’s new research paper says AI reasoning isn’t all it’s cracked up to be.

Right before WWDC 2025, Apple researchers published a paper called The Illusion of Thinking (PDF) that made waves. The researchers wrote that popular and buzzy AI models “face a complete accuracy collapse beyond certain complexities,” especially with things they’ve never seen before.

They presented models from OpenAI, Anthropic, and DeepSeek with new and complex puzzle games and found that the models’ reasoning effort “increases with problem complexity up to a point, then declines.”

Recent generations of frontier language models have introduced Large Reasoning Models (LRMs) that generate detailed thinking processes before providing answers. While these models demonstrate improved performance on reasoning benchmarks, their fundamental capabilities, scaling properties, and limitations remain insufficiently understood. Current evaluations primarily focus on established mathematical and coding benchmarks, emphasizing final answer accuracy. However, this evaluation paradigm often suffers from data contamination and does not provide insights into the reasoning traces’ structure and quality. In this work, we systematically investigate these gaps with the help of controllable puzzle environments that allow precise manipulation of compositional complexity while maintaining consistent logical structures.
Image: Apple
Apple is on defense at WWDC
Allison Johnson