Apple has spent the last two years telling users that Apple Intelligence is its vision for the iPhone’s future. But one detail buried in the latest Bloomberg report completely changes how that vision actually works: when iOS 27 arrives, users will be able to swap out the AI powering Siri and other features entirely, choosing between Google’s Gemini, Anthropic’s Claude, and more. That is a bigger deal than most headlines have made it out to be, and there is a layer to this story that most coverage glossed over entirely.
iOS 27 is shaping up to be one of the most significant software releases Apple has shipped in years, and this AI model choice feature is right at the center of it. The system is internally called “Extensions,” and it has the potential to quietly reshape the entire smart assistant landscape — on Apple’s terms.
What iOS 27 Extensions Actually Are
According to a Bloomberg report by Mark Gurman, Apple is building a feature called Extensions into iOS 27, iPadOS 27, and macOS 27, all set to arrive this fall.
In test versions of the software, Apple describes Extensions as a system that allows users to “access generative AI capabilities from installed apps on demand, through Apple Intelligence features such as Siri, Writing Tools, Image Playground, and more.” The language here is deliberate: on demand. This is not a background setting that users will set once and forget. It is a framework that Apple is designing to sit inside the familiar Settings interface, letting people assign preferred AI models to specific features or tasks.
I’ve been following this for a while, and honestly, this is not the typical Apple move. Apple is famously reluctant to open its platforms to rivals in any meaningful way. The fact that Anthropic’s Claude and Google’s Gemini are already being tested inside the Extensions system — confirmed by multiple sources, including Bloomberg and 9to5Mac — signals that Apple has made a calculated decision to prioritize flexibility over control this cycle.
The practical mechanics are straightforward. AI companies like Google and Anthropic can add Extensions support through their existing App Store applications. Once you install the Gemini or Claude app, those models become available as selectable engines inside Apple Intelligence. From there, you could route a Writing Tools edit to Claude, send an Image Playground request to Gemini, or handle a Siri query with whatever model you trust most for that task. Reports also suggest Apple will introduce a dedicated section in the App Store specifically to highlight compatible AI apps, making it easy to find and install supported models.
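To make the routing idea concrete, here is a minimal sketch of how a per-feature model assignment system could work conceptually. This is purely an illustrative model: the class names, feature identifiers, and methods are invented for this example and are not real Apple APIs (Apple has not published any Extensions interface).

```python
# Illustrative model of per-feature AI routing, in the spirit of the reported
# Extensions system. All names here are hypothetical, not Apple's actual API.

class ModelProvider:
    """A stand-in for an installed AI app (Gemini, Claude, etc.)."""

    def __init__(self, name):
        self.name = name

    def respond(self, prompt):
        # A real provider would call out to the model; this stub tags the reply
        # so we can see which engine handled the request.
        return f"[{self.name}] {prompt}"


class ModelRouter:
    """Maps each feature to the user's chosen model, with a system default fallback."""

    def __init__(self, fallback):
        self.fallback = fallback      # e.g. Apple's own on-device model
        self.assignments = {}         # feature name -> ModelProvider

    def assign(self, feature, provider):
        self.assignments[feature] = provider

    def handle(self, feature, prompt):
        # Features the user never reassigned fall through to the default.
        provider = self.assignments.get(feature, self.fallback)
        return provider.respond(prompt)


router = ModelRouter(fallback=ModelProvider("Apple"))
router.assign("writing_tools", ModelProvider("Claude"))
router.assign("image_playground", ModelProvider("Gemini"))

print(router.handle("writing_tools", "Tighten this paragraph"))
print(router.handle("siri", "Set a timer"))  # unassigned, so the default answers
```

The key design point the sketch captures is the fallback: any feature the user never touches keeps its default engine, which is exactly why launch-time defaults matter so much in the next section.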
iOS 27 and the Model Hierarchy Nobody Mentioned
Here is the part that most articles missed. The Extensions system does not replace Apple’s existing AI arrangements — it layers on top of them. Google’s Gemini already holds a privileged native position inside Apple Intelligence, backed by a multi-year partnership and a reported annual payment of around $1 billion. That deal, announced in January 2026, means Gemini is baked into the rebuilt Siri at a foundational level. ChatGPT, meanwhile, has been part of the picture since iOS 18, handling world-knowledge fallback queries and Image Playground requests.
What iOS 27 adds is an opt-in layer sitting above that existing structure. Extensions open the door to Claude, xAI’s Grok, and potentially regional models like DeepSeek for users in China. But the default experience — the one most users will never change — is still likely to be shaped heavily by Apple’s native deals. Industry insiders hint that Apple’s UI design and default configurations at launch will determine just how meaningful this choice actually is for the average person who never touches Settings.
After looking into this more closely, I can tell you that this is not Apple flattening the AI landscape. It is Apple building a customizable layer for enthusiasts while keeping its existing partnerships fully intact. Whether that is the right call depends entirely on how transparent Apple makes the defaults at WWDC.
The iOS 27 Voice Feature You Probably Missed
One of the most interesting details in the Bloomberg report is a voice customization feature that barely made it into coverage. According to the report, iOS 27 will let users assign different Siri voices to different AI models. This means a query handled by Apple’s own on-device system could use one voice, while a response from Claude could use a completely different one. When an outside model handles your Siri query, you will hear, not just read, which AI is answering.
This is one of those things I genuinely got excited about the moment I saw it. It sounds like a small design detail, but the implications are real. Today, every Siri interaction feels like one undifferentiated experience. With iOS 27, the voice distinction would make it immediately obvious when an outside model has taken over a response. That transparency, if Apple actually delivers it, would be a meaningful step toward giving users real situational awareness about which AI is acting on their behalf.
What This Means for Apple Intelligence Going Forward
The timing of this announcement matters. Apple publicly delayed its AI-upgraded Siri after admitting the work was, in the company’s own words, “taking longer than we thought.” The company also replaced AI chief John Giannandrea with Vision Pro head Mike Rockwell, signaling a reset at the top. Allowing third-party models to fill gaps in the meantime is a pragmatic move, and it fits the pattern. iOS 18 added ChatGPT because Apple needed external help with world knowledge. iOS 27 extends that same logic across the entire Apple Intelligence stack.
Sources suggest that WWDC on June 8 will be when Apple officially unveils Extensions and the full scope of iOS 27 AI features. If the current trajectory holds, developer betas will follow shortly after, with the public release arriving in September alongside new hardware. Apple’s stock rose more than 2% following the Bloomberg report, and Alphabet also gained on expectations of a deeper partnership. The market clearly reads this as a win for both companies, not a competitive threat.
Personally, I think the real test will come six months after launch. If most iPhone users are routing tasks through Claude or Grok by early 2027, that would genuinely prove that Extensions changed behavior at scale. If the majority stay on defaults, it will confirm what skeptics suspect: that Apple opened the door to choice without actually pushing users through it. Either way, iOS 27 is setting up the iPhone as a multi-model AI platform, and that is a category shift worth watching closely.