Apple Turns to Google's Gemini for Enhanced Siri in 2026
There’s a Gemini light at the end of the iPhone AI tunnel. Image generated by Gemini, naturally.
Apple has been challenged by advances in AI that have powered experiences on phones (Android), PCs (Windows), and the digital home (Amazon’s Alexa ecosystem) that it has not been able to match with internally developed models. As I predicted, this hasn’t hurt Apple’s sales yet — the iPhone 17 family has been a hit, Apple’s Macs, iPads, and wearables sold well over the holidays, and Apple continues to set records in its Services division. Apple’s ecosystem is durable, and some rival AI features don’t work well yet or require users to be all-in on a specific ecosystem.
However, Apple made AI promises when it advertised the iPhone 16 that Siri couldn’t keep, and it had to delay the enhanced Siri. AI capabilities across the industry are advancing rapidly, with Google’s latest Gemini models astonishingly good at providing advice, understanding context, and creating images. Consumer expectations for devices and ecosystems are beginning to include AI capabilities that Apple has not been able to deliver. That threatens Apple’s lock-in with consumers, so Apple has shaken up management and made a deal with Google to use Gemini (another accurate prediction).
Apple is now building its own Apple Foundation Models based on Google's Gemini that will power a more advanced version of Siri "later this year." That timing is earlier than I would have expected, but who knows how long this has been in the works? Here’s the text of the announcement from Google’s X news feed (updated later as a joint statement with Apple):
Apple and Google have entered into a multi-year collaboration under which the next generation of Apple Foundation Models will be based on Google's Gemini models and cloud technology. These models will help power future Apple Intelligence features, including a more personalized Siri coming this year. After careful evaluation, Apple determined that Google's AI technology provides the most capable foundation for Apple Foundation Models and is excited about the innovative new experiences it will unlock for Apple users. Apple Intelligence will continue to run on Apple devices and Private Cloud Compute, while maintaining Apple's industry-leading privacy standards.
This means that Siri is about to get a lot smarter, and that Apple is plugging a major hole in its ecosystem. This does NOT mean iOS and Google will have the same AI features. This also isn’t Apple adopting the Gemini app — that’s been available from Google in the Apple App Store since November 2024. Rather, Apple is using Google’s Gemini models as a foundation for building various Apple Foundation Models that power AI features. The resulting models will be used on device (on Apple Silicon) and in the cloud (Apple’s Private Cloud Compute). I expect there will be significant differentiation in how Gemini-based Apple Foundation Models get implemented compared to how Gemini is integrated on Android. Apple will have its own ideas about which small models are embedded on the phone versus run in the cloud, and about how AI is integrated into the phone’s user interface, first-party apps, and APIs for third-party app developers.
In the short term, nothing is changing with how Siri offers ChatGPT as an option for some queries and actions. Those options may remain even after Apple delivers Gemini-based Apple Intelligence later this year, as some consumers may have model and ecosystem preferences. Plus, the industry is moving so quickly that retaining flexibility could end up being a strength for Apple.
For Techsponential clients, a report is a springboard to personalized discussions and strategic advice. To discuss the implications of this report on your business, product, or investment strategies, contact Techsponential at avi@techsponential.com.