Apple-Google Partnership: What We Still Don't Know
[Note: After I published this article earlier today, Bloomberg reported that there may be delays in features for Apple's Siri AI upgrade. At this point, all Apple has said is that an enhanced Siri will be coming this year. If there is any other official information from Apple or Google, I will post an update to this article.]
February 11, 2026
Over the past two weeks, the Internet has been filled with rumors of Apple's OS 26.4 update and an AI-powered Siri. While some of those rumors might turn into reality, there is much we still don't know about the new Siri. In this article, I cover what we do know from official Apple and Google sources, raise questions about what we don't know, and show some examples of what the new Siri might be able to do. Although we don't know what Apple will call this new version of Siri, I'm referring to it here as Siri AI.
There is obviously no way to test Siri AI prior to a beta release, but I'll give you a preview of what it might be like in the third section of this article. My main device is an iPad, but I also have a Lenovo (Android) tablet that uses the Gemini assistant, so I took screenshots on that tablet of activities I had Gemini do for me. Those screenshots might give you a taste of what could be coming to your Apple devices. But first, let's see what Google and Apple have said, and not said, about their agreement.
Apple-Google Partnership: What We Know
There have been only three official statements from Apple and Google concerning their AI partnership. The first, on January 12, 2026, was a joint statement released by Google. Tim Cook made an almost identical statement at Apple's January 29th earnings call. I listened carefully to the call and I will include some of Cook's comments in this section. The third official statement came from Sundar Pichai, CEO of Google, at Google's earnings call on February 4th.
So what is the official word on this agreement?
Joint Statement, January 12, 2026
[Statement link]
"Apple and Google have entered into a multi-year collaboration under which the next generation Apple Foundation Models will be based on Google's Gemini models and cloud technology. These models will help power future Apple Intelligence features, including a more personalized Siri coming this year."
Tim Cook, Earnings Call, January 29, 2026
[available on Apple Podcasts until February 12, 2026]
"Building on our efforts in the AI space, we are also collaborating with Google to develop the next generation of Apple Foundation Models. This will help power future Apple Intelligence features, including a more personalized Siri coming this year."
Sundar Pichai, Earnings Call, February 4, 2026
[Earnings Call link]
"I'm pleased that we are collaborating with Apple as their preferred cloud provider and to develop the next generation of Apple Foundation Models, based on Gemini technology."
As you can see, there are some subtle differences between the statements coming from Apple and Google. I'm not going to speculate on the meaning of those differences, but I will point out that Cook never mentioned "Gemini" or "Google cloud" in his earnings call; he did, however, emphasize "collaboration" with Google. What that collaboration means is still unclear.
Do we know anything else? One point that is clear from both the joint statement and Cook's statement is that Apple Intelligence will run on Apple devices and servers. It is not clear if that is also true for Siri AI.
Joint Statement, January 12, 2026
"Apple Intelligence will continue to run on Apple devices and Private Cloud Compute, while maintaining Apple's industry-leading privacy standards."
Tim Cook, Earnings Call, Question and Answer Session, January 29, 2026
"We'll continue to run on the device and run in Private Cloud Compute and maintain our industry leading privacy standards in doing so."
"You should think of what is going to power the personalized Siri as the collaboration with Google."
Apple-Google Partnership: Unknowns
In this section, I raise questions primarily concerning Siri AI since this will probably be the major upgrade that involves Apple's collaboration with Google. It is also the part that is ambiguous in the official statements. Tim Cook, in the earnings call, when asked about Apple's integration with third-party AI models, answered "You should think of it as a collaboration. And we'll obviously do some of our own stuff. But you should think of what is going to power the personalized version of Siri as the collaboration with Google."
Does this mean Google is helping Apple develop Siri AI? What involvement does Google have? How is the Google Gemini model being used? Will Siri AI resemble Gemini or will Apple restrain/redesign the model to fit Apple's vision? Do Apple and Google have different expectations for the collaboration?
Some of these questions may never be answered publicly, and some may never need to be; others may be answered as work on Siri AI progresses.
The questions above are general issues that will be resolved over time, not the more immediate ones users may have. The last question asked on the Apple earnings call may be the most important one for users: What portion of devices are AI capable? This question was answered, or more accurately not answered, by Kevan Parekh, CFO of Apple: "We're not going to provide a specific figure on that today."
This answer raises many more questions:
1. Which devices will be capable of running the full Siri AI implementation? What chip is needed? How much RAM? How much device storage?
2. Will device heat be a factor when using Siri AI? What about battery consumption?
3. Will there be features of Siri AI that will only run on certain devices?
4. The Siri AI that first arrives in an OS 26 update will probably be a partial rollout, but will some devices get new features before others?
5. What happens to devices that can run OS 26 but aren't AI or Siri AI capable? Will older devices get the old Siri, no Siri, or a reduced Siri AI?
6. Which countries and languages will get Siri AI, and on what rollout schedule?
The first OS release with new Siri AI features may answer some of these questions, but others may not get answered until WWDC or later.
Another area of interest for users may be Siri's future expansion. Will Siri AI be integrated into the operating system? If so, which features will be Siri AI enabled: search, settings, accessibility? Will Siri become a full-blown chatbot like Gemini on Android, or will it focus more narrowly on executing tasks?
A major concern for Apple and many users is privacy. This was clearly addressed for Apple Intelligence in the official statements, but the language around Siri AI was vague. The question that needs answering is: What exactly are the privacy measures for Siri AI?
Cook was asked in the earnings call: "When you think about how Apple might manage AI, do you see that as evolving toward more edge AI or on-device services versus cloud-based AI?" Cook's answer was "We see both being important, the on-device and the Private Cloud Compute, so we don't see it as an either-or, we see it as a both." This answer raises questions about cloud storage:
1. Where will interactions with Siri AI be stored? What will remain on the device and what will be stored in the cloud? Will there be a Siri app?
2. Will iCloud storage be needed for Siri AI? If so, how much storage?
An interesting earnings-call question about monetizing AI drew a vague answer from Cook: "We're integrating it [intelligence] across the operating system... And I think by doing so it creates great value and that opens up a range of opportunities across our products and services."
With Apple's increased use of ads in the App Store and its recent release of the Creator Studio subscription, Cook's answer raises questions about Siri AI. Will Apple try to monetize Siri AI with tiered levels? Will Siri AI remain free for all users on capable devices, or will there be subscription models for advanced features?
At the Apple earnings call, Richard Kramer of Arete Research asked something users may not initially see as relevant to them, but it may be the most important question of all, especially considering Cook's answer.
Kramer: "Are you confident you've reserved sufficient data center capacity to support the widespread Siri adoption?"
Cook: "In terms of do we have enough capacity, it's hard to estimate with precision what the demand will be. But we've done the sort of the best job we can do. And either have or are putting capacity in for it."
What might happen if Apple does not have enough capacity to handle Siri AI requests? Will devices hang, freeze, or reboot? Will Apple need to fall back on Google's cloud technology instead of its own servers?
One can only hope Apple works out that issue before the full version of Siri AI is enabled by massive numbers of users.