Apple’s AI in iOS 18 may be a game changer. Here’s why.
TL;DR: iOS 18 LLM
- What does LLM stand for? LLM is short for large language model. 💬
- What does an LLM do? It powers AI chatbots like ChatGPT. 🤖
- Does Siri use an LLM? Not right now, but in iOS 18 it may. 📱
- How will Apple’s LLM be different? It may run directly on the iPhone instead of in the cloud. ☁️
Nearly every major tech company has jumped on the AI bandwagon in the past few years. In fact, it's hard to find a major tech giant that isn't proclaiming that its future is all about generative AI and AI chatbots. Except, that is, for Apple.
While Google, Microsoft, Amazon, Meta, and others have been showing off their AI wares since 2022, the iPhone maker has yet to publicly embrace AI by releasing a slew of its own tools. But that is about to change. At June's Worldwide Developers Conference, Apple is expected to unveil several major AI tools coming to the iPhone's operating system, iOS 18.
One of the biggest new features is widely expected to be a built-in large language model (LLM), the technology that powers chatbots like ChatGPT. We don't yet know whether this means Apple will unveil a completely new chatbot or revamp Siri, turning it from a digital assistant into a full-blown AI chatbot (our money is on the latter).
And now Bloomberg has a report out suggesting that Apple's LLM may differ from the LLMs from OpenAI and Google on two major fronts: Apple's may be faster and more private, because it may run on the iPhone itself instead of on cloud servers. As Bloomberg reports:
“Apple has been developing a large language model — the algorithm that underpins generative AI features — and all indications suggest that it will be entirely on-device. That means the technology is powered by the processor inside the iPhone, rather than in the cloud. The upshot: Apple’s AI tools may be a bit less powerful and knowledgeable in some cases (the company could fill in the gaps by teaming up with Google and other AI providers), but the approach will make response times far quicker. And it will be easier for Apple to maintain privacy.”
Bloomberg
To be sure, an on-device LLM that runs entirely on a user's iPhone would be a huge leap forward for the industry. Current chatbots run in the cloud, which makes them less private because the user's requests need to be sent to the tech giant's servers for processing. And because those requests need to be sent off and crunched in the cloud, existing chatbots can be slow to respond to user queries.
But by running its LLM directly on the iPhone, Apple can make its AI chatbot much more private: the user's data never has to leave the device to be processed remotely. And because nothing needs to be sent over the network, an on-device LLM should be able to return answers much more quickly. A rough sketch of the difference is below.
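To make the distinction concrete, here is a minimal Swift sketch of the two approaches. Nothing in it is a real Apple API: the endpoint URL, the `LocalLLM` type, and its `generate` method are hypothetical stand-ins, used only to show that a cloud backend needs a network round trip while an on-device backend does not.

```swift
import Foundation

// A hypothetical assistant interface: the caller asks a question and gets
// an answer back, regardless of where the model actually runs.
protocol AssistantBackend {
    func answer(_ prompt: String) async throws -> String
}

// Cloud-style backend: the prompt leaves the device and is processed on a
// remote server, which adds a network round trip and exposes the request
// to the provider. The endpoint URL is illustrative only.
struct CloudBackend: AssistantBackend {
    let endpoint = URL(string: "https://example.com/v1/chat")!

    func answer(_ prompt: String) async throws -> String {
        var request = URLRequest(url: endpoint)
        request.httpMethod = "POST"
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")
        request.httpBody = try JSONEncoder().encode(["prompt": prompt])

        // The user's request travels to the provider's servers and back.
        let (data, _) = try await URLSession.shared.data(for: request)
        return String(decoding: data, as: UTF8.self)
    }
}

// Hypothetical stand-in for an on-device model runtime. In reality this
// would wrap whatever model format Apple ships; no such public LLM API
// exists today.
struct LocalLLM {
    func generate(prompt: String) async throws -> String {
        // Placeholder: a real implementation would run inference here,
        // entirely on the phone's own processor.
        "on-device answer for: \(prompt)"
    }
}

// On-device backend: the prompt never leaves the phone, so there is no
// network latency and nothing for a third party to log.
struct OnDeviceBackend: AssistantBackend {
    let model: LocalLLM

    func answer(_ prompt: String) async throws -> String {
        try await model.generate(prompt: prompt)
    }
}
```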
However, as Bloomberg points out, there could be a drawback: an on-device model might be less knowledgeable than existing cloud-based ones. In other words, it's a tradeoff between the LLM's intelligence on one hand and its speed and privacy on the other. Apple could get around this by routing some requests to Google's or OpenAI's LLMs, along the lines of the sketch below.
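Continuing the hypothetical sketch above, a hybrid setup could keep most requests on-device and only hand one off to a larger cloud model when the local model comes up short. The routing heuristic here is purely illustrative; we don't know how (or whether) Apple would actually decide which requests leave the phone.

```swift
// A hybrid backend: answer on-device by default, and fall back to a cloud
// provider only when the local model signals it can't help.
struct HybridBackend: AssistantBackend {
    let onDevice: OnDeviceBackend
    let cloud: CloudBackend

    func answer(_ prompt: String) async throws -> String {
        let localAnswer = try await onDevice.answer(prompt)

        // Hypothetical confidence check: punt to the bigger cloud model
        // only when the on-device model comes up empty.
        if localAnswer.isEmpty || localAnswer.contains("I don't know") {
            return try await cloud.answer(prompt)
        }
        return localAnswer
    }
}
```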
Besides a built-in LLM, iOS 18 is expected to bring new home screen functionality, improvements to the Notes app, and a new terrain map in Apple Maps.