Why you shouldn’t fear Apple Intelligence






Judging by the reaction to Tim Cook’s post on X about Apple Intelligence, the internet is not ready for Siri integrated with ChatGPT. After Monday’s WWDC24 keynote, the Apple CEO posted a link to X about the new AI capabilities coming to Apple devices.


Immediately, dozens of critics — including Elon Musk — piled on. They slammed Apple for working with ChatGPT, which the vast majority of the commenters don’t trust.


“You’ve just ensured that no member of my family will EVER buy another Apple product,” wrote one. “Enjoy your spyfest!”


However, the hailstorm of surprisingly vitriolic and emotional comments seems based on a basic misunderstanding of how Apple Intelligence will work. As privacy-focused as ever, Apple put tons of work into making sure Apple Intelligence will keep your data secure, even from the prying eyes of AI companies it works with.


Here’s why it looks like Apple is doing AI right.


Apple Intelligence: AI with privacy in mind

“Apple Intelligence” is an umbrella term for a host of AI-powered features built into Apple’s operating systems and apps. It will enable features ranging from creating your own emoji (called “Genmoji”) based on photos of people in your Photos library to finding a list of recommended books in an old, forgotten email, and much more besides.


AI is Silicon Valley’s hottest technology right now. It launched Nvidia stock into the stratosphere and made Microsoft relevant again. There’s a Cambrian explosion of AI startups, and any company — or industry — not wildly pivoting to AI is seen as an out-of-step laggard.


But AI comes with risks, from dangerously inaccurate hallucinations, like recommending that you add glue to pizza to keep the cheese from sliding off, to unpredictable self-driving cars that drag pedestrians under their wheels. Even seemingly innocuous AI features, like Recall, the built-in assistant on Microsoft’s Copilot+ PCs that remembers everything you do, have been revealed as privacy disasters.


Apple is acutely aware of these risks, so the company built what appear to be very robust safeguards into Apple Intelligence, which will be released this summer to beta testers and to the public in the fall.


On-device processing
The complex architecture of Apple Intelligence. Photo: Apple

Apple said Apple Intelligence is built with privacy in mind. The system is deeply personal, scanning emails, messages, calendar appointments and a host of other personal data to answer questions, perform tasks and control apps. Apple calls it a “personal intelligence system.”


It can, for example, gather a host of information from diverse sources to make intelligent suggestions. During the WWDC24 keynote, Craig Federighi, Apple’s senior vice president of software engineering, said the system would be able to tell him if a rescheduled work meeting would clash with his daughter’s school play later that day — drawing on data like the time and place of both the play and the meeting, the distance between them, and predicted traffic.


“Understanding this kind of personal context is essential for delivering truly helpful intelligence,” Federighi said. “But it has to be done right. You should not have to hand over all the details of your life to be warehoused and analyzed in someone’s AI cloud.”


Your personal data yields better results while remaining private

For that reason, these kinds of tasks and queries are processed locally, on the device. This seemingly precludes data leaks and ensures future AI models aren’t trained on users’ personal data. (That’s not the case with competing AI models like Google’s Gemini.)


“It’s aware of your personal data without collecting your personal data,” Federighi said. “This is only possible through our unique integration of hardware and software and our years-long investment in building advanced silicon for on-device intelligence.”


Federighi said the system is made up of large language and diffusion models that are fed data from an on-device “semantic index,” which is built from information garnered from the user’s personal data.


“When you make a request, Apple Intelligence uses its semantic index to identify the relevant personal data and feeds it to the generative model so they have the personal context to best assist you,” Federighi said.
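The retrieval step Federighi describes can be sketched in miniature. In the hypothetical Python below, a local "semantic index" over personal snippets is searched for items relevant to a request, and the matches are prepended to the prompt as context for a generative model. Real systems use embedding-based search; the keyword-overlap scoring, function names and sample data here are illustrative stand-ins, not Apple's actual API.

```python
import re

def words(text):
    """Lowercase word set, punctuation stripped."""
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def build_index(items):
    """Index each personal snippet by its word set (a toy 'semantic index')."""
    return [(words(text), text) for text in items]

def retrieve(index, query, top_k=2):
    """Return up to top_k snippets sharing the most words with the query."""
    q = words(query)
    ranked = sorted(index, key=lambda entry: len(entry[0] & q), reverse=True)
    return [text for ws, text in ranked[:top_k] if ws & q]

def build_prompt(index, request):
    """Prepend the retrieved personal context to the request for the model."""
    context = retrieve(index, request)
    return "Context:\n" + "\n".join(context) + "\nRequest: " + request

# Hypothetical on-device personal data, echoing the keynote example.
personal_data = [
    "Calendar: school play at Lowell Elementary, 4:30 pm Thursday",
    "Mail: work meeting rescheduled to 3:00 pm Thursday",
    "Notes: grocery list for the weekend",
]
index = build_index(personal_data)
prompt = build_prompt(index, "Can I still make my daughter's play on Thursday?")
```

The point of the sketch: relevance ranking happens entirely on-device, so only the snippets that matter for this request ever reach the model.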


Private Cloud Compute
Apple software chief Craig Federighi outlines Apple Intelligence no-nos. Photo: Apple

However, a second class of queries may require larger models and machines with more horsepower than an iPhone, iPad or Mac. To answer those complex queries, Apple offloads the processing to Private Cloud Compute — its cloud-based server farm built with custom Apple silicon and a “hardened operating system designed for privacy,” Apple said in a paper on its Security Research website.


As the name says, Private Cloud Compute is a private system. Apple went to great lengths to keep it private and secure. According to Apple, PCC data isn’t accessible to anyone other than the user — not even to Apple.


“We believe PCC is the most advanced security architecture ever deployed for cloud AI compute at scale,” the paper says.


According to Cupertino, Private Cloud Compute will ensure privacy through:



Custom hardware: This includes iPhone security features like Secure Enclave and Secure Boot.
No data retention: PCC servers do not retain any data, unlike competing AI cloud providers. “Your data is never stored or made accessible to Apple,” Federighi said.
No backdoors: Apple can’t access the data, even if it wanted to. Likewise, there’s no way for law enforcement or anyone else to gain access.
Verifiable transparency: Security researchers can inspect PCC software images and check Apple’s privacy assurances. “Private Cloud Compute cryptographically ensures your iPhone, iPad and Mac will refuse to talk to a server unless its software has been publicly logged for inspection,” Federighi said.
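The verifiable-transparency guarantee can be illustrated with a toy model: before sending any data, the client checks that the measurement (hash) of the server's software image appears in a publicly inspectable log, and refuses to connect otherwise. Apple's real protocol relies on signed hardware attestations and a cryptographic log; the function names, log contents and hash-only "measurement" below are hypothetical simplifications.

```python
import hashlib

def measure(software_image: bytes) -> str:
    """A 'measurement' here is just a SHA-256 hash of the software image."""
    return hashlib.sha256(software_image).hexdigest()

# Hypothetical public transparency log of researcher-inspectable builds.
public_log = {measure(b"pcc-os-build-1"), measure(b"pcc-os-build-2")}

def send_query(server_measurement: str, query: str) -> str:
    """Refuse to talk to any server whose software isn't publicly logged."""
    if server_measurement not in public_log:
        raise PermissionError("server software not in transparency log")
    return f"sent: {query}"

# A server running a logged build is accepted; any other build is rejected
# before the query ever leaves the device.
ok = send_query(measure(b"pcc-os-build-1"), "summarize my notes")
```

The design choice this models: because the check happens client-side against a public log, Apple could not silently swap in unaudited server software without devices refusing to connect.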

Clearly, Apple put a lot of advanced engineering and serious security research into Private Cloud Compute. The system is years in the making, and the result sounds very good.


“This sets a brand-new standard for privacy and AI,” Federighi said, “and unlocks intelligence you can trust.”


What about ChatGPT?
Apple Intelligence always asks the user before sharing any data with ChatGPT. Photo: Apple

Finally, a third subset of queries and tasks requires large language models trained on a wider corpus than a user’s emails. This is where Apple’s partnership with OpenAI (and potentially other AI companies) comes into play.


“There are other artificial intelligence tools available that can be useful for tasks that draw on broad world knowledge or offer specialized domain expertise,” Federighi said. “We want you to be able to use these external models without having to jump between different tools. So we’re integrating them right into your experiences.”


To answer more general questions or perform specific types of tasks that can’t be accomplished on-device or in Private Cloud Compute, Apple Intelligence will connect to OpenAI’s ChatGPT, one of the leading LLMs on the market.


During the keynote, Federighi gave a couple of example queries where ChatGPT might be useful: asking for recipe recommendations, or getting advice on what types of plants might thrive on a deck seen in a user’s photo. If given explicit permission, seemingly on a case-by-case basis, ChatGPT will be able to answer queries about the user’s documents, presentations, PDFs or photos. ChatGPT is also built into the updated Siri and Apple Intelligence’s Writing Tools, which can be used to create and modify content system-wide.


Apple Intelligence requires explicit consent for ChatGPT

To maintain privacy and keep the user in control of their data, Apple Intelligence asks the user for explicit permission before sending any queries, documents or photos to ChatGPT.


During the keynote, Apple showed an example of a request that Siri decided might be better handled by ChatGPT. A Siri popup plainly says, “Do you want me to use ChatGPT to do that?” and gives the user two options: cancel or use ChatGPT.


While ChatGPT gets millions of potential new users, it won’t be getting revenue directly from users: Apple said using ChatGPT will be free. It remains unclear if Apple will be paying OpenAI or, potentially, vice versa. (Google pays Apple billions of dollars to maintain its position as the iPhone’s default search engine.)


For users, the ChatGPT integration appears to be fairly limited:



No query storage: ChatGPT won’t store queries, which means users’ data can’t be used to train future OpenAI models.
No ChatGPT account necessary: ChatGPT can be used for free without an account. However, OpenAI subscribers can connect their accounts for access to advanced features.
Completely opt-in: ChatGPT is an optional feature you won’t have to use if you don’t want to. “You’re in control over when ChatGPT is used and will be asked before any of your information is shared,” Federighi said.

Other chatbots on the way?

Apple said it will be possible to add other chatbots, like Google’s Gemini, in the future. OpenAI apparently isn’t getting the kind of exclusive partnership that Google has as the default provider of search on iOS. (This could be a delaying tactic, providing users with advanced capabilities while Apple develops its own GPT-4o-level AI. Apple adopted a similar strategy with Google Maps while it built Apple Maps.)


Elon Musk’s complaints about OpenAI being tightly integrated with iOS seem overblown. Musk threatened to ban Apple devices from his companies if iOS comes with ChatGPT built into the operating system. He also attacked Apple on X, deriding the company for its late entrance into the AI game.


“It’s patently absurd that Apple isn’t smart enough to make their own AI, yet is somehow capable of ensuring that OpenAI will protect your security & privacy!” Musk wrote. “Apple has no clue what’s actually going on once they hand your data over to OpenAI. They’re selling you down the river.”


This seems like a basic misunderstanding of both Apple’s diligence in maintaining user privacy while incorporating AI into its products, and how Apple always approaches new technology.


Apple Intelligence: AI done the Apple way

First, while iOS 18, macOS Sequoia and iPadOS 18 all make it possible for users to tap into ChatGPT’s knowledge, the external AI is not tightly integrated into Apple’s platforms per se. Using ChatGPT remains entirely opt-in and OpenAI doesn’t gain even temporary access to any data that the user doesn’t explicitly share.


Second, artificial intelligence is clearly the technology of the moment. As usual, Apple took a conservative approach to incorporating it into its products, with the user’s experience and privacy top of mind.


Years ago, Apple co-founder Steve Jobs wisely said that to design great tech products, you need to begin with the user experience, not the trendy technology of the day. “You’ve got to start with the customer experience and work backward to the technology,” he said in 1997. “You can’t start with the technology then try to figure out where to sell it.”


‘AI for the rest of us’

Apple clearly demonstrated its commitment to this rock-solid design principle while introducing Apple Intelligence and the raft of new capabilities it will bring to iPhones, iPads and Macs.


All too often, the tech industry seems to do the opposite of what Apple is doing. AI companies come up with cool AI features, then figure out how to sell them. Apple took the extra step of figuring out how to use AI features to do things the customer might actually desire, and it did so while building in elaborate safeguards to protect users’ data at all costs.


In the end, it comes down to a basic question: Who do you trust to do AI right?


As you probably know, Musk has a couple of horses in this race. The Tesla CEO runs X (formerly known as Twitter) and xAI, which developed a competing AI model called Grok, built using Twitter data. Musk also co-founded OpenAI, but left the company (and later sued it) after a disagreement over the direction of its AI development.


Apple boasts a long record of putting user privacy first, even if that means abstaining from the latest buzzy tech until the company finds a way to implement it in an appropriate and safe way.


When it comes to a company using AI in a privacy-focused way, my money’s on Cupertino.




Source: Cult of Mac