A Look at Apple’s AI Data Security Measures

The fact that Apple is developing AI functionality for iOS 18 and macOS 15 is not news. When you upgrade your Mac, iPad, or iPhone later this year, Siri may sound more human-like, and you may gain the ability to create emojis from the content of your Messages conversations.
There have been rumours that Apple plans to offload much of this processing to the cloud, though other reports claim the company will run many of these capabilities on-device, at least on its latest hardware. Cloud processing is typical for the industry: most AI workloads run in the cloud today because they are so computationally intensive. That is also why companies keep investing in NPUs (neural processing units), the dedicated processors that handle AI tasks.
Microsoft launched a new AI-PC standard with its Copilot+ PC line, and Apple highlighted the powerful NPU in its new M4 chip earlier this year, though Apple has been shipping NPUs for years.
Running AI on Device is More Secure
Assuming an AI feature works correctly, you may not care whether it runs on your phone or in the cloud. But these features are inherently more secure when they run on-device. When companies move processing to the cloud, the service performing the work may need to decrypt your data first, which creates opportunities for exposure: to the company’s own employees, and to malicious third parties attempting to breach its cloud servers and steal customer data.
This is why I warn against giving most cloud-based AI services access to your data. It is already a problem with services like ChatGPT, whose servers store your conversations and use them to train the underlying model. Companies that prioritize user privacy, like Apple, often opt for on-device processing instead, because keeping data isolated on the device makes it much easier to guarantee it stays out of the wrong hands.
How Apple will use ‘Secure Enclave’ to Protect AI data
Though its latest devices should be more than capable of running the AI features it is developing, Apple may still have to rely on cloud servers, either to bring those features to older devices or to power features too demanding to run locally. Android Authority cites a story from The Information suggesting the company may have found a solution: the Secure Enclave.
Today, the Secure Enclave is built into the hardware of most Apple products. This dedicated component of the System on a Chip (SoC) stores your biometric data, encryption keys, and other sensitive information, and it protects that data even if the main CPU is compromised.
The Information reports that Apple is developing a cloud AI solution in which AI-related user data would be handled entirely by the Secure Enclaves of M2 Ultra and M4 Macs running in its server farms. Those server Macs would process each request while keeping the data encrypted from the outside world, then return the results to the user. This approach would protect users’ data while letting older devices benefit from Apple’s most advanced AI capabilities.
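The reported design boils down to a round trip in which the request stays encrypted everywhere except inside the trusted hardware. As a rough illustration of that shape only (this is not Apple’s actual protocol, the names are hypothetical, and a toy XOR cipher stands in for real cryptography), the flow looks something like this:

```python
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy keystream: SHA-256 in counter mode. Illustrative only -- not real crypto.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_cipher(key: bytes, nonce: bytes, data: bytes) -> bytes:
    # XOR the data with the keystream; applying it twice restores the plaintext.
    return bytes(a ^ b for a, b in zip(data, keystream(key, nonce, len(data))))

class ToyEnclave:
    """Stands in for the Secure Enclave on a server Mac: only the enclave
    holds the session key, so the surrounding host sees only ciphertext."""

    def __init__(self, session_key: bytes):
        self._key = session_key  # never leaves the enclave

    def handle(self, nonce: bytes, ciphertext: bytes) -> tuple[bytes, bytes]:
        request = xor_cipher(self._key, nonce, ciphertext)  # decrypt inside
        result = b"AI result for: " + request               # hypothetical AI work
        reply_nonce = secrets.token_bytes(16)
        return reply_nonce, xor_cipher(self._key, reply_nonce, result)

# Device side: encrypt the request, then send only ciphertext to the server.
session_key = secrets.token_bytes(32)  # agreed during a hypothetical handshake
enclave = ToyEnclave(session_key)

nonce = secrets.token_bytes(16)
ciphertext = xor_cipher(session_key, nonce, b"summarize my notes")

# Server side: the enclave processes the request and replies, still encrypted.
reply_nonce, encrypted_reply = enclave.handle(nonce, ciphertext)

# Back on the device: decrypt the result locally.
reply = xor_cipher(session_key, reply_nonce, encrypted_reply)
print(reply.decode())  # -> AI result for: summarize my notes
```

The point of the sketch is the trust boundary, not the cipher: everything the host machine and the network see is ciphertext, and plaintext exists only on the user’s device and inside the enclave.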
We won’t know whether this is Apple’s plan until the company announces its intentions at WWDC, and if Apple stays mum, we may never learn how it secures AI-related user data. Given Apple’s stated commitment to customer privacy, though, this method, or another that guarantees end-to-end encryption of data processed in the cloud, would be a prudent choice.