Why Apple Intelligence and Private Cloud Compute are so different and secure
Learn about the security and privacy controls in Apple Intelligence and how Apple’s revolutionary approach to designing the Private Cloud Compute infrastructure sets it apart from AI models hosted in cloud-based environments. Furthermore, understand how Apple succeeds where others are subject to common data security vulnerabilities and user privacy risks.

Introduction
Apple Intelligence is the personal intelligence system that harnesses the power of Apple silicon to understand and create language and images, take action across apps, and draw from personal context to simplify and accelerate everyday tasks, all while taking an extraordinary step forward for privacy in AI. For Apple users, on-device generative models open up endless possibilities for personal and professional use anywhere they go. For larger, more complex workloads, Apple Intelligence relies on large foundation models contained within Private Cloud Compute, or PCC, Apple’s revolutionary cloud intelligence system custom-designed specifically to process users’ AI requests privately.
Even with security and privacy serving as pillars of Apple’s design process for the hardware, software and services it creates, Mac admins and security professionals understandably have questions about the unknown risks and data security concerns common to adopting any cloud-based solution.
But before IT/Security teams make the decision to “turn off Apple Intelligence” as a knee-jerk reaction out of an abundance of caution for enterprise security, we urge administrators to take a closer look to:
- Understand what Apple Intelligence and PCC are
- Recognize what sets them apart from other cloud-based AI offerings
- Learn about the transparent security and privacy controls Apple has built in
In short, Apple took a “think different” approach to cloud-based computing by building a cloud intelligence system designed with power and performance at its core, while keeping the focus on user privacy and data security from end-to-end.
Common AI model security risks
Industry best practices and conventional wisdom may champion keeping data stored locally on devices to ensure it remains confidential and privacy remains, well, private. That said, changes in how and where work is performed have given rise to cloud-based technologies, like collaboration tools, storage and AI models. It’s critical for enterprise administrators to understand that the risks introduced by cloud-based solutions are inherent to how current cloud delivery architectures are designed.
To that point, IT/Security teams have every right to be concerned. Risks should be evaluated in order for organizations to uphold a strong security posture and maintain compliance. Some risks common to cloud-based solutions are:
- Lack of visibility into supply-chain attestation of the entire stack
- Hardware and software misconfigurations that introduce unknown vulnerabilities
- Accidental data exposure to unauthorized third parties through data commingling
- Insecure APIs that fail to sanitize inputs or encrypt data, leading to cross-platform vulnerabilities
- Privileged access credentials targeted by threat actors because restrictions are difficult to enforce
- Service providers that mine private information from data shared with AI models for monetization
- Privacy and security implications of sharing data in uncontrolled environments
What makes Apple Intelligence different?
For starters, Apple Intelligence is designed with privacy and security first. By default, the service — and the infrastructure it runs on — supports the core design tenets by placing control over data in the hands of users from end to end.
But there is so much more that goes into ensuring that Apple Intelligence and Private Cloud Compute operate safely and securely.
Supply chain managed from end-to-end
As the designer and manufacturer of the hardware and software that power all the devices and services in its ecosystem, Apple ensures strict adherence to privacy and security, unlike competing products that outsource various components, chipsets and software to third-party manufacturers. This minimizes the risks of sharing confidential data with uncontrolled environments, and applies not only to Apple Intelligence on-device but also when data requires complex processing via PCC.
The architecture behind Private Cloud Compute is designed by Apple. Because Apple Intelligence and PCC run on Apple hardware and software, the entire stack is not only verifiable but also affords unprecedented control across the entire workflow. To summarize, it’s a fully integrated, end-to-end solution, all the way from silicon to cloud services: from device request to transmission of data to processing and, finally, delivery of results.
Users are always in control of their data
Since Apple Intelligence is designed for on-device processing by default, data remains safely within a user’s device for smaller requests. Complex requests sent for processing by large foundation models receive the same level of security and privacy controls. They extend from on-device to Private Cloud Compute to ensure that user data:
- is used exclusively for the purpose of fulfilling the user’s request
- is never available to anyone other than the user, at any time
- cannot be accessed, even during active processing, by Apple staff or administrators
Independent assessment and verification by third parties
The door is open to verify Apple’s security and privacy claims for Apple Intelligence and PCC. In their own words, “security researchers must be able to verify the security and privacy guarantees of Private Cloud Compute, and they must be able to verify that the software that’s running in the PCC production environment is the same as the software they inspected when verifying the guarantees.” Security researchers will have access to an identical image of the software that drives PCC to audit and verify independently of Apple, providing attestation of Apple Intelligence’s security and privacy claims.
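The verification model described above can be sketched in a few lines. This is an illustrative stand-in, not Apple’s actual attestation protocol: the dictionary of published measurements and the function name below are hypothetical, standing in for the transparency log and client-side checks Apple describes.

```python
import hashlib

# Hypothetical table of published PCC release-image measurements.
# In reality these would come from a public transparency log that
# researchers have independently inspected, not a hard-coded dict.
PUBLISHED_MEASUREMENTS = {
    "pcc-release-1.0": hashlib.sha256(b"pcc-image-1.0").hexdigest(),
}

def node_attests_known_release(attested_hash: str) -> bool:
    """Accept a node only if its attested software hash matches a
    measurement of software researchers were able to inspect."""
    return attested_hash in PUBLISHED_MEASUREMENTS.values()

# A client would refuse to send requests to any node whose
# attestation does not match an inspected release image.
good = hashlib.sha256(b"pcc-image-1.0").hexdigest()
bad = hashlib.sha256(b"tampered-image").hexdigest()
```

The point of the design is that trust reduces to a hash comparison against publicly auditable software, rather than a promise from the provider.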
Multiple security controls mean no one — not even Apple — can read your data
Processing requests made on-device keeps data secure while privacy is upheld by virtue of the data being contained within the device itself. Because data never leaves the device, the risk of exposure is mitigated.
What about when requests require processing by large language models hosted outside the confines of the device? Typically, cloud-connected services like iMessage employ end-to-end encryption to secure data from one device to another over a network. However, because AI models require unfiltered access to the data they’re processing, end-to-end encryption cannot be used in the same manner to keep data privacy and confidentiality intact.
To address this shortcoming, Apple has evaluated each phase of the process and redesigned it into a workflow that ensures data security and privacy from request to delivery of results. Some of these controls are:
- Requests are encrypted on the user’s device using the public keys of the PCC nodes, after those nodes are confirmed valid and cryptographically certified.
- End-to-end encryption is used between the device and the validated PCC node, ensuring that requests cannot be read by anyone other than the PCC node during transit.
- Keys are not shared or made available to other equipment within the infrastructure, such as those providing data center services running outside the trust boundary, enforcing security and privacy guarantees.
- On the PCC nodes themselves, Secure Boot and Code Signing are used to limit access to any keys used to decrypt user requests to the specific PCC node authorized and cryptographically verified to handle your request.
- All code that can run on the node must be part of a trust cache that has been signed by Apple, approved for that specific PCC node and loaded by the Secure Enclave.
- Code cannot be changed or amended at runtime, and new code mappings cannot be created, preventing compilation or injection of new code at runtime.
- The same integrity protection that powers the Signed System Volume is used to secure all code and language model assets.
- Keys are stored within the Secure Enclave, enforcing the guarantee that keys used to decrypt requests cannot be duplicated or extracted.
See Apple’s documentation for additional details on these controls.
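To make the flow of these controls concrete, here is a minimal sketch of the client-side logic: the device verifies a node’s attestation before sealing a request to that node, and only that node. Everything here is an illustrative stand-in — the names are invented and the XOR “encryption” is a toy; Apple’s actual protocol uses real public-key cryptography.

```python
import hashlib
import secrets

def verify_attestation(node: dict) -> bool:
    # Stand-in check: the node's reported measurement must match
    # the measurement bound to its published key.
    return node["measurement"] == hashlib.sha256(node["public_key"]).hexdigest()

def encrypt_request(plaintext: bytes, node: dict) -> bytes:
    """Refuse to send anything to a node that fails attestation;
    otherwise seal the request so only that node can read it."""
    if not verify_attestation(node):
        raise ValueError("refusing to send: node failed attestation")
    # Toy XOR keystream derived from this node's key; a real client
    # would seal the request to the node's public key instead.
    key = hashlib.sha256(node["public_key"] + b"session").digest()
    return bytes(p ^ key[i % len(key)] for i, p in enumerate(plaintext))

pubkey = secrets.token_bytes(32)
pcc_node = {"public_key": pubkey,
            "measurement": hashlib.sha256(pubkey).hexdigest()}
ciphertext = encrypt_request(b"summarize my notes", pcc_node)
```

The essential property being modeled is the ordering: attestation is checked first, and encryption is bound to the verified node, so an unverified node never sees readable user data.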
Native, trusted processes ensure secure access to and processing of data every time
How does Apple ensure that data remains private and secure? Stateless computation.
Designing and managing the entire PCC infrastructure allows Apple to abide by strict security and privacy standards before a request is made, during data processing and after the results have been delivered to the user.
- Beginning with the Apple device itself, data is stored and requests are created on-device using only software and hardware that is cryptographically verified by Apple.
- Requests are transmitted to the PCC compute node, which is used exclusively for the purpose of fulfilling the user’s request.
- PCC compute nodes have technical enforcement for the privacy of user data during processing and only perform the operations requested by the user.
- Upon fulfilling a user’s request, PCC deletes the user’s data. The ability to write to persistent storage is removed from PCC compute nodes to prevent user data from being stored.
- User data is not retained in any form after the response is returned; this includes capture by logging or debugging tools.
In short, Apple created a strong form of stateless data processing where personal data leaves no trace in the PCC system.
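The stateless guarantee can be illustrated with a toy handler that processes a request entirely in memory and retains nothing afterward. This is a conceptual sketch under invented names, not Apple’s implementation; the `upper()` call merely stands in for model inference.

```python
def handle_request(payload: str) -> str:
    """Process a request entirely in memory and return only the
    result; locals are discarded when the function returns."""
    result = payload.upper()  # stand-in for model inference
    # No writes to disk, no logging of the payload.
    return result

class StatelessNode:
    def __init__(self):
        self.storage = []  # deliberately never written to

    def serve(self, payload: str) -> str:
        return handle_request(payload)

node = StatelessNode()
response = node.serve("hello")
# After serving, the node holds no record of the request.
```

The difference from a conventional cloud service is architectural: in the PCC model there is simply no code path that persists user data, rather than a policy promising not to use one.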
Your physical device is the lock and the key — not weak passwords
One key target for threat actors is access credentials. From a security standpoint, why spend countless resources trying to exploit a vulnerability on a device when a social engineering campaign can yield a user’s credentials with less effort and greater success rates? Not to mention that credential-based access restrictions are difficult to monitor, making data security and user privacy even harder to manage.
With Apple Intelligence and Private Cloud Compute, your Apple hardware device is both the lock and key that keeps your data and privacy safeguarded — not passwords that can be brute-forced or obtained through other means. For example, an iPhone with Face ID enabled and a strong passcode not only keeps data stored within the volume safe through encryption but also prevents unauthorized users from creating requests to Apple Intelligence, unlike cloud-based AI models accessible by anyone from anywhere so long as credentials are authenticated.
Layered security controls mitigate vulnerabilities at every turn
The core of the security and privacy guarantees provided by Apple rests in the design, implementation and supply chain of the underlying infrastructure that drives Apple Intelligence. Private Cloud Compute uses Apple-designed hardware and software, allowing Apple to maintain security and privacy guarantees that extend holistically across the entire workflow — from the initial request on a user’s device to network transmission to processing within PCC to delivery of results.
This tight integration allows greater and more granular management over data security and user privacy while minimizing the introduction of vulnerabilities, such as misconfigurations stemming from unknown variables of using first and third-party hardware and software solutions.
Another critical concern that Apple has addressed by creating its own architecture is the lack of visibility into a provider’s supply chain, which forces organizations to implicitly trust that solutions are secure without the ability to verify claims made by providers. To address this, Apple designed PCC for transparency, inviting security researchers to assess risk and evaluate PCC. By pulling back the veil on PCC, independent researchers and organizations alike are able to verify the data security and user privacy guarantees of Apple Intelligence for themselves to a degree that cloud-based providers simply do not offer.
What is the benefit of performing your own risk assessment of Apple Intelligence? Simply put, by verifying Apple Intelligence’s guarantees and comparing the evaluations to their own risk assessments, enterprises take on far less risk than with other forms of AI and cloud computing platforms.
Key Takeaways
- Apple manages the supply chain for Apple Intelligence and Private Cloud Compute — relying on Apple-designed hardware and software — not third-party solutions that introduce unknown vulnerabilities
- Apple Intelligence processes requests on-device, giving users the utmost control over their data and privacy
- PCC is an extension of physical devices. As far as security and privacy are concerned, only the processing power differs; the guardrails remain the same
- There is no delineation between work data and personal data — data stewardship extends equally to all data types
- Apple includes keys that IT may configure to turn off individual application capabilities that rely on Apple Intelligence and PCC
- With PCC, Apple has opened the doors, making it available for researchers and enterprises to perform risk assessments and verify its security and privacy guarantees independently
- No one — not even Apple or its staff — can read your data before, during or after processing
- Strict confidentiality and integrity policies require that users’ data only be used for processing their requests, and that code cannot be modified, injected or prevented from running
- Apple has designed PCC to use stateless computation, meaning that your data is never stored; once a request has been fulfilled, no trace of your data is left behind
- Data resides, and is transmitted and processed, only within the trust boundary between Apple hardware, software and PCC — your device is the key, not passwords that are weak or trivial to bypass
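As a side note on the IT-configurable keys mentioned in the takeaways, an MDM Restrictions payload might include entries like the following. The key names shown (e.g. `allowGenmoji`) are drawn from Apple’s Restrictions payload for recent OS releases; verify them against your MDM vendor’s current documentation before deploying.

```xml
<!-- Illustrative fragment of a Restrictions payload
     (com.apple.applicationaccess) disabling individual
     Apple Intelligence features via MDM. -->
<key>allowGenmoji</key>
<false/>
<key>allowImagePlayground</key>
<false/>
<key>allowWritingTools</key>
<false/>
```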
Don’t want to just take our word for it? That’s OK; Apple + Jamf is here to help enterprises secure and manage their devices.
Read detailed security research about Apple Intelligence and Private Cloud Compute before assessing both for yourself.