Apple creates Private Cloud Compute VM to let researchers find bugs
Apple created a Virtual Research Environment to let the public test the security of its Private Cloud Compute system, and released the source code for some “key components” to help researchers analyze the privacy and safety features of the architecture.
The company also seeks to improve the system’s security and has expanded its security bounty program to include rewards of up to $1 million for vulnerabilities that could compromise “the fundamental security and privacy guarantees of PCC.”
Private Cloud Compute (PCC) is a cloud intelligence system for complex AI processing of data from user devices in a way that does not compromise privacy.
This is achieved through end-to-end encryption, which ensures that personal data sent from Apple devices to PCC is accessible only to the user; not even Apple can observe it.
Shortly after Apple announced PCC, the company gave early access to select security researchers and auditors so they could verify the privacy and security promises for the system.
Virtual Research Environment
In a blog post today, Apple announced that access to PCC is now public, so anyone curious can inspect how it works and check whether it lives up to the company's claims.
The company makes available the Private Cloud Compute Security Guide, which explains the architecture and technical details of the components and the way they work.
Apple also provides a Virtual Research Environment (VRE), which replicates locally the cloud intelligence system and allows inspecting it as well as testing its security and hunting for issues.
“The VRE runs the PCC node software in a virtual machine with only minor modifications. Userspace software runs identically to the PCC node, with the boot process and kernel adapted for virtualization,” Apple explains, sharing documentation on how to set up the Virtual Research Environment on your device.
The VRE is available in the macOS Sequoia 15.1 Developer Preview and requires a device with Apple silicon and at least 16GB of unified memory.
The tools available in the virtual environment allow booting a PCC release in an isolated environment, modifying and debugging the PCC software for more thorough scrutiny, and performing inference against demonstration models.
To make it easier for researchers, Apple decided to release the source code for some PCC components that implement security and privacy requirements:
- The CloudAttestation project – responsible for constructing and validating the PCC node’s attestations.
- The Thimble project – includes the privatecloudcomputed daemon that runs on a user’s device and uses CloudAttestation to enforce verifiable transparency.
- The splunkloggingd daemon – filters the logs that can be emitted from a PCC node to protect against accidental data disclosure.
- The srd_tools project – contains the VRE tooling and can be used to understand how the VRE enables running the PCC code.
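The attestation pattern these components implement can be illustrated with a deliberately simplified sketch: a node publishes a signed measurement of the software it is running, and a client accepts it only if the signature verifies and the measurement appears in a public log of released builds. This is not Apple's actual protocol; all names below are hypothetical, and a shared-secret HMAC stands in for the hardware-rooted signatures a real PCC node uses.

```python
import hashlib
import hmac

# Illustrative only: real attestations are signed by hardware-backed
# keys, not a shared secret known to the verifier.
ATTESTATION_KEY = b"demo-key"

def measure(software_image: bytes) -> str:
    """Hash of the node's software image (its 'measurement')."""
    return hashlib.sha256(software_image).hexdigest()

def make_attestation(software_image: bytes) -> dict:
    """Node side: bind the measurement to a signature."""
    m = measure(software_image)
    sig = hmac.new(ATTESTATION_KEY, m.encode(), hashlib.sha256).hexdigest()
    return {"measurement": m, "signature": sig}

def verify_attestation(att: dict, transparency_log: set) -> bool:
    """Client side: check the signature, then check that the measured
    build is one that was publicly released (verifiable transparency)."""
    expected = hmac.new(ATTESTATION_KEY, att["measurement"].encode(),
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, att["signature"]):
        return False
    return att["measurement"] in transparency_log

image = b"pcc-node-release-1"
log = {measure(image)}  # transparency log of published releases
att = make_attestation(image)
print(verify_attestation(att, log))                             # True
print(verify_attestation(make_attestation(b"tampered"), log))   # False
```

A device would refuse to send request data to any node whose attestation fails either check, which is the guarantee the `privatecloudcomputed` daemon enforces on-device.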
Apple also incentivizes research with new PCC categories in its security bounty program for accidental data disclosure, external compromise from user requests, and physical or internal access.
The highest reward is $1 million for a remote attack on request data, which achieves remote code execution with arbitrary entitlements.
For showing how to obtain access to a user’s request data or sensitive info, a researcher can get a bounty of $250,000.
Demonstrating the same type of attack, but from the network with elevated privileges, comes with a payment between $50,000 and $150,000.
However, Apple says that it considers for rewards any issues that have a significant impact on PCC, even if they are outside the categories in its bug bounty program.
The company believes that its “Private Cloud Compute is the most advanced security architecture ever deployed for cloud AI compute at scale” but still hopes to improve it further in terms of security and privacy with the help of researchers.