Abstract

Isolation of processes within operating systems enables portions of an application to run in a secure, privacy-compliant sandbox. However, isolated processes are required to be stateless, which implies that no data can be stored, even for valid use cases such as federated learning. This disclosure describes techniques to perform federated learning within secure isolated data-processing environments and to share the results of federated learning via provably private means. A trusted trainer is deployed within the same secure sandbox as a trusted processor. The trusted processor requests that the operating system store data in a secure, private cache that is inaccessible to the trusted processor and to the host application. The trusted trainer runs federated learning on the data in the secure, private cache. The trusted trainer shares the results of its computation (e.g., updated machine-learning models) via provably private techniques, without compromising the security, confidentiality, or identity of the training data.
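To make the described data flow concrete, the following is a minimal Python sketch of the architecture: a trusted processor that can write examples into a private cache but never read them back, a trusted trainer that is the only party allowed to read the cache, and a noisy release step standing in for the "provably private" sharing of results. All names here (SecureCache, TrustedProcessor, TrustedTrainer) are hypothetical and not part of the original disclosure, and the noise-addition step is one illustrative example of a provably private technique, not the specific mechanism the disclosure assumes.

import random

class SecureCache:
    """Hypothetical stand-in for the OS-managed private cache:
    write-only for the trusted processor, readable only by the
    trusted trainer, invisible to the host application."""
    def __init__(self):
        self._records = []  # held by the OS, never exposed to the host app

    def append(self, record):
        # The trusted processor may write but never read back.
        self._records.append(record)

    def read_all(self, caller):
        # Only the trusted trainer may read the cached data.
        if not isinstance(caller, TrustedTrainer):
            raise PermissionError("cache is inaccessible to this caller")
        return list(self._records)

class TrustedProcessor:
    """Runs inside the sandbox; stores the data it processes
    into the secure, private cache."""
    def __init__(self, cache):
        self._cache = cache

    def process(self, example):
        # ... application-specific processing would happen here ...
        self._cache.append(example)

class TrustedTrainer:
    """Runs federated learning on the cached data and releases
    only a privatized model update."""
    def __init__(self, cache, clip_norm=1.0, noise_scale=0.5):
        self._cache = cache
        self._clip = clip_norm
        self._noise = noise_scale

    def train_and_release(self, weight):
        data = self._cache.read_all(self)
        # Toy "training": one gradient step for a 1-D least-squares
        # model y ~ weight * x, averaged over the cached examples.
        grads = [2 * (weight * x - y) * x for x, y in data]
        clipped = [max(min(g, self._clip), -self._clip) for g in grads]
        update = -sum(clipped) / len(clipped)
        # Illustrative provably private release: add noise calibrated
        # to the clipping bound before the update leaves the sandbox,
        # so the raw training data is never exposed.
        return update + random.gauss(0.0, self._noise * self._clip)

cache = SecureCache()
processor = TrustedProcessor(cache)
for x in range(1, 6):
    processor.process((x, 3.0 * x))  # data the host app never sees

trainer = TrustedTrainer(cache)
print("private model update:", trainer.train_and_release(weight=1.0))

In an actual deployment, the cache access control would be enforced by the operating system rather than by in-process checks, and the release step would use a formally analyzed mechanism (e.g., the Gaussian mechanism with (ε, δ) accounting) rather than the ad hoc noise shown here.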

Creative Commons License

This work is licensed under a Creative Commons Attribution 4.0 License.
