[Resource Topic] 2023/1320: Practical Privacy-Preserving Machine Learning using Fully Homomorphic Encryption

Welcome to the resource topic for 2023/1320

Title: Practical Privacy-Preserving Machine Learning using Fully Homomorphic Encryption

Authors: Michael Brand, Gaëtan Pradel

Abstract:

Machine learning is a widely-used tool for analysing large datasets, but increasing public demand for privacy preservation and the corresponding introduction of privacy regulations have severely limited what data can be analysed, even when this analysis is for societal benefit.
Homomorphic encryption, which allows computation on encrypted data, is a natural solution to this dilemma, allowing data to be analysed without sacrificing privacy.
Because homomorphic encryption is computationally expensive, however, current solutions are mainly restricted to using it for inference rather than training.
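As a self-contained illustration of what "computation on encrypted data" means, here is a toy example using a simplified Paillier scheme. Note that Paillier is only additively homomorphic, not fully homomorphic, and is not the scheme used in the paper; the parameters below are deliberately tiny and insecure, purely to show that a server can add and scale values it cannot read.

```python
# Toy additively homomorphic encryption (simplified Paillier, NOT the FHE
# scheme from the paper): the server combines ciphertexts without ever
# seeing the plaintexts. Parameters are far too small to be secure.
import math
import random

p, q = 1_000_003, 1_000_033           # toy primes (illustration only)
n = p * q
n2 = n * n
lam = math.lcm(p - 1, q - 1)          # Carmichael function lambda(n)
mu = pow(lam, -1, n)                  # modular inverse of lambda mod n

def encrypt(m: int) -> int:
    """c = (1 + n)^m * r^n mod n^2, with random r."""
    r = random.randrange(1, n)
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    """m = L(c^lambda mod n^2) * mu mod n, where L(x) = (x - 1) // n."""
    return (((pow(c, lam, n2) - 1) // n) * mu) % n

# The data owner encrypts its values...
c1, c2 = encrypt(42), encrypt(100)

# ...and an untrusted server computes on ciphertexts only:
c_sum = (c1 * c2) % n2                # decrypts to 42 + 100
c_scaled = pow(c1, 3, n2)             # decrypts to 3 * 42

print(decrypt(c_sum))                 # 142
print(decrypt(c_scaled))              # 126
```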

In this work, we present a practically viable approach to privacy-preserving machine learning training using fully homomorphic encryption.
Our method achieves fast training speeds, taking less than 45 seconds to train a binary classifier over thousands of samples on a single mid-range computer, significantly outperforming state-of-the-art results.
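For intuition on what FHE-friendly training typically looks like, the sketch below trains a toy binary classifier using only additions and multiplications, the operations homomorphic schemes support natively; the sigmoid is replaced by a low-degree polynomial. This is a generic plaintext illustration of the general approach, not the paper's algorithm, and the polynomial coefficients are illustrative; under FHE, the data and weights would be ciphertexts.

```python
# Minimal plaintext sketch of an FHE-friendly training loop: every step is a
# sum or product, so the same circuit could in principle be evaluated over
# ciphertexts. Not the paper's method; data and coefficients are synthetic.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 8))                       # features (would be encrypted)
y = (X @ rng.normal(size=8) + 0.1 * rng.normal(size=1000) > 0).astype(float)

def poly_sigmoid(z):
    # Low-degree polynomial stand-in for the sigmoid (illustrative coefficients).
    return 0.5 + 0.197 * z - 0.004 * z**3

w = np.zeros(X.shape[1])
lr = 0.1
for _ in range(20):                                  # a handful of gradient steps
    preds = poly_sigmoid(X @ w)                      # only + and * -> FHE-friendly
    grad = X.T @ (preds - y) / len(y)
    w -= lr * grad                                   # weight update

accuracy = np.mean((poly_sigmoid(X @ w) > 0.5) == y.astype(bool))
print(f"toy accuracy: {accuracy:.2f}")
```

The polynomial replacement matters because FHE schemes evaluate additions and multiplications but not exact non-linear functions, and keeping the polynomial degree low keeps the multiplicative depth, and hence the cost, manageable.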

ePrint: https://eprint.iacr.org/2023/1320

See all topics related to this paper.

Feel free to post resources that are related to this paper below.

Example resources include: implementations, explanation materials, talks, slides, links to previous discussions on other websites.

For more information, see the rules for Resource Topics.