[Resource Topic] 2024/1471: Communication Efficient Secure and Private Multi-Party Deep Learning

Welcome to the resource topic for 2024/1471

Title:
Communication Efficient Secure and Private Multi-Party Deep Learning

Authors: Sankha Das, Sayak Ray Chowdhury, Nishanth Chandran, Divya Gupta, Satya Lokam, Rahul Sharma

Abstract:

Distributed training that enables multiple parties to jointly train a model on their respective datasets is a promising approach to address the challenges of large volumes of diverse data for training modern machine learning models. However, this approach immediately raises security and privacy concerns: each party wishes to protect its data from the other parties during training, and private information must not leak from the model after training through various inference attacks. In this paper, we address both concerns simultaneously by designing efficient Differentially Private, secure Multiparty Computation (DP-MPC) protocols for jointly training a model on data distributed among multiple parties. Our DP-MPC protocol in the two-party setting is 56-794× more communication-efficient and 16-182× faster than previous such protocols. Conceptually, our work simplifies and improves on previous attempts to combine techniques from secure multiparty computation and differential privacy, especially in the context of ML training.
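As a starting resource for readers new to the MPC side of DP-MPC, here is a toy sketch of additive secret sharing over a prime field, the kind of primitive that lets parties compute on a joint sum without revealing individual inputs. This is only an illustrative building block, not the paper's protocol; the field modulus and two-party structure are assumptions for the example.

```python
import secrets

# Toy additive secret sharing over a prime field -- an illustrative
# MPC building block, NOT the paper's DP-MPC protocol.
P = 2**61 - 1  # field modulus (a Mersenne prime, chosen for the example)

def share(x, n=2):
    """Split integer x (mod P) into n additive shares."""
    shares = [secrets.randbelow(P) for _ in range(n - 1)]
    shares.append((x - sum(shares)) % P)
    return shares

def reconstruct(shares):
    """Recover the secret by summing the shares mod P."""
    return sum(shares) % P

# Each party secret-shares its private value; additions are done
# share-by-share, so only the final sum is ever reconstructed.
a_shares = share(42)
b_shares = share(100)
sum_shares = [(sa + sb) % P for sa, sb in zip(a_shares, b_shares)]
print(reconstruct(sum_shares))  # 142
```

In a DP-MPC training protocol, calibrated noise would additionally be added before any value is opened, so the reconstructed result satisfies differential privacy; the paper's contribution is making such combined protocols communication-efficient.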

ePrint: https://eprint.iacr.org/2024/1471

See all topics related to this paper.

Feel free to post resources that are related to this paper below.

Example resources include: implementations, explanation materials, talks, slides, links to previous discussions on other websites.

For more information, see the rules for Resource Topics.