[Resource Topic] 2023/493: Force: Making 4PC > 4 × PC in Privacy Preserving Machine Learning on GPU

Welcome to the resource topic for 2023/493

Title:
Force: Making 4PC > 4 × PC in Privacy Preserving Machine Learning on GPU

Authors: Yufan Jiang, Yong Li

Abstract:

Tremendous efforts have been made to improve the efficiency of secure Multi-Party Computation (MPC), which allows n ≥ 2 parties to jointly evaluate a target function without leaking their own private inputs. Previous research has confirmed that 3-Party Computation (3PC) and outsourcing computations to GPUs can lead to huge performance improvements for MPC in computationally intensive tasks such as Privacy-Preserving Machine Learning (PPML). A natural question to ask is whether super-linear performance gain is possible for a linear increase in resources. In this paper, we give an affirmative answer to this question.

We propose Force, an extremely efficient 4PC system for PPML. To the best of our knowledge, each party in Force enjoys the least number of local computations and the lowest data exchange between parties. This is achieved by introducing a new sharing type, X-share, along with MPC protocols for privacy-preserving training and inference that are semi-honest secure with an honest majority. Our contribution does not stop at theory: we also propose engineering optimizations and verify the high performance of the protocols with implementation and experiments. Comparing the results with state-of-the-art systems such as Cheetah, Piranha, CryptGPU, and CrypTen, we show that Force is sound and extremely efficient, improving PPML performance by a factor of 2 to 1200 over the latest 2PC, 3PC, and 4PC systems.
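The abstract does not define the X-share construction itself, but for readers new to MPC, a minimal sketch of plain additive secret sharing over a ring (a common building block in GPU-based PPML systems such as those compared above; the modulus and party count here are illustrative assumptions, not Force's actual parameters) shows what "sharing" a private input among n parties means:

```python
import secrets

MOD = 2**32  # illustrative ring Z_{2^32}; real systems pick their own modulus


def share(x: int, n: int = 4) -> list[int]:
    """Split secret x into n additive shares that sum to x mod MOD.

    Each of the first n-1 shares is uniformly random, so any proper
    subset of shares is statistically independent of the secret.
    """
    shares = [secrets.randbelow(MOD) for _ in range(n - 1)]
    shares.append((x - sum(shares)) % MOD)
    return shares


def reconstruct(shares: list[int]) -> int:
    """Recover the secret by summing all shares mod MOD."""
    return sum(shares) % MOD


# Example: distribute a private value among 4 parties and recover it.
secret = 123456
shares = share(secret, n=4)
assert reconstruct(shares) == secret
```

Additive shares are also linearly homomorphic: parties can add their shares of two secrets locally, without communication, to obtain shares of the sum; the communication savings claimed by systems like Force come from how the more expensive non-linear operations are handled.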

ePrint: https://eprint.iacr.org/2023/493

See all topics related to this paper.

Feel free to post resources that are related to this paper below.

Example resources include: implementations, explanation materials, talks, slides, links to previous discussions on other websites.

For more information, see the rules for Resource Topics.