[Resource Topic] 2025/424: Matchmaker: Fast Secure Inference across Deployment Scenarios

Welcome to the resource topic for 2025/424

Title:
Matchmaker: Fast Secure Inference across Deployment Scenarios

Authors: Neha Jawalkar, Nishanth Chandran, Divya Gupta, Rahul Sharma, Arkaprava Basu

Abstract:

Secure Two-Party Computation (2PC) enables secure inference with cryptographic guarantees that protect the privacy of the model owner and client. However, it adds significant performance overhead. In this work, we make 2PC-based secure inference efficient while considering important deployment scenarios.
We observe that the hitherto-unconsidered latency of fetching keys from storage significantly impacts performance, as does network speed. We design a Linear Secret Sharing (LSS)-based system, LSS^M, and a Function Secret Sharing (FSS)-based system, FSS^M, for secure inference, optimized for small key size and low communication, respectively. Notably, our highly optimized, hardware-aware CPU-based LSS^M outperforms prior GPU-based LSS systems by up to 50×. We then show that the best choice between LSS^M and FSS^M depends on the deployment scenario.
In fact, under certain deployments, a combination of LSS^M and FSS^M can leverage heterogeneous processing across the CPU and GPU. Such protocol-system co-design lets us outperform state-of-the-art secure inference systems by up to 21× (geomean 3.25×).
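As background for the LSS approach the abstract mentions, here is a minimal sketch (not the paper's protocol) of two-party additive secret sharing over the ring Z_{2^64}, using NumPy's wrapping uint64 arithmetic. It illustrates why linear layers are cheap under LSS: because W·(x0 + x1) = W·x0 + W·x1 (mod 2^64), each party can apply the weights to its share locally with no interaction. For simplicity the weight matrix is public here; in real secure inference the model weights would also be protected.

```python
import numpy as np

def share(x, rng):
    """Additively secret-share a uint64 vector x between two parties.
    Returns (x0, x1) with x0 + x1 == x (mod 2^64); each share alone
    is uniformly random and reveals nothing about x."""
    x0 = rng.integers(0, 2**64, size=x.shape, dtype=np.uint64)
    x1 = x - x0  # uint64 subtraction wraps mod 2^64
    return x0, x1

def linear_layer_local(W, x_share):
    """One party's local evaluation of a public linear layer on its
    share. Linearity means no communication is needed for this step."""
    return (W @ x_share).astype(np.uint64)

def reconstruct(y0, y1):
    """Combine the two output shares: y0 + y1 (mod 2^64)."""
    return y0 + y1

rng = np.random.default_rng(0)
x = np.array([3, 5], dtype=np.uint64)          # client's private input
W = np.array([[1, 2], [3, 4]], dtype=np.uint64)  # public weights (toy)

x0, x1 = share(x, rng)
y = reconstruct(linear_layer_local(W, x0), linear_layer_local(W, x1))
print(y)  # equals W @ x, i.e. [13 29]
```

Nonlinear layers (e.g., ReLU) are where the two paradigms diverge: LSS protocols pay in interaction rounds, while FSS protocols pay in correlated-key material that must be fetched from storage, which is precisely the trade-off the paper's deployment-dependent choice between LSS^M and FSS^M targets.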

ePrint: https://eprint.iacr.org/2025/424

See all topics related to this paper.

Feel free to post resources that are related to this paper below.

Example resources include: implementations, explanation materials, talks, slides, links to previous discussions on other websites.

For more information, see the rules for Resource Topics.