Welcome to the resource topic for 2019/338
Title: Garbled Neural Networks are Practical
Authors: Marshall Ball, Brent Carmer, Tal Malkin, Mike Rosulek, Nichole Schimanski
Abstract: We show that garbled circuits are a practical choice for secure evaluation of neural network classifiers. At the protocol level, we start with the garbling scheme of Ball, Malkin & Rosulek (ACM CCS 2016) for arithmetic circuits and introduce new optimizations for modern neural network activation functions. We develop fancy-garbling, the first implementation of the BMR16 garbling scheme along with our new optimizations, as part of a heavily optimized garbled-circuits tool that is driven by a TensorFlow classifier description. We evaluate our constructions on a wide range of neural networks. We find that our approach is up to 100x more efficient than straightforward boolean garbling (depending on the neural network). Our approach is also roughly 40% more efficient than DeepSecure (Rouhani et al., DAC 2018), the only previous garbled-circuit-based approach for secure neural network evaluation, which incorporates significant optimization techniques for boolean circuits. Furthermore, our approach is competitive with other non-garbled-circuit approaches for secure neural network evaluation.
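As a small piece of explanation material: the BMR16 scheme referenced in the abstract garbles arithmetic circuits by representing each wire value through its residues modulo a set of small primes (a CRT / mixed-modulus encoding), so that additions and multiplications act componentwise on residues. The sketch below only illustrates this plaintext CRT arithmetic; it is not the fancy-garbling API, and the particular moduli and helper names are assumptions made purely for illustration.

```python
# Illustrative sketch of CRT-style arithmetic (not the fancy-garbling API).
# BMR16-style arithmetic garbling works with residues modulo small primes;
# the moduli below are an arbitrary illustrative choice.

from math import prod

PRIMES = [2, 3, 5, 7, 11, 13]   # small moduli; their product bounds the plaintext range
MODULUS = prod(PRIMES)          # values are treated modulo this composite

def encode(x):
    """Split an integer into its CRT residues."""
    return [x % p for p in PRIMES]

def add(a, b):
    """Componentwise addition of residue vectors."""
    return [(x + y) % p for x, y, p in zip(a, b, PRIMES)]

def mul(a, b):
    """Componentwise multiplication of residue vectors."""
    return [(x * y) % p for x, y, p in zip(a, b, PRIMES)]

def decode(residues):
    """Recombine residues with the Chinese Remainder Theorem."""
    x = 0
    for r, p in zip(residues, PRIMES):
        n = MODULUS // p
        x = (x + r * n * pow(n, -1, p)) % MODULUS
    return x

# Example: 123 * 45 + 6 computed entirely on residue vectors.
result = decode(add(mul(encode(123), encode(45)), encode(6)))
assert result == (123 * 45 + 6) % MODULUS
```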
ePrint: https://eprint.iacr.org/2019/338
See all topics related to this paper.
Feel free to post resources that are related to this paper below.
Example resources include: implementations, explanation materials, talks, slides, links to previous discussions on other websites.
For more information, see the rules for Resource Topics.