Welcome to the resource topic for 2025/1534
Title: RBOOT: Accelerating Homomorphic Neural Network Inference by Fusing ReLU within Bootstrapping
Authors: Zhaomin Yang, Chao Niu, Benqiang Wei, Zhicong Huang, Cheng Hong, Tao Wei
Abstract: A major bottleneck in secure neural network inference using Fully Homomorphic Encryption (FHE) is the evaluation of non-linear activation functions like ReLU, which are inefficient to compute under FHE. State-of-the-art solutions approximate ReLU using high-degree polynomials, incurring significant computational overhead. We propose novel methods for functional bootstrapping with CKKS, and based on these methods we present RBOOT, an optimized framework that seamlessly integrates ReLU evaluation into CKKS bootstrapping, significantly reducing multiplication depth and boosting efficiency. Our key insight is that the EvalMod step in CKKS bootstrapping is composed of trigonometric functions, which can be transformed into various common non-linear functions. By co-optimizing these components, we can exploit such non-linearity to construct ReLU (and other non-linear functions) within the bootstrapping process itself, greatly reducing the computation overhead. Results on four widely used CNN models show that RBOOT achieves 2.77× faster end-to-end inference and 81% lower memory usage compared to previous polynomial approximation works, while maintaining comparable accuracy.
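To build intuition for the abstract's key insight, here is a toy plaintext sketch of how compositions of trigonometric functions (the building blocks of EvalMod) can realize a sign function, and hence ReLU via ReLU(x) = x · (1 + sign(x)) / 2. This is not RBOOT's actual construction, which fuses the evaluation into the encrypted bootstrapping pipeline; the function names `approx_sign` and `approx_relu` and the iteration count are illustrative choices, not from the paper.

```python
import math

def approx_sign(x, iters=12):
    # Toy trig composition: iterating t -> sin(pi*t/2) drives any
    # nonzero t in [-1, 1] toward sign(x), since 0 is a repelling
    # fixed point (derivative pi/2 > 1) and +/-1 are attracting.
    t = x
    for _ in range(iters):
        t = math.sin(math.pi * t / 2)
    return t

def approx_relu(x):
    # ReLU(x) = x * (1 + sign(x)) / 2, with sign replaced by the
    # trigonometric approximation above.
    return x * (1 + approx_sign(x)) / 2
```

In actual CKKS bootstrapping, EvalMod approximates modular reduction with a scaled sine, so trig evaluations of this flavor are already being paid for; the paper's contribution is reshaping that step so a target non-linearity such as ReLU comes out of the bootstrapping for free, rather than spending extra multiplicative depth on a separate high-degree polynomial approximation.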
ePrint: https://eprint.iacr.org/2025/1534
See all topics related to this paper.
Feel free to post resources that are related to this paper below.
Example resources include: implementations, explanation materials, talks, slides, links to previous discussions on other websites.
For more information, see the rules for Resource Topics.