Welcome to the resource topic for 2024/2008
Title: PrivCirNet: Efficient Private Inference via Block Circulant Transformation
Authors: Tianshi Xu, Lemeng Wu, Runsheng Wang, Meng Li
Abstract: Homomorphic encryption (HE)-based deep neural network (DNN) inference protects data and model privacy but suffers from significant computation overhead. We observe that transforming the DNN weights into circulant matrices converts general matrix-vector multiplications into HE-friendly 1-dimensional convolutions, drastically reducing the HE computation cost. Hence, in this paper, we propose PrivCirNet, a protocol/network co-optimization framework based on block circulant transformation. At the protocol level, PrivCirNet customizes the HE encoding algorithm so that it is fully compatible with the block circulant transformation and reduces the computation latency in proportion to the block size. At the network level, we propose a latency-aware formulation to search for the layer-wise block size assignment based on second-order information. PrivCirNet also leverages layer fusion to further reduce the inference cost. We compare PrivCirNet with the state-of-the-art HE-based framework Bolt (IEEE S&P 2024) and the HE-friendly pruning method SpENCNN (ICML 2023). For ResNet-18 and Vision Transformer (ViT) on Tiny ImageNet, PrivCirNet reduces latency by 5.0× and 1.3× at iso-accuracy over Bolt, respectively, and improves accuracy by 4.1% and 12% over SpENCNN, respectively. For MobileNetV2 on ImageNet, PrivCirNet achieves 1.7× lower latency and 4.2% better accuracy over Bolt and SpENCNN, respectively.
Our code and checkpoints are available on GitHub at Tianshi-Xu/PrivCirNet, the [NeurIPS'24] official implementation of "PrivCirNet: Efficient Private Inference via Block Circulant Transformation".
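The abstract's key observation is that a circulant matrix-vector product is exactly a 1-D circular convolution, which can be evaluated pointwise in the frequency domain. The toy sketch below (my own illustration in NumPy, not code from the PrivCirNet repository, and with no HE involved) checks this equivalence for a small circulant matrix:

```python
import numpy as np

def circulant(c):
    """Build a circulant matrix from its first column c: each
    column is a cyclic shift (by one) of the previous column."""
    n = len(c)
    return np.column_stack([np.roll(c, k) for k in range(n)])

def circulant_matvec_fft(c, x):
    """Compute circulant(c) @ x as a circular convolution via the
    FFT convolution theorem: O(n log n) instead of O(n^2)."""
    return np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)))

c = np.array([1.0, 2.0, 3.0, 4.0])   # first column defines the matrix
x = np.array([5.0, 6.0, 7.0, 8.0])
dense = circulant(c) @ x             # general matrix-vector product
fast = circulant_matvec_fft(c, x)    # same result via 1-D convolution
assert np.allclose(dense, fast)
```

In the block circulant setting described by the paper, each weight block is circulant, so every block's matvec collapses to one such convolution; this structure is what the customized HE encoding in PrivCirNet exploits, with latency shrinking in proportion to the block size.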
ePrint: https://eprint.iacr.org/2024/2008
See all topics related to this paper.
Feel free to post resources that are related to this paper below.
Example resources include: implementations, explanation materials, talks, slides, links to previous discussions on other websites.
For more information, see the rules for Resource Topics.