[Resource Topic] 2023/503: Neural Network Quantisation for Faster Homomorphic Encryption

Welcome to the resource topic for 2023/503

Title: Neural Network Quantisation for Faster Homomorphic Encryption

Authors: Wouter Legiest, Jan-Pieter D'Anvers, Michiel Van Beirendonck, Furkan Turan, Ingrid Verbauwhede

Abstract:

Homomorphic encryption (HE) enables computation on encrypted data, which makes privacy-preserving neural network inference possible. One disadvantage of this technique is that it is several orders of magnitude slower than computation on unencrypted data. Neural networks are commonly trained in floating-point, while most homomorphic encryption libraries operate on integers, so the neural network must be quantised. A straightforward approach is to quantise to large integer sizes (e.g., 32-bit) to avoid large quantisation errors. In this work, we reduce the integer sizes of the networks, using quantisation-aware training, to allow more efficient computations. For the targeted MNIST architecture proposed by Badawi et al., we reduce the integer sizes by 33% without significant loss of accuracy, while for the CIFAR architecture we can reduce the integer sizes by 43%. Implementing the resulting networks under the BFV homomorphic encryption scheme using SEAL, we reduce the execution time of the MNIST neural network by 80% and that of the CIFAR neural network by 40%.
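As a rough illustration of the integer quantisation the abstract refers to (this is not the authors' code, which uses quantisation-aware training rather than post-hoc rounding), here is a minimal sketch of symmetric uniform quantisation of floating-point weights to signed b-bit integers, showing how the quantisation error shrinks as the integer size grows:

```python
import numpy as np

def quantise(weights: np.ndarray, bits: int):
    """Map float weights to signed `bits`-bit integers with a single scale.

    Illustrative sketch only: simple symmetric post-hoc rounding, not the
    quantisation-aware training used in the paper.
    """
    qmax = 2 ** (bits - 1) - 1               # e.g. 127 for 8-bit
    scale = np.max(np.abs(weights)) / qmax   # map the largest |weight| to qmax
    q = np.clip(np.round(weights / scale), -qmax - 1, qmax).astype(np.int64)
    return q, scale

def dequantise(q: np.ndarray, scale: float) -> np.ndarray:
    """Reconstruct approximate float weights from the integer representation."""
    return q.astype(np.float64) * scale

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w = rng.normal(size=1000)

    # Smaller integer sizes mean faster HE computation but larger rounding error.
    for bits in (8, 16, 32):
        q, scale = quantise(w, bits)
        err = np.max(np.abs(dequantise(q, scale) - w))
        print(f"{bits}-bit: max abs error {err:.2e}")
```

The trade-off this sketch makes visible is the one the paper optimises: shrinking the integer size speeds up the homomorphic computation, and quantisation-aware training is what keeps the accuracy loss small at those reduced sizes.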

ePrint: https://eprint.iacr.org/2023/503

See all topics related to this paper.

Feel free to post resources that are related to this paper below.

Example resources include: implementations, explanation materials, talks, slides, links to previous discussions on other websites.

For more information, see the rules for Resource Topics.