Welcome to the resource topic for 2025/2020
Title: VerfCNN: Optimal Complexity zkSNARK for Convolutional Neural Networks
Authors: Wenjie Qu, Yanpei Guo, Yue Ying, Jiaheng Zhang
Abstract: With the widespread deployment of machine learning services, concerns about potential misconduct by service providers have emerged. Providers may deviate from their promised methodologies when delivering their services, undermining customer trust. Zero-knowledge proofs (ZKPs) offer a promising solution for customers to verify service integrity while preserving the intellectual property of the model weights. However, existing ZKP systems for convolutional neural networks (CNNs) impose significant computational overhead on the prover, hindering their practical deployment.
To address this challenge and facilitate real-world deployment of ZKPs for CNNs, we introduce VerfCNN, a novel and efficient ZKP system for CNN inference. The core innovation of VerfCNN lies in a specialized protocol for proving multi-channel convolutions, achieving optimal prover complexity that matches the I/O size of the convolution. Our design significantly reduces the prover overhead for verifiable CNN inference. Experiments on VGG-16 demonstrate that our system achieves a prover time of just 12.6 seconds, offering a 6.7× improvement over zkCNN (CCS’21).
Remarkably, VerfCNN incurs only a 10× overhead compared to plaintext inference on CPU, whereas general-purpose zkSNARKs typically impose overheads exceeding 1000×. These results underscore VerfCNN’s strong potential to enhance the integrity and transparency of real-world ML services.
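To get a feel for the "optimal prover complexity" claim, here is an illustrative sketch (not code from the paper) that compares the I/O size of a multi-channel convolution, the lower bound the abstract says VerfCNN's prover matches, against the number of multiplications in the plaintext convolution itself. The layer shape is a hypothetical VGG-16-style layer; the function name and parameters are our own.

```python
def conv_io_size(c_in, h, w, c_out, k, stride=1, pad=1):
    """Elements read plus elements written by a multi-channel convolution,
    and the plaintext multiplication count, for comparison."""
    h_out = (h + 2 * pad - k) // stride + 1
    w_out = (w + 2 * pad - k) // stride + 1
    input_size = c_in * h * w            # elements the layer reads
    output_size = c_out * h_out * w_out  # elements the layer writes
    mults = output_size * c_in * k * k   # each output touches c_in*k*k inputs
    return input_size + output_size, mults

# Hypothetical VGG-16-style layer: 64 in/out channels, 224x224, 3x3 kernel
io, mults = conv_io_size(c_in=64, h=224, w=224, c_out=64, k=3)
print(io)     # I/O size: input elements plus output elements
print(mults)  # hundreds of times larger than the I/O size
```

The gap between the two numbers shows why a prover whose work scales with the I/O size, rather than with the multiplication count of the computation being proven, can be so much cheaper.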
ePrint: https://eprint.iacr.org/2025/2020
Feel free to post resources that are related to this paper below.
Example resources include: implementations, explanation materials, talks, slides, links to previous discussions on other websites.
For more information, see the rules for Resource Topics.