[Resource Topic] 2024/1812: Batching Adaptively-Sound SNARGs for NP

Welcome to the resource topic for 2024/1812

Title: Batching Adaptively-Sound SNARGs for NP

Authors: Lalita Devadas, Brent Waters, David J. Wu

Abstract:

A succinct non-interactive argument (SNARG) for NP allows a prover to convince a verifier that an NP statement x is true with a proof whose size is sublinear in the length of the traditional NP witness. Moreover, a SNARG is adaptively sound if the adversary can choose the statement it wants to prove after seeing the scheme parameters. Very recently, Waters and Wu (STOC 2024) showed how to construct adaptively-sound SNARGs for NP in the plain model from falsifiable assumptions (specifically, sub-exponentially-secure indistinguishability obfuscation, sub-exponentially-secure one-way functions, and polynomial hardness of discrete log).
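For readers new to the terminology, the following is a minimal sketch of the SNARG syntax and the adaptive soundness experiment described above; the algorithm names (Setup, Prove, Verify) and notation are generic and are not taken verbatim from the paper.

```latex
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Minimal sketch of SNARG syntax and adaptive soundness.
% Algorithm names and notation are generic, not taken verbatim from the paper.
A SNARG for an NP language $\mathcal{L}$ with witness relation $\mathcal{R}$
consists of algorithms $\mathsf{Setup}(1^\lambda) \to \mathsf{crs}$,
$\mathsf{Prove}(\mathsf{crs}, x, w) \to \pi$, and
$\mathsf{Verify}(\mathsf{crs}, x, \pi) \to \{0,1\}$,
where succinctness requires $|\pi| = o(|w|)$.
Adaptive soundness says that for every efficient adversary $\mathcal{A}$,
\[
  \Pr\!\left[
    \begin{array}{l}
      \mathsf{crs} \leftarrow \mathsf{Setup}(1^\lambda); \\
      (x, \pi) \leftarrow \mathcal{A}(\mathsf{crs})
    \end{array}
    \;:\;
    x \notin \mathcal{L} \,\wedge\, \mathsf{Verify}(\mathsf{crs}, x, \pi) = 1
  \right]
  = \mathsf{negl}(\lambda),
\]
where the key point is that $\mathcal{A}$ chooses the statement $x$ after seeing $\mathsf{crs}$.
\end{document}
```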

We consider the batch setting where the prover wants to prove a collection of T statements x_1, \ldots, x_T and its goal is to construct a proof whose size is sublinear in both the size of a single witness and the number of instances T. In this setting, existing constructions either require the size of the public parameters to scale linearly with T (and thus, can only support an a priori bounded number of instances), or only provide non-adaptive soundness, or have proof size that scales linearly with the size of a single NP witness. In this work, we give two approaches for batching adaptively-sound SNARGs for NP, and in particular, show that under the same set of assumptions as those underlying the Waters-Wu adaptively-sound SNARG, we can obtain an adaptively-sound SNARG for batch NP where the size of the proof is \mathsf{poly}(\lambda) and the size of the CRS is \mathsf{poly}(\lambda + |C|), where \lambda is a security parameter and |C| is the size of the circuit that computes the associated NP relation.
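To make the stated parameters concrete, here is a brief restatement, in generic notation, of the batch efficiency targets from the paragraph above (the exact polynomials are not specified in the abstract).

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Batch-NP efficiency targets restated from the abstract (generic notation;
% the concrete polynomials are not specified there).
For statements $x_1, \ldots, x_T$ whose NP relation is computed by a circuit $C$,
the construction achieves
\[
  |\pi| = \mathsf{poly}(\lambda)
  \qquad \text{and} \qquad
  |\mathsf{crs}| = \mathsf{poly}(\lambda + |C|),
\]
so the proof size and the CRS size are both independent of the number of
instances $T$ (and hence of the total witness length), in contrast to schemes
whose parameters grow linearly with $T$ or with a single witness.
\end{document}
```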

Our first approach builds directly on top of the Waters-Wu construction and relies on indistinguishability obfuscation and a homomorphic re-randomizable one-way function. Our second approach shows how to combine ideas from the Waters-Wu SNARG with the chaining-based approach by Garg, Sheridan, Waters, and Wu (TCC 2022) to obtain a SNARG for batch NP.
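As intuition for the kind of building block named in the first approach, a standard candidate for a homomorphic, re-randomizable one-way function is exponentiation in a prime-order group where discrete log is hard; this is an illustrative assumption for readers, not necessarily the exact instantiation used in the paper.

```latex
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% A standard candidate homomorphic re-randomizable OWF (illustrative only;
% the paper's exact instantiation may differ): exponentiation in a group
% $\mathbb{G} = \langle g \rangle$ of prime order $p$ where discrete log is hard.
\[
  f : \mathbb{Z}_p \to \mathbb{G}, \qquad f(x) = g^x.
\]
Homomorphism: $f(x_1 + x_2) = g^{x_1 + x_2} = f(x_1) \cdot f(x_2)$.
Re-randomization: given $y = f(x)$ and fresh $r \leftarrow \mathbb{Z}_p$,
the value $y \cdot g^r = f(x + r)$ is the image of a freshly randomized preimage.
\end{document}
```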

ePrint: https://eprint.iacr.org/2024/1812

See all topics related to this paper.

Feel free to post resources that are related to this paper below.

Example resources include: implementations, explanation materials, talks, slides, links to previous discussions on other websites.

For more information, see the rules for Resource Topics.