[Resource Topic] 2024/480: Folding-based zkLLM

Welcome to the resource topic for 2024/480

Title: Folding-based zkLLM

Authors: Wilbert W

Abstract:

This paper introduces a new approach to constructing zero-knowledge large language models (zkLLM) based on the folding technique. We first review the concept of Incrementally Verifiable Computation (IVC) and compare SNARK-based and folding-based IVC constructions. We then discuss the need for Non-uniform IVC (NIVC) and present several folding schemes that support more expressive circuits, such as SuperNova, Sangria, Origami, HyperNova, and Protostar. Building on these techniques, we propose a zkLLM design that uses a RAM-machine architecture with a set of opcodes. We define a constraint circuit for each opcode and describe the prover and verifier workflows. Finally, we give example opcodes to demonstrate the circuit construction method. Our zkLLM design achieves high efficiency and expressiveness, showing strong potential for practical applications.
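To make the "folding" the abstract relies on concrete, below is a minimal, self-contained Python sketch of a Nova-style folding step for relaxed R1CS (the relation Az ∘ Bz = u·Cz + E with z = (W, x, u)). This is not the paper's construction; the field size, helper names, and the tiny example constraint are hypothetical and chosen only for illustration.

```python
# Illustrative Nova-style folding step for relaxed R1CS over a toy prime field.
# Folding two satisfied instances with a challenge r yields another satisfied instance.
# Everything here (field size, helper names, example circuit) is hypothetical.

P = 101  # toy prime modulus, far too small for real use

def vec_add(a, b):   return [(x + y) % P for x, y in zip(a, b)]
def vec_scale(c, a): return [(c * x) % P for x in a]
def hadamard(a, b):  return [(x * y) % P for x, y in zip(a, b)]
def mat_vec(M, v):   return [sum(m * x for m, x in zip(row, v)) % P for row in M]

def is_satisfied(A, B, C, inst, W):
    """Check the relaxed R1CS relation Az o Bz = u * Cz + E for inst = (E, u, x)."""
    E, u, x = inst
    z = W + x + [u]
    lhs = hadamard(mat_vec(A, z), mat_vec(B, z))
    rhs = vec_add(vec_scale(u, mat_vec(C, z)), E)
    return lhs == rhs

def fold(A, B, C, inst1, W1, inst2, W2, r):
    """Fold two satisfied relaxed R1CS instances into one, using challenge r."""
    E1, u1, x1 = inst1
    E2, u2, x2 = inst2
    z1, z2 = W1 + x1 + [u1], W2 + x2 + [u2]
    # Cross term T absorbs the mixed terms created by combining the two instances.
    T = [(a + b - c - d) % P for a, b, c, d in zip(
        hadamard(mat_vec(A, z1), mat_vec(B, z2)),
        hadamard(mat_vec(A, z2), mat_vec(B, z1)),
        vec_scale(u1, mat_vec(C, z2)),
        vec_scale(u2, mat_vec(C, z1)))]
    E = vec_add(E1, vec_add(vec_scale(r, T), vec_scale(r * r, E2)))
    u = (u1 + r * u2) % P
    x = vec_add(x1, vec_scale(r, x2))
    W = vec_add(W1, vec_scale(r, W2))
    return (E, u, x), W

# Tiny example: one constraint w0 * w0 = x0, with witness W = [w0] and public x = [x0].
A, B, C = [[1, 0, 0]], [[1, 0, 0]], [[0, 1, 0]]
inst1, W1 = ([0], 1, [9]), [3]    # ordinary R1CS instance: u = 1, E = 0
inst2, W2 = ([0], 1, [16]), [4]
assert is_satisfied(A, B, C, inst1, W1) and is_satisfied(A, B, C, inst2, W2)
folded_inst, folded_W = fold(A, B, C, inst1, W1, inst2, W2, r=7)
assert is_satisfied(A, B, C, folded_inst, folded_W)
```

Roughly speaking, in a folding-based IVC the verifier only ever checks one running (folded) instance rather than a proof per step, which is where the efficiency claim comes from; NIVC schemes such as SuperNova extend this so that each step may select a different circuit, matching the opcode-based RAM-machine design described in the abstract.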

ePrint: https://eprint.iacr.org/2024/480

See all topics related to this paper.

Feel free to post resources that are related to this paper below.

Example resources include: implementations, explanation materials, talks, slides, links to previous discussions on other websites.

For more information, see the rules for Resource Topics.