[Resource Topic] 2025/823: Sampling Arbitrary Discrete Distributions for RV Commitment Schemes Using the Trimmed-Tree Knuth-Yao Algorithm

Welcome to the resource topic for 2025/823

Title:
Sampling Arbitrary Discrete Distributions for RV Commitment Schemes Using the Trimmed-Tree Knuth-Yao Algorithm

Authors: Zoë Ruha Bell, Anvith Thudi

Abstract:

Sampling from non-uniform randomness with an algorithm that keeps the sampler's internal randomness hidden is increasingly important for cryptographic applications, such as timing-attack-resistant lattice-based cryptography or certified differential privacy. In this paper we present a provably efficient sampler that maintains random sample privacy, or random sample hiding, and is applicable to arbitrary discrete random variables. Namely, we present a constant-time version of the classic Knuth-Yao algorithm that we name “trimmed-tree” Knuth-Yao. We establish distribution-tailored Boolean circuit complexity bounds for this algorithm, in contrast to the previous naive distribution-agnostic bounds. For a \sigma^2-sub-Gaussian discrete distribution, where b_t is the number of bits for representing the domain and b_p is the number of bits of precision for the PDF values, we prove the Boolean circuit complexity of the trimmed-tree Knuth-Yao algorithm has upper bound O(\sigma b_p^{3/2} b_t), an exponential improvement over the naive bounds, and in certain parameter regimes establish the lower bound \widetilde{\Omega}((\sigma + b_p) b_t). Moreover, by proving the subtrees in the trimmed-tree Knuth-Yao circuit are small, we show it can be computed by running b_p circuits of size O(\sigma b_p^{1/2} b_t) in parallel and then running O(b_p b_t) sequential operations on the output. We apply these trimmed-tree Knuth-Yao circuits to construct random variable commitment schemes for arbitrary discrete distributions, giving exponential improvements in the number of random bits and the circuit complexity used for certified differentially private means and counting queries over large datasets and domains.
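
For readers new to Knuth-Yao sampling, here is a minimal Python sketch of the classic discrete distribution generating (DDG) tree sampler that the paper's "trimmed-tree" variant builds on. The function name, the bit-matrix representation of the probabilities, and the toy distribution are illustrative assumptions rather than anything taken from the paper, and the sketch is variable-time, so it does not provide the random-sample-hiding / constant-time circuit properties the paper establishes.

```python
# Minimal, non-constant-time sketch of classic Knuth-Yao DDG-tree sampling.
# Illustrative only; the paper's trimmed-tree variant is a constant-time
# Boolean-circuit construction with the bounds quoted in the abstract.
import secrets

def knuth_yao_sample(prob_bits):
    """Sample index i with probability given by the binary expansion prob_bits[i].

    prob_bits[i] is a list of b_p bits, where bit j has weight 2^-(j+1).
    Assumes the truncated probabilities sum exactly to 1, so the random walk
    on the DDG tree always terminates within b_p levels.
    """
    n = len(prob_bits)
    b_p = len(prob_bits[0])
    d = 0  # distance of the current node from the end of the current tree level
    for col in range(b_p):
        d = 2 * d + (1 - secrets.randbits(1))  # descend one level with a fair coin
        # Leaves at this level correspond to rows whose col-th probability bit is 1.
        for row in range(n - 1, -1, -1):
            d -= prob_bits[row][col]
            if d < 0:
                return row
    raise ValueError("probabilities do not sum to 1 at this precision")

# Toy example (hypothetical): Pr[0] = 3/4 = 0.11_2, Pr[1] = 1/4 = 0.01_2, b_p = 2.
probs = [[1, 1], [0, 1]]
counts = [0, 0]
for _ in range(10000):
    counts[knuth_yao_sample(probs)] += 1
print(counts)  # roughly [7500, 2500]
```

The running time of this walk leaks which coin flips were consumed, which is exactly the issue the constant-time trimmed-tree construction in the paper addresses.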

ePrint: https://eprint.iacr.org/2025/823

See all topics related to this paper.

Feel free to post resources that are related to this paper below.

Example resources include: implementations, explanation materials, talks, slides, links to previous discussions on other websites.

For more information, see the rules for Resource Topics.