[Resource Topic] 2016/1116: Evaluating Entropy for TRNGs: Efficient, Robust and Provably Secure

Welcome to the resource topic for 2016/1116

Title: Evaluating Entropy for TRNGs: Efficient, Robust and Provably Secure

Authors: Maciej Skorski

Abstract:

Estimating the entropy of randomness sources is a task of critical importance in the context of true random number generators, as feeding cryptographic applications with insufficient entropy is a serious real-world security risk. The challenge is to maximize accuracy and confidence under given data models and resource constraints. In this paper we analyze the performance of a simple collision-counting estimator, under the assumption that source outputs are independent but their distribution can change due to adversarial influences. For n samples and confidence 1 − ε we achieve the following features:

(a) Efficiency: reads the stream in one pass and uses constant memory (forward-only mode).
(b) Accuracy: estimates the amount of extractable bits with a relative error O(n^{-1/2} log(1/ε)) when the source outputs are i.i.d.
(c) Robustness: keeps the same error when the source outputs are independent but the distribution changes up to t = O(n^{1/2}) times during runtime.

We demonstrate that the estimator is accurate enough to adjust post-processing components dynamically, estimating entropy on the fly instead of investigating it off-line. Our work thus continues the line of research on “testable random number generators” originated by Bucci and Luzzi at CHES’05.
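As a starting resource, here is a minimal sketch of a one-pass, constant-memory collision-counting entropy estimator in the spirit of the abstract. The non-overlapping pairing scheme, the lack of any confidence correction, and the toy biased source are assumptions made for illustration; they are not claimed to match the paper's exact construction or error analysis.

```python
import math
import random

def estimate_collision_entropy(stream):
    """One-pass, constant-memory collision-counting estimator (sketch).

    Groups consecutive samples into non-overlapping pairs and counts how
    often a pair collides. The fraction of colliding pairs estimates the
    collision probability p = sum_x Pr[X = x]^2, and -log2(p) is the
    collision entropy per sample, a common proxy for the amount of
    extractable randomness (ignoring the confidence terms analysed in
    the paper).
    """
    pairs = 0
    collisions = 0
    prev = None
    for sample in stream:            # forward-only, single pass
        if prev is None:
            prev = sample            # buffer first element of the pair
        else:
            pairs += 1
            if sample == prev:
                collisions += 1
            prev = None              # start a fresh, non-overlapping pair
    if pairs == 0 or collisions == 0:
        return None                  # not enough data for an estimate
    p_hat = collisions / pairs       # estimated collision probability
    return -math.log2(p_hat)         # estimated collision entropy per sample

# Toy usage: a 4-bit uniform source (about 4 bits of entropy per sample),
# read in forward-only mode.
if __name__ == "__main__":
    rng = random.Random(0)
    source = (rng.getrandbits(4) for _ in range(1_000_000))
    print(estimate_collision_entropy(source))
```

Note that only two counters and a one-sample buffer are kept, which is what makes the constant-memory, forward-only mode possible; the actual accuracy and robustness guarantees require the analysis in the paper.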

ePrint: https://eprint.iacr.org/2016/1116

See all topics related to this paper.

Feel free to post resources that are related to this paper below.

Example resources include: implementations, explanation materials, talks, slides, links to previous discussions on other websites.

For more information, see the rules for Resource Topics.