[Resource Topic] 2019/766: Complexity of Estimating Renyi Entropy of Markov Chains

Welcome to the resource topic for 2019/766

Title:
Complexity of Estimating Renyi Entropy of Markov Chains

Authors: Maciej Obremski, Maciej Skorski

Abstract:

Estimating the entropy of random processes is one of the fundamental problems of machine learning and property testing, with numerous applications ranging from DNA testing and the predictability of human behaviour to modeling neural activity and cryptography. We investigate the problem of Renyi entropy estimation for sources that form Markov chains. Kamath and Verdú (ISIT’16) showed that good mixing properties are essential for this task. We show that even with very good mixing time, estimating the min-entropy requires a sample of size \Omega(K^2), while estimating the collision entropy requires \Omega(K^{3/2}) samples, where K is the size of the alphabet. Our results hold in both the asymptotic and non-asymptotic regimes. We obtain these results by applying Le Cam’s method to two Markov chains that differ by an appropriately chosen sparse perturbation; the discrepancy between these chains is estimated with the help of perturbation theory. Our techniques might be of independent interest.
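As a starter resource (not the paper's construction): below is a minimal, self-contained Python sketch of the setting the abstract describes. It builds a fast-mixing Markov chain on K states with a uniform stationary distribution, draws n samples, and computes naive plug-in estimates of the collision entropy (Renyi entropy of order 2) and the min-entropy. The chain, the function names, and the parameter choices are all illustrative assumptions; only numpy is required.

```python
import numpy as np


def sample_markov_chain(P, n, rng):
    """Draw n states from a Markov chain with transition matrix P,
    starting from (a numerical approximation of) its stationary distribution."""
    K = P.shape[0]
    # Stationary distribution = left eigenvector of P for eigenvalue 1.
    evals, evecs = np.linalg.eig(P.T)
    pi = np.abs(np.real(evecs[:, np.argmax(np.real(evals))]))
    pi /= pi.sum()
    states = np.empty(n, dtype=int)
    states[0] = rng.choice(K, p=pi)
    for t in range(1, n):
        states[t] = rng.choice(K, p=P[states[t - 1]])
    return states


def collision_entropy_estimate(samples, K):
    """Plug-in estimate of Renyi entropy of order 2 (collision entropy):
    H_2 = -log2 sum_x p(x)^2, using empirical frequencies."""
    p_hat = np.bincount(samples, minlength=K) / len(samples)
    return -np.log2(np.sum(p_hat ** 2))


def min_entropy_estimate(samples, K):
    """Plug-in estimate of min-entropy: H_inf = -log2 max_x p(x)."""
    p_hat = np.bincount(samples, minlength=K) / len(samples)
    return -np.log2(p_hat.max())


# Demo: a "half-lazy" chain that jumps to a uniform state with prob. 1/2.
# It mixes in O(1) steps (spectral gap 1/2) and its stationary distribution
# is uniform, so both entropies equal log2(K) exactly.
rng = np.random.default_rng(0)
K = 64
P = 0.5 * np.eye(K) + 0.5 * np.full((K, K), 1.0 / K)

samples = sample_markov_chain(P, n=K * K, rng=rng)  # n = K^2 samples
print("collision entropy estimate:", collision_entropy_estimate(samples, K))
print("min-entropy estimate:     ", min_entropy_estimate(samples, K))
print("ground truth:             ", np.log2(K))
```

The sample size n = K^2 here matches the min-entropy regime from the abstract: the paper's lower bounds say that, even for such a well-mixing chain, estimators using substantially fewer samples (below \Omega(K^2) for min-entropy, below \Omega(K^{3/2}) for collision entropy) cannot be reliable in general.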

ePrint: https://eprint.iacr.org/2019/766

See all topics related to this paper.

Feel free to post resources that are related to this paper below.

Example resources include: implementations, explanation materials, talks, slides, links to previous discussions on other websites.

For more information, see the rules for Resource Topics.