[Resource Topic] 2016/170: Fast Learning Requires Good Memory: A Time-Space Lower Bound for Parity Learning

Welcome to the resource topic for 2016/170

Title: Fast Learning Requires Good Memory: A Time-Space Lower Bound for Parity Learning

Authors: Ran Raz

Abstract:

We prove that any algorithm for learning parities requires either a memory of quadratic size or an exponential number of samples. This proves a recent conjecture of Steinhardt, Valiant and Wager and shows that for some learning problems a large storage space is crucial.

More formally, in the problem of parity learning, an unknown string x \in \{0,1\}^n is chosen uniformly at random. A learner tries to learn x from a stream of samples (a_1, b_1), (a_2, b_2), \ldots, where each a_t is uniformly distributed over \{0,1\}^n and b_t is the inner product of a_t and x, modulo 2. We show that any algorithm for parity learning that uses less than n^2/25 bits of memory requires an exponential number of samples. Previously, there was no non-trivial lower bound on the number of samples needed for any learning problem, even if the allowed memory size is O(n) (where n is the space needed to store one sample).

We also give an application of our result in the field of bounded-storage cryptography. We show an encryption scheme that requires a private key of length n and time complexity of n per encryption/decryption of each bit, and is provably and unconditionally secure as long as the attacker uses less than n^2/25 memory bits and the scheme is used at most an exponential number of times. Previous works on bounded-storage cryptography assumed that the memory size used by the attacker is at most linear in the time needed for encryption/decryption.
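
As a concrete illustration of the problem statement above, here is a small Python sketch (not from the paper) of the sample stream together with the standard Gaussian-elimination learner over GF(2). The elimination learner succeeds with O(n) samples but stores roughly n^2 bits of reduced equations, which is exactly the memory regime the lower bound says cannot be shrunk below n^2/25 without an exponential blow-up in samples. The function names and list-of-bits representation are illustrative choices only.

```python
import random

def sample_stream(x, n):
    """Yield parity-learning samples (a_t, b_t) with b_t = <a_t, x> mod 2."""
    while True:
        a = [random.randrange(2) for _ in range(n)]
        b = sum(ai & xi for ai, xi in zip(a, x)) % 2
        yield a, b

def learn_by_elimination(stream, n):
    """Recover x by Gaussian elimination over GF(2).

    Stores up to n reduced equations of about n bits each, i.e. roughly
    n^2 bits of memory, in exchange for needing only O(n) samples.
    """
    rows = []  # list of (pivot, a, b); each a is reduced against earlier rows
    while len(rows) < n:
        a, b = next(stream)
        for pivot, ra, rb in rows:          # reduce against stored pivot rows
            if a[pivot]:
                a = [ai ^ ri for ai, ri in zip(a, ra)]
                b ^= rb
        pivot = next((i for i, ai in enumerate(a) if ai), None)
        if pivot is None:
            continue                        # linearly dependent sample, discard
        rows.append((pivot, a, b))
    x = [0] * n
    for pivot, a, b in reversed(rows):      # back-substitute, newest row first
        x[pivot] = (b + sum(a[j] & x[j] for j in range(pivot + 1, n))) % 2
    return x

if __name__ == "__main__":
    n = 64
    secret = [random.randrange(2) for _ in range(n)]
    assert learn_by_elimination(sample_stream(secret, n), n) == secret
```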

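For the bounded-storage application, the abstract only states the parameters (an n-bit private key and O(n) work per encrypted bit), not the construction itself. A minimal sketch of the natural candidate matching those parameters is shown below, assuming each message bit is masked by a fresh inner product <a_t, x> mod 2; consult the paper for the actual scheme and its security statement.

```python
import secrets

def keygen(n):
    """Shared private key: a uniformly random n-bit string x."""
    return [secrets.randbits(1) for _ in range(n)]

def encrypt_bit(x, m):
    """Encrypt one message bit m in {0,1}: send a fresh random a together with
    m masked by the inner product <a, x> mod 2 (O(n) work per bit)."""
    a = [secrets.randbits(1) for _ in range(len(x))]
    pad = sum(ai & xi for ai, xi in zip(a, x)) % 2
    return a, m ^ pad

def decrypt_bit(x, ct):
    """Recompute the pad from a and the shared key, then unmask."""
    a, c = ct
    pad = sum(ai & xi for ai, xi in zip(a, x)) % 2
    return c ^ pad

if __name__ == "__main__":
    x = keygen(128)
    assert decrypt_bit(x, encrypt_bit(x, 1)) == 1
```
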
ePrint: https://eprint.iacr.org/2016/170

See all topics related to this paper.

Feel free to post resources that are related to this paper below.

Example resources include: implementations, explanation materials, talks, slides, links to previous discussions on other websites.

For more information, see the rules for Resource Topics.