[Resource Topic] 2017/078: LPN Decoded

Welcome to the resource topic for 2017/078

Title: LPN Decoded

Authors: Andre Esser, Robert Kübler, Alexander May

Abstract:

We propose new algorithms with small memory consumption for the Learning Parity with Noise (LPN) problem, both classically and quantumly. Our goal is to predict the hardness of LPN depending on both of its parameters, the dimension $k$ and the noise rate $\tau$, as accurately as possible in theory and in practice. To this end, we analyze our algorithms asymptotically, run experiments on medium-size parameters and provide bit-complexity predictions for large parameters. Our new algorithms are modifications and extensions of the simple Gaussian elimination algorithm, combined with recent advanced techniques for decoding random linear codes. Moreover, we enhance our algorithms with the dimension reduction technique of Blum, Kalai and Wasserman. This results in a hybrid algorithm that is capable of achieving the best currently known running time for any fixed amount of memory. On the asymptotic side, we achieve significant improvements of the running-time exponents, both classically and quantumly. To the best of our knowledge, we provide the first quantum algorithms for LPN. Due to the small memory consumption of our algorithms, we are able to solve medium-size LPN instances for the first time, e.g. with $k = 243$, $\tau = \frac{1}{8}$, in only 15 days on 64 threads. Our algorithms yield bit-complexity predictions that require relatively large $k$ for small $\tau$. For instance, for small-noise LPN with $\tau = \frac{1}{\sqrt{k}}$, we predict 80-bit classical and only 64-bit quantum security for $k \geq 2048$. For the common cryptographic choice $k = 512$, $\tau = \frac{1}{8}$, we achieve with limited memory 97-bit classical and 70-bit quantum security.
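For readers who want to experiment, here is a minimal sketch (not code from the paper) of the plain Gaussian-elimination attack on LPN that the paper's algorithms refine: repeatedly draw $k$ samples, hope they are all error-free, solve the resulting linear system over GF(2), and verify the candidate secret against fresh samples. All function names, the verification threshold and the toy parameters ($k = 32$, $\tau = \frac{1}{8}$) are illustrative choices of mine; the paper's memory-efficient decoding variants and the BKW hybrid are substantially more involved.

```python
# Illustrative sketch of the basic Gaussian-elimination attack on LPN.
# Not the paper's optimized algorithm; parameters below are toy values.
import numpy as np

rng = np.random.default_rng(1)

def lpn_samples(secret, n, tau):
    """Return n LPN samples (A, b) with b = A·s + e (mod 2), e ~ Ber(tau)."""
    A = rng.integers(0, 2, size=(n, secret.size), dtype=np.uint8)
    e = (rng.random(n) < tau).astype(np.uint8)
    b = (A.astype(np.int64) @ secret + e) % 2
    return A, b.astype(np.uint8)

def solve_gf2(A, b):
    """Gaussian elimination over GF(2) for a square system; None if singular."""
    k = A.shape[1]
    M = np.concatenate([A.astype(np.uint8), b[:, None].astype(np.uint8)], axis=1)
    row = 0
    for col in range(k):
        piv = next((r for r in range(row, M.shape[0]) if M[r, col]), None)
        if piv is None:
            return None                     # rank deficient: caller retries
        M[[row, piv]] = M[[piv, row]]       # move pivot row into place
        for r in range(M.shape[0]):
            if r != row and M[r, col]:
                M[r] ^= M[row]              # eliminate column in other rows
        row += 1
    return M[:k, k].copy()                  # left block is identity, so x = last column

def gauss_attack(k, tau, oracle, tests=200):
    """Repeat: draw k samples, hope they are error-free, solve, verify."""
    while True:
        A, b = oracle(k)
        cand = solve_gf2(A, b)
        if cand is None:
            continue
        At, bt = oracle(tests)
        # A correct guess mismatches fresh samples on ~tau of them, a wrong one on ~1/2.
        if np.mean((At.astype(np.int64) @ cand) % 2 != bt) < (tau + 0.5) / 2:
            return cand

if __name__ == "__main__":
    k, tau = 32, 0.125
    secret = rng.integers(0, 2, size=k, dtype=np.uint8)
    oracle = lambda n: lpn_samples(secret, n, tau)
    print("recovered secret:", np.array_equal(gauss_attack(k, tau, oracle), secret))
```

The expected number of restarts of this naive attack grows roughly like $(1-\tau)^{-k}$, which is why the paper's improved time/memory trade-offs matter once $k$ reaches the hundreds.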

ePrint: https://eprint.iacr.org/2017/078

Talk: https://www.youtube.com/watch?v=yJPm_wKRcus

See all topics related to this paper.

Feel free to post resources that are related to this paper below.

Example resources include: implementations, explanation materials, talks, slides, links to previous discussions on other websites.

For more information, see the rules for Resource Topics.