Restricted Isometry of Fourier Matrices and List Decodability of Random Linear Codes

We prove that a random linear code over F_q, with probability arbitrarily close to 1, is list decodable at radius 1 − 1/q − ε with list size L = O(1/ε²) and rate R = Ω_q(ε² / log³(1/ε)). Up to the polylogarithmic factor in 1/ε and constant factors depending on q, this matches the lower bound L = Ω_q(1/ε²) on the list size and the upper bound R = O_q(ε²) on the rate. Previously, only the existence (and not the abundance) of such codes was known for the special case q = 2 (Guruswami, Håstad, Sudan and Zuckerman, 2002).

To obtain our result, we employ a relaxed version of the well-known Johnson bound on list decoding that translates average Hamming distance between codewords into list-decoding guarantees. We furthermore prove that the desired average-distance guarantees hold for a code provided that a natural complex matrix encoding the codewords satisfies the Restricted Isometry Property with respect to the Euclidean norm (RIP-2). For random binary linear codes, this matrix coincides with a random submatrix of the Hadamard-Walsh transform matrix that is well studied in the compressed sensing literature.

Finally, we improve the analysis of Rudelson and Vershynin (2008) on the number of random frequency samples required for exact reconstruction of k-sparse signals of length N. Specifically, we improve the number of samples from O(k log(N) log²(k) (log k + log log N)) to O(k log(N) log³(k)). The proof bounds the expected supremum of a related Gaussian process via an improved analysis of the metric defined by the process. This improvement is crucial for our application to list decoding.
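The Hadamard-Walsh connection can be made concrete with a small sketch. The following Python snippet is illustrative only (the toy parameters k and n, and all variable names, are mine, not the paper's): it builds the ±1 codeword matrix of a random binary linear code and checks that it equals a random column submatrix of the 2^k × 2^k Walsh-Hadamard matrix; since that matrix is symmetric, the transpose is the row-sampled Hadamard matrix familiar from compressed sensing.

```python
# Toy sketch (not from the paper): random binary linear codes vs. Hadamard submatrices.
import numpy as np
from scipy.linalg import hadamard

rng = np.random.default_rng(0)
k, n = 10, 40                          # message length k, block length n (toy sizes)
G = rng.integers(0, 2, size=(k, n))    # random generator matrix over F_2

# All 2^k messages as rows of a (2^k x k) bit matrix; row m holds the bits of m (LSB first).
msgs = (np.arange(2**k)[:, None] >> np.arange(k)) & 1

# Codeword matrix of the code, mapped to +/-1: entry (m, j) = (-1)^{<m, g_j>},
# where g_j is the j-th column of G.
M = (-1.0) ** (msgs @ G % 2)

# The same matrix as a column submatrix of the Walsh-Hadamard matrix:
# column j of M is the Hadamard column whose index has bit pattern g_j.
H = hadamard(2**k)
col_idx = G.T @ (1 << np.arange(k))    # integer index encoded by each column of G
assert np.array_equal(M, H[:, col_idx])
print("codeword matrix = random column submatrix of the Hadamard matrix")
```

Choosing the generator matrix uniformly at random is therefore the same as sampling Hadamard columns (equivalently, rows of the transposed matrix) uniformly at random, which is why compressed sensing bounds for randomly subsampled Hadamard/Fourier matrices become available in this setting.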

[1] J. Kuelbs, Probability on Banach spaces, 1978.

[2] Ravi Kumar et al., Proofs, codes, and polynomial-time reducibilities, 1999, Proceedings of the Fourteenth Annual IEEE Conference on Computational Complexity.

[3] Leonid A. Levin et al., A hard-core predicate for all one-way functions, 1989, STOC '89.

[4] Vladimir M. Blinovsky et al., Code bounds for multiple packings over a nonbinary finite alphabet, 2005, Probl. Inf. Transm.

[5] Luca Trevisan et al., Extractors and pseudorandom generators, 2001, JACM.

[6] E. Candès, The restricted isometry property and its implications for compressed sensing, 2008.

[7] Venkatesan Guruswami et al., A Lower Bound on List Size for List Decoding, 2005, IEEE Trans. Inf. Theory.

[8] Nir Ailon et al., An almost optimal unrestricted fast Johnson-Lindenstrauss transform, 2010, SODA '11.

[9] Venkatesan Guruswami et al., Combinatorial bounds for list decoding, 2002, IEEE Trans. Inf. Theory.

[10] Venkatesan Guruswami et al., Combinatorial limitations of a strong form of list decoding, 2012, Electron. Colloquium Comput. Complex.

[11] Roman Vershynin, Introduction to the non-asymptotic analysis of random matrices, 2010, Compressed Sensing.

[12] Peter Elias, Error-correcting codes for list decoding, 1991, IEEE Trans. Inf. Theory.

[13] Luca Trevisan et al., Pseudorandom generators without the XOR Lemma, 1999, Electron. Colloquium Comput. Complex.

[14] B. Carl, Inequalities of Bernstein-Jackson-type and the degree of compactness of operators in Banach spaces, 1985.

[15] Prasad Raghavendra et al., List decoding tensor products and interleaved codes, 2008, STOC '09.

[16] E. Candès et al., Stable signal recovery from incomplete and inaccurate measurements, 2005, math/0503066.

[17] Shachar Lovett et al., List decoding Reed-Muller codes over small fields, 2014, Electron. Colloquium Comput. Complex.

[18] Venkatesan Guruswami et al., On the List-Decodability of Random Linear Codes, 2010, IEEE Transactions on Information Theory.

[19] Vladimir M. Blinovsky et al., On the convexity of one coding-theory function, 2008, Probl. Inf. Transm.

[20] R. DeVore et al., A Simple Proof of the Restricted Isometry Property for Random Matrices, 2008.

[21] Venkatesan Guruswami et al., Extensions to the Johnson bound, 2001.

[22] S. Frick et al., Compressed Sensing, 2014, Computer Vision, A Reference Guide.

[23] Venkatesan Guruswami et al., Codes for Computationally Simple Channels: Explicit Constructions with Optimal Rate, 2010, IEEE 51st Annual Symposium on Foundations of Computer Science.

[24] Emmanuel J. Candès et al., Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information, 2004, IEEE Transactions on Information Theory.

[25] Venkatesan Guruswami et al., Restricted Isometry of Fourier Matrices and List Decodability of Random Linear Codes, 2013, SIAM J. Comput.

[26] Emmanuel J. Candès et al., Near-Optimal Signal Recovery From Random Projections: Universal Encoding Strategies?, 2004, IEEE Transactions on Information Theory.

[27] Atri Rudra, Limits to List Decoding Random Codes, 2009, COCOON.

[28] Elchanan Mossel et al., On the complexity of approximating the VC dimension, 2001, Proceedings of the 16th Annual IEEE Conference on Computational Complexity.

[29] M. Rudelson et al., On sparse reconstruction from Fourier and Gaussian measurements, 2008.

[30] Mahdi Cheraghchi, Coding-theoretic methods for sparse recovery, 2011, 49th Annual Allerton Conference on Communication, Control, and Computing (Allerton).

[31] Mary Wootters, On the list decodability of random linear codes with large error rates, 2013, STOC '13.

[32] Rachel Ward, New and Improved Johnson-Lindenstrauss Embeddings via the Restricted Isometry Property, 2010, SIAM J. Math. Anal.

[33] Mahdi Cheraghchi, Applications of Derandomization Theory in Coding, 2011, arXiv.