Sparse Recovery for Orthogonal Polynomial Transforms

In this paper we consider the following sparse recovery problem. We have query access to a vector $x \in \mathbb{R}^N$ such that $\hat{x} = F x$ is $k$-sparse (or nearly $k$-sparse) for some orthogonal transform $F$. The goal is to output an approximation (in an $\ell_2$ sense) to $\hat{x}$ in sublinear time. This problem has been well studied in the special case that $F$ is the Discrete Fourier Transform (DFT), and a long line of work has resulted in sparse Fast Fourier Transforms that run in time $O(k \cdot \mathrm{polylog}\, N)$. However, for transforms $F$ other than the DFT (or closely related transforms like the Discrete Cosine Transform), the question is much less settled. In this paper we give sublinear-time algorithms, running in time $\mathrm{poly}(k \log N)$, for solving the sparse recovery problem for orthogonal transforms $F$ that arise from orthogonal polynomials. More precisely, our algorithm works for any $F$ that is an orthogonal polynomial transform derived from Jacobi polynomials. The Jacobi polynomials are a large class of classical orthogonal polynomials (including the Chebyshev and Legendre polynomials as special cases) and appear extensively in applications such as numerical analysis and signal processing. One caveat of our work is that we require an assumption on the sparsity structure of the sparse vector; we note, however, that vectors with random support satisfy this assumption with high probability. Our approach is to give a very general reduction from the $k$-sparse recovery problem to the $1$-sparse recovery problem that holds for any flat orthogonal polynomial transform; we then solve this $1$-sparse recovery problem for transforms derived from Jacobi polynomials.
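
To make the problem setup concrete, here is a minimal sketch in Python using the Chebyshev polynomials, which are a special case of the Jacobi polynomials. The construction below (the orthonormal Chebyshev transform matrix, the planted $k$-sparse coefficient vector, and the dense baseline recovery) is purely illustrative and is not the paper's sublinear-time algorithm; all names and parameters are assumptions for the sketch.

```python
# Illustrative sketch of the sparse recovery problem for an orthogonal
# polynomial transform, using Chebyshev polynomials (a Jacobi special case).
import numpy as np

N, k = 1024, 5

# Orthonormal Chebyshev transform F: row j evaluates the degree-j Chebyshev
# polynomial T_j at the Chebyshev nodes, with rows rescaled so that F is
# orthogonal (this coincides with an orthonormal DCT up to indexing).
nodes = np.cos(np.pi * (np.arange(N) + 0.5) / N)        # Chebyshev nodes
F = np.cos(np.outer(np.arange(N), np.arccos(nodes)))    # F[j, i] = T_j(node_i)
F[0] *= np.sqrt(1.0 / N)
F[1:] *= np.sqrt(2.0 / N)
assert np.allclose(F @ F.T, np.eye(N))                  # F is orthogonal

# Plant a k-sparse coefficient vector x_hat and set x = F^T x_hat, so that
# F x = x_hat is exactly k-sparse.
rng = np.random.default_rng(0)
support = rng.choice(N, size=k, replace=False)
x_hat = np.zeros(N)
x_hat[support] = rng.standard_normal(k)
x = F.T @ x_hat

# The recovery problem: given query access to entries of x, output a k-sparse
# l_2 approximation to x_hat in time poly(k log N).  The dense baseline below
# simply applies F, which costs Omega(N) and is what the paper avoids.
dense_recovery = F @ x
assert np.allclose(dense_recovery, x_hat)
```

In this notation, the paper's reduction replaces the dense baseline: queries to $x$ are combined so that the $k$-sparse problem for a flat orthogonal polynomial transform breaks into $1$-sparse subproblems, each of which is solved directly for Jacobi polynomial transforms.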
