A Simple Derivation of AMP and its State Evolution via First-Order Cancellation

We consider the linear regression problem, where the goal is to recover the vector x ∈ ℝⁿ from measurements y = Ax + w ∈ ℝᵐ under a known matrix A and unknown noise w. For large i.i.d. sub-Gaussian A, the approximate message passing (AMP) algorithm is precisely analyzable through a state-evolution (SE) formalism, which furthermore shows that AMP is Bayes-optimal in certain regimes. The rigorous SE proof, however, is long and complicated. Moreover, although the AMP algorithm can be derived as an approximation of loopy belief propagation (LBP), this viewpoint provides little insight into why large i.i.d. A matrices are important for AMP, or why AMP has a state evolution. In this work, we provide a heuristic derivation of AMP and its state evolution, based on the idea of "first-order cancellation," that provides insights missing from the LBP derivation while being much shorter than the rigorous SE proof.
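To make the objects in the abstract concrete, here is a minimal NumPy sketch (not taken from the paper) of the standard AMP recursion with a soft-thresholding denoiser, together with a Monte Carlo evaluation of the matching scalar SE recursion. The fixed threshold lam, the iteration counts, and the sampling-based SE evaluation are illustrative assumptions, not choices made in the source.

```python
import numpy as np

def soft_threshold(v, lam):
    # Soft-thresholding denoiser: eta(v) = sign(v) * max(|v| - lam, 0).
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def amp(y, A, lam, n_iters=30):
    # AMP iterations for y = A x + w with a fixed (illustrative) threshold lam.
    m, n = A.shape
    x = np.zeros(n)
    z = y.copy()  # Onsager-corrected residual
    for _ in range(n_iters):
        r = x + A.T @ z             # pseudo-data: behaves like x plus Gaussian noise
        x = soft_threshold(r, lam)  # denoising step
        # Onsager correction (1/delta) * <eta'(r)> * z with delta = m/n;
        # for soft thresholding, <eta'(r)> = (# nonzeros in x) / n.
        z = y - A @ x + (np.count_nonzero(x) / m) * z
    return x

def state_evolution(x0, sigma_w, delta, lam, n_iters=30, seed=0):
    # Scalar SE recursion, evaluated by Monte Carlo over samples x0 of the signal:
    #   tau_{t+1}^2 = sigma_w^2 + (1/delta) * E[(eta(X + tau_t Z) - X)^2],  Z ~ N(0,1).
    rng = np.random.default_rng(seed)
    tau2 = sigma_w**2 + np.mean(x0**2) / delta  # tau_0^2 for the all-zero initialization
    for _ in range(n_iters):
        z = rng.standard_normal(x0.shape)
        mse = np.mean((soft_threshold(x0 + np.sqrt(tau2) * z, lam) - x0)**2)
        tau2 = sigma_w**2 + mse / delta
    return tau2
```

The (np.count_nonzero(x) / m) * z term in the residual update is the Onsager correction; dropping it yields plain iterative soft thresholding, whose errors are not tracked by the scalar SE recursion above. Heuristically, this correction decorrelates the residual from the current estimate so that the pseudo-data r behaves like the true signal plus i.i.d. Gaussian noise of variance tau_t², which is roughly the effect that the paper's first-order cancellation argument is meant to explain.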
