Comments on Why Generalized BP Performs So Remarkably Well in 2-D Channels

The generalized belief propagation (GBP) algorithm has recently been shown to infer the a-posteriori probabilities of finite-state-input two-dimensional (2D) Gaussian channels with memory to practically exact accuracy, thereby enabling near-optimal estimation of the transmitted symbols and of the Shannon-theoretic information rates. In this note, we offer a rationale for this excellent performance of GBP.
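
For concreteness, a common formulation of such a 2D finite-state-input Gaussian channel with memory (given here only as an illustrative assumption, since the model is not reproduced in this note) takes each received sample to be a two-dimensional convolution of the finite-alphabet inputs with an interference kernel, corrupted by additive white Gaussian noise:
\[
y_{i,j} \;=\; \sum_{k=0}^{K-1}\sum_{l=0}^{L-1} h_{k,l}\, x_{i-k,\,j-l} \;+\; n_{i,j},
\qquad x_{i,j}\in\mathcal{X},\quad n_{i,j}\sim\mathcal{N}\!\left(0,\sigma^{2}\right),
\]
where the finite input alphabet \(\mathcal{X}\) and the \(K\times L\) interference kernel \(\{h_{k,l}\}\) induce the channel memory, and GBP is used to approximate the a-posteriori probabilities \(p\!\left(x_{i,j}\mid \mathbf{y}\right)\).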