Laplace-distributed increments, the Laplace prior, and edge-preserving regularization

For a given two-dimensional image, we define the horizontal and vertical increments at a pixel location to be the differences between the intensity value at that pixel and the intensity values at the neighboring pixels to the right and above, respectively. For a typical image, it makes intuitive sense that the increments will usually be near zero, corresponding to regions of smooth variation in image intensity, but will occasionally have large magnitude, corresponding to edges where sharp intensity changes occur. In this paper, we explore the Laplace increment model, in which the increments are assumed to be independent and identically distributed, zero-mean Laplace random variables; the heavy tails of the Laplace distribution accommodate the large increment values that occur at edges. The prior constructed from the Laplace increment model is very similar to the total variation (TV) prior. A theoretical analysis of its properties shows that the Laplace prior yields a regularization scheme whose regularized solutions lie in the space of functions of bounded variation, just as for the TV prior. Moreover, numerical experiments indicate that the Laplace prior yields reconstructions that are qualitatively very similar to those obtained using the TV prior.
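To make the construction concrete, here is a minimal sketch of the prior implied by the Laplace increment model; the symbols u, D^h, D^v, and the inverse scale parameter delta are illustrative choices and are not fixed by the abstract. Writing D^h u and D^v u for the vectors of horizontal and vertical increments of an image u, and assuming the increments are i.i.d. zero-mean Laplace random variables with inverse scale delta, the product of the Laplace densities gives a prior of the form

\[
p(u \mid \delta) \;\propto\; \exp\!\Bigg( -\delta \sum_{i,j} \Big( \big|[D^{h}u]_{i,j}\big| + \big|[D^{v}u]_{i,j}\big| \Big) \Bigg).
\]

The negative log-prior is therefore, up to additive constants, an anisotropic (l1-type) discretization of the total variation functional, which is the sense in which the Laplace prior closely resembles the TV prior discussed in the abstract.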
