Sparse synthesis regularization with deep neural networks

We propose a sparse reconstruction framework for solving inverse problems. In contrast to existing sparse regularization techniques that are based on frame representations, we train an encoder-decoder network with an ℓ1-penalty on the encoded coefficients. We demonstrate that the trained decoder network enables sparse signal reconstruction from thresholded encoded coefficients with only minor loss of image quality. Using this learned sparse synthesis prior, we propose minimizing the ℓ1-Tikhonov functional, defined as the sum of a data-fitting term and the ℓ1-norm of the synthesis coefficients, and show that this yields a regularization method.
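To make the two ingredients concrete: for a forward operator A and data y, the ℓ1-Tikhonov functional described above can be written as ‖A D(ξ) − y‖² + α‖ξ‖₁, minimized over synthesis coefficients ξ, with the reconstruction given by the trained decoder D(ξ). The sketch below is not the authors' implementation; it is a minimal PyTorch illustration in which the network sizes, the penalty weight lam, and the threshold are illustrative assumptions. It shows an encoder-decoder trained with an ℓ1-penalty on the encoded coefficients, and reconstruction from thresholded coefficients.

```python
# Minimal sketch (not the authors' code): encoder-decoder trained with an
# l1-penalty on the synthesis coefficients, plus reconstruction from
# thresholded coefficients. Sizes, lam, and threshold are illustrative.
import torch
import torch.nn as nn

class SparseAutoencoder(nn.Module):
    def __init__(self, n=784, m=1024):
        super().__init__()
        # Encoder E maps a signal x to (redundant) synthesis coefficients xi.
        self.encoder = nn.Sequential(nn.Linear(n, m), nn.ReLU(), nn.Linear(m, m))
        # Decoder D synthesizes the signal from the coefficients, x ≈ D(xi).
        self.decoder = nn.Sequential(nn.Linear(m, m), nn.ReLU(), nn.Linear(m, n))

    def forward(self, x):
        xi = self.encoder(x)
        return self.decoder(xi), xi

def training_step(model, x, optimizer, lam=1e-3):
    # Loss = reconstruction error + l1-penalty promoting sparse coefficients.
    x_hat, xi = model(x)
    loss = nn.functional.mse_loss(x_hat, x) + lam * xi.abs().mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

def sparse_reconstruction(model, x, threshold=0.1):
    # Reconstruct from thresholded coefficients: soft-threshold the encoded
    # coefficients, then apply the trained decoder.
    with torch.no_grad():
        xi = model.encoder(x)
        xi_sparse = torch.sign(xi) * torch.clamp(xi.abs() - threshold, min=0.0)
        return model.decoder(xi_sparse)
```

Here soft-thresholding of the encoded coefficients stands in for the sparsification step; in the variational setting one would instead minimize the ℓ1-Tikhonov functional over ξ directly, for instance with ISTA-type iterations.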
