A neurodynamic optimization approach to constrained sparsity maximization based on alternative objective functions

In recent years, constrained sparsity maximization problems have received tremendous attention in the context of compressive sensing. Because the resulting constrained L0-norm minimization problem is NP-hard, constrained L1-norm minimization is commonly used to compute approximate sparse solutions. In this paper, we introduce several alternative objective functions, such as the weighted L1 norm and the Laplacian, hyperbolic secant, and Gaussian functions, as approximations of the L0 norm. A one-layer recurrent neural network is applied to compute optimal solutions to the reformulated minimization problems subject to equality constraints. Simulation results, presented as time responses, phase diagrams, and tabular comparisons, demonstrate the superior performance of the proposed neurodynamic optimization approach based on these problem reformulations.
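
To make the reformulation concrete, the sketch below illustrates the idea in Python/NumPy: several smooth surrogates of the L0 norm, together with a projected gradient flow on the constraint set Ax = b, used here as a simple stand-in for the paper's one-layer recurrent neural network. The surrogate parameterizations, the dynamics, and all names (laplacian, gaussian, neurodynamic_solve, the smoothing parameter sigma) are illustrative assumptions, not the paper's actual formulation, and convergence to the sparsest solution is not guaranteed for these nonconvex surrogates.

```python
import numpy as np

# Smooth surrogates of the L0 norm; sigma controls the sharpness of the
# approximation (as sigma -> 0, each surrogate approaches the L0 "norm").
# The exact forms used in the paper may differ.
def laplacian(x, sigma=0.1):
    return np.sum(1.0 - np.exp(-np.abs(x) / sigma))

def gaussian(x, sigma=0.1):
    return np.sum(1.0 - np.exp(-x**2 / (2.0 * sigma**2)))

def hyperbolic_secant(x, sigma=0.1):
    return np.sum(1.0 - 1.0 / np.cosh(x / sigma))

# Gradient of the Gaussian surrogate, driving the dynamics below.
def grad_gaussian(x, sigma=0.1):
    return (x / sigma**2) * np.exp(-x**2 / (2.0 * sigma**2))

def neurodynamic_solve(A, b, sigma=0.1, dt=1e-3, steps=20000):
    """Euler-discretized gradient flow  dx/dt = -P grad f(x),
    where P = I - A^T (A A^T)^{-1} A projects onto the null space
    of A, so A x(t) = b holds for all t when x(0) is feasible."""
    m, n = A.shape
    P = np.eye(n) - A.T @ np.linalg.solve(A @ A.T, A)
    x = A.T @ np.linalg.solve(A @ A.T, b)  # feasible least-norm start
    for _ in range(steps):
        x -= dt * P @ grad_gaussian(x, sigma)
    return x

# Toy example: attempt to recover a 3-sparse signal from 20 random
# measurements of a length-50 vector.
rng = np.random.default_rng(0)
n, m, k = 50, 20, 3
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n))
b = A @ x_true
x_hat = neurodynamic_solve(A, b)
print("residual:", np.linalg.norm(A @ x_hat - b))
print("recovery error:", np.linalg.norm(x_hat - x_true))
```

Swapping grad_gaussian for the gradient of another surrogate (or reweighted-L1 weights) reproduces the other reformulations; the projection matrix P plays the role that the equality constraints play in the network's state equation.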
