A Globally Stable LPNN Model for Sparse Approximation

The objective of compressive sampling is to recover a sparse vector from an observation vector. This brief describes an analog neural method for achieving this objective. Unlike previous analog neural models, which either resort to an approximation of the $\ell_1$-norm or offer only local convergence, the proposed method avoids any approximation of the $\ell_1$-norm term and is provably capable of leading to the optimum solution. Moreover, its computational complexity is lower than that of the three comparison analog models. Simulation results show that the error performance of the proposed model is comparable to that of several state-of-the-art digital algorithms and analog models, and that it converges faster than the comparison analog neural models.
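
For context, the underlying optimization task is the standard basis-pursuit statement of compressive sampling, written below in the conventional notation (measurement matrix $A \in \mathbb{R}^{m \times n}$ with $m \ll n$, observation vector $b \in \mathbb{R}^{m}$, and unknown sparse vector $x \in \mathbb{R}^{n}$); these symbols are assumed here rather than quoted from the brief:

$$\min_{x \in \mathbb{R}^{n}} \; \|x\|_{1} \quad \text{subject to} \quad Ax = b.$$

In the Lagrange programming neural network (LPNN) framework, such an equality-constrained problem is typically mapped onto the gradient dynamics $\dot{x} = -\nabla_{x} L(x, \lambda)$ and $\dot{\lambda} = \nabla_{\lambda} L(x, \lambda)$ of the Lagrangian $L(x, \lambda) = \|x\|_{1} + \lambda^{\top}(Ax - b)$, whose equilibria satisfy the constrained stationarity conditions. Because $\|x\|_{1}$ is nondifferentiable at zero, earlier LPNN-type models replace it with a smooth approximation; the model proposed here dispenses with that approximation while retaining global stability.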
