Diffusion Sparse Least-Mean Squares Over Networks

We address the problem of distributed in-network estimation of sparse vectors. To exploit the underlying sparsity of the vector of interest, we incorporate ℓ1- and ℓ0-norm constraints into the cost function of the standard diffusion least-mean squares (LMS) algorithm. This is equivalent to adding a zero-attracting term to the update of the LMS-based algorithm, which accelerates the convergence of the zero or near-zero components. Rules for selecting the intensity of the zero-attracting term are derived and verified. Simulation results show that the performance of the proposed schemes depends on the degree of sparsity: provided that suitable intensities of the zero-attracting term are selected, they outperform the standard diffusion LMS when the vector of interest is sparse. In addition, a practical application of the proposed sparse algorithms to spectrum estimation for a narrow-band source is presented.
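To make the zero-attracting idea concrete, the following is a minimal sketch in Python/NumPy of a sparse (ℓ1-regularized) diffusion LMS in the common adapt-then-combine form: each node runs a local LMS update with an extra term proportional to the sign of its estimate (the "zero attractor"), and then averages the intermediate estimates of its neighbours. The function name `za_diffusion_lms`, the step size `mu`, the attractor intensity `rho`, and the uniform combination matrix are illustrative assumptions, not the paper's exact notation or parameter rules.

```python
import numpy as np

def za_diffusion_lms(U, d, A, mu=0.01, rho=1e-4):
    """Zero-attracting diffusion LMS, adapt-then-combine form (illustrative sketch).

    U : (N, T, M) regressor vectors u_{k,i} for N nodes over T iterations
    d : (N, T)    scalar measurements d_k(i)
    A : (N, N)    combination weights; A[l, k] is the weight node k gives to node l,
                  columns assumed to sum to one
    mu  : LMS step size
    rho : intensity of the zero-attracting (l1-penalty) term
    """
    N, T, M = U.shape
    W = np.zeros((N, M))       # local estimates w_k
    Psi = np.zeros_like(W)     # intermediate (adapted) estimates

    for i in range(T):
        # Adaptation: standard LMS step plus the zero attractor -rho*sign(w_k),
        # which pulls small coefficients toward zero
        for k in range(N):
            u = U[k, i]
            e = d[k, i] - u @ W[k]
            Psi[k] = W[k] + mu * e * u - rho * np.sign(W[k])
        # Combination: each node fuses its neighbours' intermediate estimates
        for k in range(N):
            W[k] = A[:, k] @ Psi
    return W

# Example use (hypothetical setup): 10 nodes estimating a sparse vector
# with 4 nonzero taps out of 32, fully connected network with uniform weights
rng = np.random.default_rng(0)
N, T, M = 10, 2000, 32
w0 = np.zeros(M)
w0[rng.choice(M, 4, replace=False)] = rng.standard_normal(4)
U = rng.standard_normal((N, T, M))
d = U @ w0 + 0.01 * rng.standard_normal((N, T))
A = np.full((N, N), 1.0 / N)
w_hat = za_diffusion_lms(U, d, A)
```

The choice of `rho` reflects the trade-off the abstract mentions: too large an intensity biases the nonzero components, while too small an intensity loses the acceleration on the zero components, which is why the paper derives rules for selecting it.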
