RLS sparse system identification using LAR-based situational awareness

In this paper we propose the combination of the recursive least squares (RLS) and least angle regression (LAR) algorithms for nonlinear system identification. In the application of interest, the model possesses a large number of coefficients, of which only a few are nonzero. We use the LAR algorithm together with a geometrical stopping criterion to establish the number and positions of the coefficients to be estimated by the RLS algorithm. The output error is used to indicate model inadequacy and thereby trigger the LAR algorithm. The proposed scheme models intrinsically sparse systems with better accuracy and lower energy consumption than the RLS algorithm alone.
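
The sketch below illustrates the general idea described above: a LAR pass selects which coefficients are active, a standard RLS recursion then adapts only those coefficients, and a rise in the output error triggers a new LAR pass. It is not the authors' implementation. In particular, the geometrical stopping criterion is replaced here by a fixed number of nonzero coefficients, the triggering rule is a plain threshold on a smoothed squared error, scikit-learn's Lars estimator stands in for a hand-written LAR, and all names (lar_support, SparseRLS, threshold, and so on) are illustrative.

import numpy as np
from sklearn.linear_model import Lars


def lar_support(X, d, n_active):
    """Return the indices of the coefficients selected by LAR.

    `n_active` stands in for the paper's geometrical stopping criterion
    (a simplifying assumption of this sketch).
    """
    lar = Lars(fit_intercept=False, n_nonzero_coefs=n_active)
    lar.fit(X, d)
    return np.flatnonzero(lar.coef_)


class SparseRLS:
    """RLS recursion restricted to a support set chosen by LAR."""

    def __init__(self, support, lam=0.99, delta=1e2):
        self.support = np.asarray(support)   # indices of active coefficients
        n = len(self.support)
        self.w = np.zeros(n)                 # active coefficient estimates
        self.P = delta * np.eye(n)           # inverse correlation matrix
        self.lam = lam                       # forgetting factor

    def update(self, x_full, d):
        """One RLS iteration on the reduced regressor; returns the a priori error."""
        x = x_full[self.support]
        e = d - self.w @ x                   # a priori output error
        Px = self.P @ x
        k = Px / (self.lam + x @ Px)         # gain vector
        self.w += k * e
        self.P = (self.P - np.outer(k, Px)) / self.lam
        return e


# Usage sketch: select the support from an initial data buffer, adapt with RLS,
# and re-run LAR when the smoothed output error indicates model inadequacy.
rng = np.random.default_rng(0)
N, M, n_active = 400, 50, 5
X = rng.standard_normal((N, M))
w_true = np.zeros(M)
w_true[rng.choice(M, n_active, replace=False)] = rng.standard_normal(n_active)
d = X @ w_true + 0.01 * rng.standard_normal(N)

support = lar_support(X[:100], d[:100], n_active)   # LAR on the initial buffer
rls = SparseRLS(support)
err_smooth, threshold = 0.0, 0.5                    # hypothetical trigger parameters
for n in range(100, N):
    e = rls.update(X[n], d[n])
    err_smooth = 0.95 * err_smooth + 0.05 * e**2
    if err_smooth > threshold:                      # model inadequacy: rerun LAR
        support = lar_support(X[n - 100:n], d[n - 100:n], n_active)
        rls = SparseRLS(support)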
