Identification of Kernel Regression Model Using Double Norms Method

Conventional approaches to modeling nonlinear dynamic systems from a finite set of measured data struggle to control model structure complexity and can lead to over-fitting. A new method based on double norms is therefore proposed to identify a kernel regression model that guarantees both identification accuracy and model sparsity. To control model structure complexity while improving identification accuracy, the proposed method combines L1-norm structural risk minimization with ideas from L∞-norm minimization of the approximation error to construct the optimization problem for the kernel regression model; the resulting problem can then be solved by simple linear programming. The method has the following remarkable features: 1) identification accuracy is guaranteed by the L∞-norm minimization of the approximation error; 2) model structural complexity is controlled by introducing the L1-norm on structural risk within the framework of support vector regression (SVR), which guarantees model sparsity; 3) the optimality of the method lies in the equilibrium between identification accuracy and sparsity. Finally, the rationality and superiority of the proposed method for identifying nonlinear dynamic systems are demonstrated by experiments.
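A minimal sketch of how such a double-norms formulation could be posed as a linear program: the L1-norm of the kernel expansion coefficients (structural risk, promoting sparsity) is minimized jointly with an L∞-norm bound on the approximation error. The kernel choice, the trade-off constant `C`, and all function names here are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np
from scipy.optimize import linprog

def rbf_kernel(X, Z, gamma=1.0):
    # Gram matrix of a Gaussian (RBF) kernel -- an assumed kernel choice
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_double_norm(X, y, C=10.0, gamma=1.0):
    """Sketch of the double-norms idea:
        minimize  ||alpha||_1 + C * eps
        subject to |y - K @ alpha| <= eps  (elementwise L-infinity bound).
    Splitting alpha = alpha_plus - alpha_minus (both >= 0) makes the
    objective linear, so the whole problem is a linear program."""
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    # decision variables: [alpha_plus (n), alpha_minus (n), eps (1)], all >= 0
    c = np.concatenate([np.ones(2 * n), [C]])
    ones = np.ones((n, 1))
    A_ub = np.vstack([
        np.hstack([K, -K, -ones]),   #  K @ alpha - eps <= y
        np.hstack([-K, K, -ones]),   # -K @ alpha - eps <= -y
    ])
    b_ub = np.concatenate([y, -y])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * (2 * n + 1), method="highs")
    alpha = res.x[:n] - res.x[n:2 * n]
    eps = res.x[-1]
    return alpha, eps
```

Because the L1 objective drives many coefficients exactly to zero, the fitted expansion uses only a subset of the kernel terms, which is the sparsity mechanism the abstract refers to; `C` tunes the equilibrium between accuracy (small `eps`) and sparsity (small `||alpha||_1`).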
