Practical Considerations for Use of Causation Entropy in Sparsity Identification

The selection of model structure is an important step in system identification for nonlinear systems when the model form is not known a priori. This process, sometimes called covariate selection or sparsity identification, involves selecting the terms in the dynamic model and is performed prior to parameter estimation. Previous work has shown the applicability of an information-theoretic quantity known as causation entropy to sparsity identification. While prior work established the overall feasibility of using causation entropy to eliminate extraneous terms in a model, key questions remained regarding practical implementation. This paper builds on that work to explore key practical considerations of causation entropy-based sparsity identification. First, the effect of data size is explored through both analysis and simulation, and general guidance is provided on how much data is necessary to produce accurate causation entropy estimates. Second, the effects of measurement noise and model discretization error are investigated, showing that both degrade causation entropy estimation accuracy, but in opposite ways. These practical effects and trends are illustrated on several example nonlinear systems. Overall, the results show that the causation entropy approach is a practical technique for sparsity identification, particularly in light of the guidelines presented here for data size selection and the handling of error sources.
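To make the quantity concrete: causation entropy from a candidate term X to a state Y, conditioned on the remaining terms Z, can be written as the conditional mutual information C(X→Y|Z) = H(Y'|Z) − H(Y'|X,Z), where Y' denotes the next-step value of Y. A term is retained in the model only if this quantity is meaningfully greater than zero. The sketch below is a minimal illustration using a plug-in (histogram) entropy estimator; the function names, bin count, and estimator choice are assumptions for illustration, not the estimator used in the paper (which, as the abstract notes, is sensitive to data size and noise in practice).

```python
import numpy as np

def entropy(samples, bins=8):
    """Plug-in (histogram) joint entropy estimate in nats.

    `samples` is an (n, d) array of n observations of d variables.
    NOTE: plug-in estimates are biased for small n; this is exactly the
    data-size effect discussed in the paper.
    """
    hist, _ = np.histogramdd(samples, bins=bins)
    p = hist.ravel() / hist.sum()
    p = p[p > 0]                      # drop empty cells (0 * log 0 = 0)
    return -np.sum(p * np.log(p))

def causation_entropy(x, y_next, z, bins=8):
    """C(X -> Y | Z) = H(Y'|Z) - H(Y'|X,Z), via the chain rule H(A|B) = H(A,B) - H(B)."""
    h_y_given_z = entropy(np.column_stack([y_next, z]), bins) \
                - entropy(np.column_stack([z]), bins)
    h_y_given_xz = entropy(np.column_stack([y_next, x, z]), bins) \
                 - entropy(np.column_stack([x, z]), bins)
    return h_y_given_z - h_y_given_xz
```

In a sparsity-identification loop, one would evaluate this quantity for each candidate library term and prune terms whose causation entropy falls below a threshold; with finite data the estimate for a truly irrelevant term is small but positive rather than exactly zero, which is why the threshold (and the data-size guidance above) matters.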
