Identification of source components in multivariate time series by state space modelling

In this paper we study the application of classical methods for the dynamical modelling of time series to the task of decomposing multivariate time series into approximately independent source components, a task that has traditionally been addressed by Factor Analysis (FA) and more recently by Independent Component Analysis (ICA). Based on maximum-likelihood fitting of linear state space models, we develop a new framework for this task in which many of the limitations of standard ICA algorithms can be alleviated. Through comparison of likelihood, or, more precisely, of the Akaike Information Criterion (AIC), we demonstrate that dynamical modelling provides a considerably better description of given data than FA and non-dynamical ICA. The comparison is applied to both simulated and real-world time series, the latter being an electrocardiogram and an electroencephalogram.
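The core quantities the abstract refers to, the likelihood of a linear state space model and the AIC derived from it, can be illustrated with a minimal sketch. The following is not the paper's implementation: the model dimensions, parameter values, and the parameter count used in the AIC are illustrative assumptions. It simulates two independent AR(1) sources mixed into three observed channels, evaluates the Gaussian log-likelihood by the Kalman-filter prediction-error decomposition, and forms AIC = -2 log L + 2k.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- simulate two independent AR(1) sources mixed into 3 channels ---
T = 500
A = np.diag([0.9, -0.5])        # latent dynamics (independent AR(1) sources)
C = rng.normal(size=(3, 2))     # mixing (observation) matrix
Q = np.eye(2) * 0.1             # state noise covariance
R = np.eye(3) * 0.2             # observation noise covariance

x = np.zeros(2)
ys = []
for _ in range(T):
    x = A @ x + rng.multivariate_normal(np.zeros(2), Q)
    ys.append(C @ x + rng.multivariate_normal(np.zeros(3), R))
y = np.asarray(ys)

def kalman_loglik(y, A, C, Q, R):
    """Gaussian log-likelihood via the prediction-error decomposition."""
    n, p = A.shape[0], C.shape[0]
    x = np.zeros(n)             # state mean
    P = np.eye(n)               # state covariance (diffuse-ish prior)
    ll = 0.0
    for t in range(len(y)):
        # predict
        x = A @ x
        P = A @ P @ A.T + Q
        # innovation and its covariance
        v = y[t] - C @ x
        S = C @ P @ C.T + R
        ll += -0.5 * (p * np.log(2 * np.pi)
                      + np.linalg.slogdet(S)[1]
                      + v @ np.linalg.solve(S, v))
        # update
        K = P @ C.T @ np.linalg.inv(S)
        x = x + K @ v
        P = P - K @ C @ P
    return ll

ll = kalman_loglik(y, A, C, Q, R)
# k counts the free parameters under these (diagonal-noise) assumptions
k = 2 + C.size + 2 + 3
aic = -2.0 * ll + 2.0 * k
print(f"log-likelihood = {ll:.1f}, AIC = {aic:.1f}")
```

In the paper's framework this likelihood would be maximized over the model parameters for each candidate model (state space, FA, non-dynamical ICA), and the fitted models would then be ranked by their AIC values.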
