Dimensionality reduction for time series data

Although they do not account for the temporal structure of the data, classic dimensionality reduction techniques such as PCA are widely applied to time series. In this paper, we introduce a factor decomposition specific to time series that builds upon the Bayesian multivariate autoregressive model and hence avoids the assumption that data points are mutually independent. The key is to find a low-rank estimate of the autoregressive matrices. As in probabilistic versions of other factor models, this induces a latent low-dimensional representation of the original data. We discuss some possible generalisations and alternatives, the most relevant being a technique for simultaneous smoothing and dimensionality reduction. To illustrate potential applications, we apply the model to a synthetic data set and to different types of neuroimaging data (EEG and ECoG).
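The core idea can be sketched with a non-Bayesian point estimate: fit a VAR(1) transition matrix by least squares, then truncate its SVD to rank r, which yields a projection of the observations onto a latent low-dimensional subspace. This is a minimal illustration only, not the paper's variational Bayesian procedure; all variable names and the simulation setup are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a T x d series from a VAR(1) with a rank-r transition matrix:
#   x_t = A x_{t-1} + noise   (hypothetical toy data, not the paper's sets)
T, d, r = 500, 10, 2
M = rng.standard_normal((d, r)) @ rng.standard_normal((r, d))
A_true = 0.9 * M / np.linalg.norm(M, 2)   # scale spectral norm for stability
X = np.zeros((T, d))
for t in range(1, T):
    X[t] = A_true @ X[t - 1] + 0.1 * rng.standard_normal(d)

# Least-squares AR estimate: solve X[1:] ~ X[:-1] @ A_hat.T
B, *_ = np.linalg.lstsq(X[:-1], X[1:], rcond=None)
A_hat = B.T

# Rank-r truncation of A_hat via SVD; the right singular vectors define
# a projection W onto the latent subspace driving the dynamics.
U, s, Vh = np.linalg.svd(A_hat)
W = Vh[:r]          # r x d projection matrix
Z = X @ W.T         # T x r latent low-dimensional representation
```

In the paper's Bayesian treatment the low rank arises from a structured prior over the factorised AR matrices rather than a hard SVD truncation, but the induced latent representation plays the same role as `Z` above.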
