Learning conditional independence structure for high-dimensional uncorrelated vector processes

We formulate and analyze a graphical model selection method for inferring the conditional independence graph of a high-dimensional nonstationary Gaussian random process (time series) from a finite-length observation. The observed process samples are assumed to be uncorrelated over time but to have a time-varying marginal distribution. The selection method is based on testing conditional variances obtained for small subsets of process components, which makes it possible to cope with the high-dimensional regime, where the sample size can be (much) smaller than the process dimension. We characterize the sample size required for the proposed selection method to succeed with high probability.
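
The conditional-variance test can be illustrated with a short sketch. The Python code below is not the paper's implementation; it is a minimal sketch that assumes zero-mean samples sharing a common covariance matrix (whereas the paper allows time-varying marginals), and the function names (`cond_var`, `estimate_graph`), the subset-size bound `d_max`, and the threshold `eta` are illustrative choices rather than quantities defined in the paper.

```python
# Minimal sketch of conditional-variance based edge testing for a Gaussian
# graphical model. Assumptions (not from the paper): zero-mean samples with a
# common covariance, a small maximum conditioning-set size d_max, and a
# user-chosen threshold eta.
from itertools import combinations

import numpy as np


def cond_var(C, i, S):
    """Conditional variance Var(x_i | x_S) for a Gaussian vector with covariance C."""
    if len(S) == 0:
        return C[i, i]
    S = list(S)
    CSS = C[np.ix_(S, S)]
    CiS = C[i, S]
    return C[i, i] - CiS @ np.linalg.solve(CSS, CiS)


def estimate_graph(X, d_max=2, eta=0.1):
    """Estimate the conditional independence graph from samples X (shape N x p).

    An edge (i, j) is declared absent if some small conditioning set S
    (|S| <= d_max, not containing i or j) makes the conditional variance of
    x_i essentially insensitive to additionally conditioning on x_j.
    """
    N, p = X.shape
    C = X.T @ X / N  # sample covariance (samples assumed zero mean)
    edges = set()
    for i, j in combinations(range(p), 2):
        others = [k for k in range(p) if k not in (i, j)]
        separated = False
        for size in range(d_max + 1):
            for S in combinations(others, size):
                drop = cond_var(C, i, S) - cond_var(C, i, list(S) + [j])
                if drop < eta:          # adding x_j barely reduces the variance:
                    separated = True    # i and j look conditionally independent given x_S
                    break
            if separated:
                break
        if not separated:
            edges.add((i, j))
    return edges
```

For example, `estimate_graph(X, d_max=2, eta=0.1)` returns an estimated edge set from an N x p sample matrix. Restricting the tests to conditioning sets of size at most `d_max` mirrors the abstract's use of small subsets of process components: each test only involves a low-dimensional submatrix of the covariance estimate, which is what makes the approach viable when the sample size is (much) smaller than the process dimension.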
