Efficient Computation of ℓ1 Regularized Estimates in Gaussian Graphical Models

In this article, I propose an efficient algorithm to compute ℓ1 regularized maximum likelihood estimates in the Gaussian graphical model. These estimators, proposed in an earlier article by Yuan and Lin, conduct parameter estimation and model selection simultaneously and have been shown to enjoy nice properties in both large and finite samples. Computing the estimates, however, can be very challenging in practice because of the high dimensionality and the positive definiteness constraint on the covariance matrix. Taking advantage of recent advances in semidefinite programming, Yuan and Lin suggested a sophisticated interior-point algorithm to solve the optimization problem. Although it runs in polynomial time, the interior-point technique is known not to scale to high-dimensional problems. Alternatively, this article shows that the estimates can be computed by iteratively solving a sequence of ℓ1 regularized quadratic programs. By effectively exploiting the sparsity of the graphical structure, I propose a new algorithm that can be applied to problems of larger scale. When combined with a path-following strategy, the new algorithm can be used to efficiently approximate the entire solution path of the ℓ1 regularized maximum likelihood estimates, which also facilitates the choice of the tuning parameter. I demonstrate the efficacy and usefulness of the proposed algorithm on a few simulations and real datasets.
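To make the optimization concrete: the estimator maximizes the penalized log-likelihood log det Θ − tr(SΘ) − λΣ|θij| over positive definite precision matrices Θ, where S is the sample covariance. The sketch below illustrates the general idea of solving this problem via a sequence of ℓ1 regularized quadratic programs (lasso subproblems), one per row/column of the matrix. This is a minimal block coordinate descent in the spirit of the abstract, closely related to the graphical lasso; it is an illustrative assumption, not the author's exact algorithm, and the function names (`lasso_cd`, `l1_precision_estimate`) are hypothetical.

```python
import numpy as np

def lasso_cd(V, u, lam, max_iter=200, tol=1e-7):
    """Coordinate descent for min_b 0.5*b'Vb - u'b + lam*||b||_1."""
    p = len(u)
    b = np.zeros(p)
    for _ in range(max_iter):
        b_old = b.copy()
        for j in range(p):
            # partial residual excluding coordinate j, then soft-threshold
            r = u[j] - V[j] @ b + V[j, j] * b[j]
            b[j] = np.sign(r) * max(abs(r) - lam, 0.0) / V[j, j]
        if np.max(np.abs(b - b_old)) < tol:
            break
    return b

def l1_precision_estimate(S, lam, max_iter=50, tol=1e-5):
    """Sparse precision matrix via a sequence of l1-regularized QPs.

    Maximizes log det(Theta) - tr(S Theta) - lam * sum |theta_ij|
    by cycling over rows/columns of the working covariance W.
    """
    p = S.shape[0]
    W = S + lam * np.eye(p)  # working estimate of the covariance
    for _ in range(max_iter):
        W_old = W.copy()
        for i in range(p):
            idx = np.arange(p) != i
            V = W[np.ix_(idx, idx)]      # current W with row/col i removed
            b = lasso_cd(V, S[idx, i], lam)  # lasso subproblem for column i
            W[idx, i] = V @ b            # update off-diagonal block
            W[i, idx] = W[idx, i]        # keep W symmetric
        if np.mean(np.abs(W - W_old)) < tol:
            break
    Theta = np.linalg.inv(W)  # precision estimate
    return Theta, W
```

Each inner call solves one ℓ1 regularized quadratic program; cycling until the working covariance stabilizes yields a symmetric, positive definite estimate whose off-diagonal entries are exactly zero for sufficiently large λ, which is what performs the model selection.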

[1] N. L. Johnson, et al. Multivariate Analysis, 1958, Nature.

[2] J. N. R. Jeffers, et al. Graphical Models in Applied Multivariate Statistics, 1990.

[3] Stephen Boyd, et al. MAXDET: Software for Determinant Maximization Problems User's Guide, 1996.

[4] R. Tibshirani. Regression Shrinkage and Selection via the Lasso, 1996.

[5] Stephen P. Boyd, et al. Determinant Maximization with Linear Matrix Inequality Constraints, 1998, SIAM J. Matrix Anal. Appl.

[6] Michael I. Jordan. Graphical Models, 2003.

[7] M. R. Osborne, et al. A New Approach to Variable Selection in Least Squares Problems, 2000.

[8] Graham J. Wills, et al. Introduction to Graphical Modelling, 1995.

[9] R. Tibshirani, et al. Least Angle Regression, 2004, math/0406456.

[10] M. Drton, et al. Model Selection for Gaussian Concentration Graphs, 2004.

[11] N. Meinshausen, et al. Consistent Neighbourhood Selection for Sparse High-Dimensional Graphs with the Lasso, 2004.

[12] P. Bühlmann, et al. Sparse Graphical Gaussian Modeling of the Isoprenoid Gene Network in Arabidopsis thaliana, 2004, Genome Biology.

[13] M. Yuan, et al. On the Nonnegative Garrote Estimator, 2005.

[14] Hongzhe Li, et al. Gradient Directed Regularization for Sparse Gaussian Concentration Graphs, with Applications to Inference of Genetic Networks, 2006, Biostatistics.

[15] Stephen P. Boyd, et al. Convex Optimization, 2004, Algorithms and Theory of Computation Handbook.

[16] N. Meinshausen, et al. High-Dimensional Graphs and Variable Selection with the Lasso, 2006, math/0608017.

[17] Mee Young Park, et al. L1-Regularization Path Algorithm for Generalized Linear Models, 2007.

[18] M. Yuan, et al. Model Selection and Estimation in the Gaussian Graphical Model, 2007.

[19] M. Yuan, et al. On the Non-negative Garrotte Estimator, 2007.