A Convex Formulation for Learning Scale-Free Networks via Submodular Relaxation
[1] M. Yuan, et al. Model selection and estimation in regression with grouped variables, 2006.
[2] Francis R. Bach, et al. Structured Sparse Principal Component Analysis, 2009, AISTATS.
[3] Tom A. B. Snijders, et al. Markov Chain Monte Carlo Estimation of Exponential Random Graph Models, 2002, J. Soc. Struct.
[4] Julien Mairal, et al. Optimization with Sparsity-Inducing Penalties, 2011, Found. Trends Mach. Learn.
[5] P. Pattison, et al. New Specifications for Exponential Random Graph Models, 2006.
[6] Vladimir Filkov, et al. Exploring biological network structure using exponential random graph models, 2007, Bioinform.
[7] Qiang Liu, et al. Learning Scale Free Networks by Reweighted L1 Regularization, 2011, AISTATS.
[8] Shiqian Ma, et al. Sparse Inverse Covariance Selection via Alternating Linearization Methods, 2010, NIPS.
[9] M. West, et al. Sparse graphical models for exploring gene expression data, 2004.
[10] Julien Mairal, et al. Proximal Methods for Sparse Hierarchical Dictionary Learning, 2010, ICML.
[11] Francis R. Bach, et al. Structured sparsity-inducing norms through submodular functions, 2010, NIPS.
[12] Albert, et al. Emergence of scaling in random networks, 1999, Science.
[13] R. Tibshirani, et al. Sparse inverse covariance estimation with the graphical lasso, 2008, Biostatistics.
[14] Stephen P. Boyd, et al. Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers, 2011, Found. Trends Mach. Learn.
[15] Francis R. Bach, et al. Convex Analysis and Optimization with Submodular Functions: a Tutorial, 2010, arXiv.
[16] Z. Lu. Smooth Optimization Approach for Sparse Covariance Selection, 2009, SIAM J. Optim.
[17] Amin Saberi, et al. A Sequential Algorithm for Generating Random Graphs, 2007, Algorithmica.
[18] Satoru Fujishige. Submodular functions and optimization, 1991.
[19] Stephen P. Boyd, et al. Enhancing Sparsity by Reweighted ℓ1 Minimization, 2007, arXiv:0711.1612.