An Information Theoretic Framework for Multi-view Learning
[1] H. Hotelling. The most predictable criterion, 1935.
[2] Thomas M. Cover, et al. Elements of Information Theory, 2005.
[3] Avrim Blum, et al. Combining Labeled and Unlabeled Data with Co-Training, 1998, COLT.
[4] Naftali Tishby, et al. The information bottleneck method, 2000, ArXiv.
[5] Sanjoy Dasgupta, et al. PAC Generalization Bounds for Co-training, 2001, NIPS.
[6] Tong Zhang, et al. Covering Number Bounds of Certain Regularized Linear Function Classes, 2002, J. Mach. Learn. Res..
[7] Gal Chechik, et al. Information Bottleneck for Gaussian Variables, 2003, J. Mach. Learn. Res..
[8] A. Tsybakov, et al. Optimal aggregation of classifiers in statistical learning, 2003.
[9] Steven P. Abney. Understanding the Yarowsky Algorithm, 2004, CL.
[10] John Shawe-Taylor, et al. Two view learning: SVM-2K, Theory and Practice, 2005, NIPS.
[11] Mikhail Belkin, et al. A Co-Regularization Approach to Semi-supervised Learning with Multiple Views, 2005.
[12] Maria-Florina Balcan, et al. A PAC-Style Model for Learning from Labeled and Unlabeled Data, 2005, COLT.
[13] Michael I. Jordan, et al. Convexity, Classification, and Risk Bounds, 2006.
[14] Thomas Gärtner, et al. Efficient co-regularised least squares regression, 2006, ICML.
[15] Sham M. Kakade, et al. Multi-view Regression Via Canonical Correlation Analysis, 2007, COLT.
[16] Maria-Florina Balcan, et al. Open Problems in Efficient Semi-supervised PAC Learning, 2007, COLT.
[18] Ingo Steinwart, et al. Fast rates for support vector machines using Gaussian kernels, 2007, ArXiv:0708.1838.
[19] Peter L. Bartlett, et al. The Rademacher Complexity of Co-Regularized Kernel Classes, 2007, AISTATS.