Multi-view Positive and Unlabeled Learning
[1] Dell Zhang, et al. Learning classifiers without negative examples: A reduction approach, 2008, Third International Conference on Digital Information Management.
[2] Qiang Yang, et al. One-Class Collaborative Filtering, 2008, Eighth IEEE International Conference on Data Mining.
[3] Mikhail Belkin, et al. Manifold Regularization: A Geometric Framework for Learning from Labeled and Unlabeled Examples, 2006, J. Mach. Learn. Res.
[4] Rémi Gilleron, et al. Text Classification from Positive and Unlabeled Examples, 2002.
[5] Philip S. Yu, et al. Partially Supervised Classification of Text Documents, 2002, ICML.
[6] Takafumi Kanamori, et al. A Least-squares Approach to Direct Importance Estimation, 2009, J. Mach. Learn. Res.
[7] Bernhard Schölkopf, et al. Estimating the Support of a High-Dimensional Distribution, 2001, Neural Computation.
[8] Philip S. Yu, et al. Positive Unlabeled Learning for Data Stream Classification, 2009, SDM.
[9] See-Kiong Ng, et al. Ensemble Based Positive Unlabeled Learning for Time Series Classification, 2012, DASFAA.
[10] Zhiwu Lu, et al. Image categorization with spatial mismatch kernels, 2009, IEEE Conference on Computer Vision and Pattern Recognition.
[11] Robert H. Halstead, et al. Matrix Computations, 2011, Encyclopedia of Parallel Computing.
[12] Anthony Widjaja, et al. Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond, 2003, IEEE Transactions on Neural Networks.
[13] Motoaki Kawanabe, et al. Direct Importance Estimation with Model Selection and Its Application to Covariate Shift Adaptation, 2007, NIPS.
[14] Masashi Sugiyama, et al. Density-ratio matching under the Bregman divergence: a unified framework of density-ratio estimation, 2012.
[15] Philip S. Yu, et al. Positive and Unlabeled Learning for Graph Classification, 2011, IEEE 11th International Conference on Data Mining.
[16] Mikhail Belkin, et al. A Co-Regularization Approach to Semi-supervised Learning with Multiple Views, 2005.
[17] Rémi Gilleron, et al. Learning from positive and unlabeled examples, 2000, Theor. Comput. Sci.
[18] Takafumi Kanamori, et al. Statistical analysis of kernel-based least-squares density-ratio estimation, 2012, Machine Learning.
[19] François Denis, et al. PAC Learning from Positive Statistical Queries, 1998, ALT.
[20] Gene H. Golub, et al. Matrix Computations (3rd ed.), 1996.
[21] Philip S. Yu, et al. Building text classifiers using positive and unlabeled examples, 2003, Third IEEE International Conference on Data Mining.
[22] F. Denis. Classification and Co-training from Positive and Unlabeled Examples, 2003.
[23] Vikas Sindhwani, et al. An RKHS for multi-view learning and manifold co-regularization, 2008, ICML.
[24] Nello Cristianini, et al. An Introduction to Support Vector Machines and Other Kernel-based Learning Methods, 2000.
[25] Xiaoli Li, et al. Learning to Classify Texts Using Positive and Unlabeled Data, 2003, IJCAI.
[26] John Shawe-Taylor, et al. Canonical Correlation Analysis: An Overview with Application to Learning Methods, 2004, Neural Computation.
[27] Avrim Blum, et al. Combining Labeled and Unlabeled Data with Co-Training, 1998, COLT.
[28] Masashi Sugiyama, et al. On Information-Maximization Clustering: Tuning Parameter Selection and Analytic Solution, 2011, ICML.
[29] Charles Elkan, et al. Learning classifiers from only positive and unlabeled data, 2008, KDD.
[30] Takafumi Kanamori, et al. Inlier-Based Outlier Detection via Direct Density Ratio Estimation, 2008, Eighth IEEE International Conference on Data Mining.