Local Rademacher Complexity Machine
[1] Davide Anguita, et al. Maximal Discrepancy vs. Rademacher Complexity for error estimation, 2011, ESANN.
[2] Przemyslaw Klesk, et al. Sets of approximating functions with finite Vapnik-Chervonenkis dimension for nearest-neighbors algorithms, 2011, Pattern Recognit. Lett.
[3] Peter L. Bartlett, et al. Rademacher and Gaussian Complexities: Risk Bounds and Structural Results, 2003, J. Mach. Learn. Res.
[4] Davide Anguita, et al. In-Sample and Out-of-Sample Model Selection and Error Estimation for Support Vector Machines, 2012, IEEE Transactions on Neural Networks and Learning Systems.
[5] Peter L. Bartlett, et al. Local Rademacher complexities, 2005, arXiv:math/0508275.
[6] Davide Anguita, et al. A Deep Connection Between the Vapnik-Chervonenkis Entropy and the Rademacher Complexity, 2014, IEEE Transactions on Neural Networks and Learning Systems.
[7] Ryan M. Rifkin, et al. In Defense of One-Vs-All Classification, 2004, J. Mach. Learn. Res.
[8] Constantin F. Aliferis, et al. A comprehensive evaluation of multicategory classification methods for microarray gene expression cancer diagnosis, 2004, Bioinform.
[9] Davide Anguita, et al. Global Rademacher Complexity Bounds: From Slow to Fast Convergence Rates, 2015, Neural Processing Letters.
[10] Davide Anguita, et al. Random Forests Model Selection, 2016, ESANN.
[11] Zaïd Harchaoui, et al. Rademacher Complexity Bounds for a Penalized Multiclass Semi-Supervised Algorithm, 2016, J. Artif. Intell. Res.
[12] Mansooreh Mollaghasemi, et al. Local Rademacher Complexity-based Learning Guarantees for Multi-Task Learning, 2016, J. Mach. Learn. Res.
[13] Davide Anguita, et al. Learning with few bits on small-scale devices: From regularization to energy efficiency, 2014, ESANN.
[14] Dacheng Tao, et al. Algorithmic Stability and Hypothesis Complexity, 2017, ICML.
[15] André Carlos Ponce de Leon Ferreira de Carvalho, et al. A review on the combination of binary classifiers in multiclass problems, 2008, Artificial Intelligence Review.
[16] Vladimir Koltchinskii. Rademacher penalties and structural risk minimization, 2001, IEEE Trans. Inf. Theory.
[17] Nello Cristianini, et al. Kernel Methods for Pattern Analysis, 2003, ICTAI.
[18] Marius Kloft, et al. Learning Kernels Using Local Rademacher Complexity, 2013, NIPS.
[19] Chih-Jen Lin, et al. Asymptotic Behaviors of Support Vector Machines with Gaussian Kernel, 2003, Neural Computation.
[20] Alessandro Sperduti, et al. Measuring the expressivity of graph kernels through Statistical Learning Theory, 2017, Neurocomputing.
[21] Thomas Gärtner, et al. On Graph Kernels: Hardness Results and Efficient Alternatives, 2003, COLT.
[22] Alessandro Sperduti, et al. Learning With Kernels: A Local Rademacher Complexity-Based Analysis With Application to Graph Kernels, 2018, IEEE Transactions on Neural Networks and Learning Systems.
[23] Lixin Ding, et al. Local Rademacher complexity bounds based on covering numbers, 2015, Neurocomputing.
[24] Corinna Cortes, et al. Support-Vector Networks, 1995, Machine Learning.
[25] Vladimir Vapnik. Statistical learning theory, 1998.
[26] Davide Anguita, et al. In-sample Model Selection for Trimmed Hinge Loss Support Vector Machine, 2012, Neural Processing Letters.
[27] Nitish Srivastava, et al. Dropout: a simple way to prevent neural networks from overfitting, 2014, J. Mach. Learn. Res.
[28] Dacheng Tao, et al. Local Rademacher Complexity for Multi-Label Learning, 2014, IEEE Transactions on Image Processing.
[29] Peter L. Bartlett, et al. Localized Rademacher Complexities, 2002, COLT.
[30] Brendan J. Frey, et al. Are Random Forests Truly the Best Classifiers?, 2016, J. Mach. Learn. Res.
[31] Vasilis Syrgkanis. A Sample Complexity Measure with Applications to Learning Optimal Auctions, 2017, NIPS.
[32] Davide Anguita, et al. Local Rademacher Complexity: Sharper risk bounds with and without unlabeled samples, 2015, Neural Networks.
[33] Michaël Aupetit. Nearly homogeneous multi-partitioning with a deterministic generator, 2009, Neurocomputing.
[34] Shiliang Sun, et al. A review of optimization methodologies in support vector machines, 2011, Neurocomputing.
[35] Massimiliano Pontil, et al. Support Vector Machines: Theory and Applications, 2001, Machine Learning and Its Applications.
[36] Luca Oneto, et al. Model selection and error estimation without the agonizing pain, 2018, WIREs Data Mining Knowl. Discov.
[37] S. Sathiya Keerthi, et al. Which Is the Best Multiclass SVM Method? An Empirical Study, 2005, Multiple Classifier Systems.
[38] Vladimir Vapnik, Alexey Chervonenkis. On the uniform convergence of relative frequencies of events to their probabilities, 1971.
[39] Vladimir Koltchinskii. Local Rademacher complexities and oracle inequalities in risk minimization, 2006, arXiv:0708.0083.
[40] Nello Cristianini, et al. Classification using String Kernels, 2000.
[41] Steve Hanneke, et al. Localization of VC Classes: Beyond Local Rademacher Complexities, 2016, ALT.
[42] Davide Anguita, et al. A local Vapnik-Chervonenkis complexity, 2016, Neural Networks.
[43] Bernhard Schölkopf, et al. The Kernel Trick for Distances, 2000, NIPS.
[44] Lorenzo Rosasco, et al. Are Loss Functions All the Same?, 2004, Neural Computation.
[45] Davide Anguita, et al. A new method for multiclass support vector machines, 2004, 2004 IEEE International Joint Conference on Neural Networks (IEEE Cat. No.04CH37541).