Global Rademacher Complexity Bounds: From Slow to Fast Convergence Rates
S. Ridella, L. Oneto, D. Anguita, A. Ghio