Comment: The Two Styles of VC Bounds
First of all, I would like to congratulate Léon Bottou on an excellent paper, containing a lucid discussion of the sources of looseness in the Vapnik–Chervonenkis bounds on the generalization error derived in Vapnik and Chervonenkis (Proc. USSR Acad. Sci. 181(4):781–783, 1968) and Vapnik and Chervonenkis (Theory Probab. Appl. 16(2):264–280, 1971). I will comment only on the paper in this volume (Chap. 9), although most of my comments also apply to the other papers co-authored by Léon and by Konstantin Vorontsov that are mentioned in Léon's bibliography.
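For readers less familiar with the bounds being discussed, it may help to recall one standard form of the Vapnik–Chervonenkis generalization bound (as presented, e.g., in Vapnik's Statistical Learning Theory, 1998; the exact constants vary between statements):

```latex
% With probability at least 1 - \eta over an i.i.d. sample of size n,
% simultaneously for all functions f in a class of VC dimension h:
R(f) \;\le\; R_{\mathrm{emp}}(f)
  \;+\; \sqrt{\frac{h\left(\ln\frac{2n}{h} + 1\right) - \ln\frac{\eta}{4}}{n}},
% where R(f) is the expected error and R_emp(f) the empirical error on the sample.
```

The looseness Bottou analyzes arises in part because such bounds hold uniformly over the whole class and for the worst-case distribution, which is rarely the situation faced in practice.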
[1] V. Vovk, A. Gammerman, G. Shafer. Algorithmic Learning in a Random World. Springer, 2005.
[2] N. Littlestone, M. K. Warmuth. Relating Data Compression and Learnability, 2003.
[3] V. N. Vapnik, A. Ya. Chervonenkis. On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities. Theory of Probability and Its Applications 16(2):264–280, 1971.
[4] S. Floyd, M. K. Warmuth. Sample Compression, Learnability, and the Vapnik–Chervonenkis Dimension. Machine Learning, 1995.
[5] V. N. Vapnik. Statistical Learning Theory. Wiley, 1998.