Characterizations of learnability for classes of {0, …, n}-valued functions
[1] J. Lamperti. On Convergence of Stochastic Processes, 1962.
[2] Vladimir Vapnik and Alexey Chervonenkis. On the uniform convergence of relative frequencies of events to their probabilities, 1971.
[3] Mark G. Karpovsky, et al. Coordinate density of sets of vectors, 1978, Discrete Mathematics.
[4] J. Michael Steele, et al. Existence of Submatrices with All Possible Columns, 1978, Journal of Combinatorial Theory, Series A.
[5] Noga Alon, et al. On the density of sets of vectors, 1983, Discrete Mathematics.
[6] Leslie G. Valiant. A theory of the learnable, 1984, STOC '84.
[7] R. Dudley. Universal Donsker Classes and Metric Entropy, 1987.
[8] Leslie G. Valiant, et al. A general lower bound on the number of examples needed for learning, 1988, COLT '88.
[9] Balas K. Natarajan. On learning sets and functions, 1989, Machine Learning.
[10] Vladimir Vapnik, et al. Inductive principles of the search for empirical dependences (methods based on weak convergence of probability measures), 1989, COLT '89.
[11] David Haussler, et al. Learnability and the Vapnik-Chervonenkis dimension, 1989, JACM.
[12] D. Pollard. Empirical Processes: Theory and Applications, 1990.
[13] IEEE Expert staff. Machine Learning: A Theoretical Approach, 1992, IEEE Expert.
[14] David Haussler. Decision Theoretic Generalizations of the PAC Model for Neural Net and Other Learning Applications, 1992, Information and Computation.
[15] John Shawe-Taylor, et al. Bounding Sample Size with the Vapnik-Chervonenkis Dimension, 1993, Discrete Applied Mathematics.
[16] Philip M. Long, et al. Fat-shattering and the learnability of real-valued functions, 1994, COLT '94.