The VC Dimension and Pseudodimension of Two-Layer Neural Networks with Discrete Inputs
[1] Leslie G. Valiant. A theory of the learnable. STOC '84, 1984.
[2] Eric B. Baum and David Haussler. What size net gives valid generalization? Neural Computation, 1989.
[3] Anselm Blumer, Andrzej Ehrenfeucht, David Haussler, and Manfred K. Warmuth. Learnability and the Vapnik-Chervonenkis dimension. JACM, 1989.
[4] David Haussler. Decision theoretic generalizations of the PAC model for neural net and other learning applications. Information and Computation, 1992.
[5] Peter L. Bartlett. Vapnik-Chervonenkis dimension bounds for two- and three-layer networks. Neural Computation, 1993.
[6] Paul W. Goldberg and Mark Jerrum. Bounding the Vapnik-Chervonenkis dimension of concept classes parameterized by real numbers. COLT '93, 1993.
[7] Akito Sakurai. Tighter bounds of the VC-dimension of three layer networks. 1993.
[8] Wolfgang Maass. Bounds for the computational power and learning complexity of analog neural nets. SIAM J. Comput., 1993.
[9] Angus Macintyre and Eduardo D. Sontag. Finiteness results for sigmoidal “neural” networks. STOC '93, 1993.
[10] Marek Karpinski and Angus Macintyre. Polynomial bounds for VC dimension of sigmoidal neural networks. STOC '95, 1995.