The Vapnik-Chervonenkis Dimension: Information versus Complexity in Learning
[1] Thomas M. Cover. Geometrical and Statistical Properties of Systems of Linear Inequalities with Applications in Pattern Recognition. IEEE Trans. Electron. Comput., 1965.
[2] Vladimir Vapnik and Alexey Chervonenkis. On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities. Theory of Probability and Its Applications, 1971.
[3] Leslie G. Valiant. A Theory of the Learnable. STOC '84, 1984.
[4] David E. Rumelhart, Geoffrey E. Hinton, and Ronald J. Williams. Learning Internal Representations by Error Propagation. 1986.
[5] David Haussler et al. Classifying Learnable Geometric Concepts with the Vapnik-Chervonenkis Dimension. STOC '86, 1986.
[6] J. Stephen Judd. On the Complexity of Loading Shallow Neural Networks. J. Complex., 1988.
[7] Eric B. Baum and David Haussler. What Size Net Gives Valid Generalization? Neural Computation, 1989.
[8] Eric B. Baum. A Proposal for More Powerful Learning Algorithms. Neural Computation, 1989.
[9] Terrence J. Sejnowski et al. Neural Network Models of Sensory Integration for Improved Vowel Recognition. Proc. IEEE, 1990.
[10] A. G. Hoffmann. Connectionist Functionality and the Emergent Network Behavior. Neurocomputing, 1991.
[11] Yaser S. Abu-Mostafa. Learning from Hints in Neural Networks. J. Complex., 1990.