Decision Theoretic Generalizations of the PAC Model for Neural Net and Other Learning Applications
[1] John Shawe-Taylor, et al. A Result of Vapnik with Applications, 1993, Discret. Appl. Math.
[2] Michael Kearns, et al. Bounds on the sample complexity of Bayesian learning using information theory and the VC dimension, 1992, IJCNN International Joint Conference on Neural Networks.
[3] Balas K. Natarajan, et al. Probably Approximate Learning Over Classes of Distributions, 1992, SIAM J. Comput.
[4] Robert H. Sloan, et al. Corrigendum to types of noise in data for concept learning, 1992, COLT '92.
[5] David Haussler, et al. Calculation of the learning curve of Bayes optimal classification algorithm for learning a perceptron with noise, 1991, COLT '91.
[6] Andrew R. Barron, et al. Minimum complexity density estimation, 1991, IEEE Trans. Inf. Theory.
[7] A. Dembo, et al. On Uniform Convergence for Dependent Processes, 1991, Proceedings of the 1991 IEEE International Symposium on Information Theory.
[8] M. Opper, et al. Generalization performance of Bayes optimal classification algorithm for learning a perceptron, 1991, Physical Review Letters.
[9] Wray L. Buntine, et al. Bayesian Back-Propagation, 1991, Complex Syst.
[10] Manfred K. Warmuth, et al. On the Computational Complexity of Approximating Distributions by Probabilistic Automata, 1990, COLT '90.
[11] Robert E. Schapire, et al. Efficient distribution-free learning of probabilistic concepts, 1990, Proceedings of the 31st Annual Symposium on Foundations of Computer Science.
[12] Sholom M. Weiss, et al. Computer Systems That Learn, 1990.
[13] Halbert White, et al. Connectionist nonparametric regression: Multilayer feedforward networks can learn arbitrary mappings, 1990, Neural Networks.
[14] H. Sompolinsky, et al. Learning from examples in large neural networks, 1990, Physical Review Letters.
[15] Kenji Yamanishi, et al. A learning criterion for stochastic rules, 1990, COLT '90.
[16] Andrew R. Barron, et al. Information-theoretic asymptotics of Bayes methods, 1990, IEEE Trans. Inf. Theory.
[17] D. Lindley. The 1988 Wald Memorial Lectures: The Present Position in Bayesian Statistics, 1990.
[18] Tomaso A. Poggio, et al. Extensions of a Theory of Networks for Approximation and Learning, 1990, NIPS.
[19] Wray L. Buntine, et al. A theory of learning classification rules, 1990.
[20] D. Pollard. Empirical Processes: Theory and Applications, 1990.
[21] David Haussler, et al. Decision Theoretic Generalizations of the PAC Learning Model, 1990, ALT.
[22] David E. Rumelhart, et al. Predicting the Future: a Connectionist Approach, 1990, Int. J. Neural Syst.
[23] Vijaykumar Gullapalli, et al. A stochastic reinforcement learning algorithm for learning real-valued functions, 1990, Neural Networks.
[24] Michael I. Jordan, et al. Advances in Neural Information Processing Systems 30, 1995.
[25] A. Barron, et al. Statistical properties of artificial neural networks, 1989, Proceedings of the 28th IEEE Conference on Decision and Control.
[26] Halbert White, et al. Learning in Artificial Neural Networks: A Statistical Perspective, 1989, Neural Computation.
[27] Naftali Tishby, et al. Consistent inference of probabilities in layered networks: predictions and generalizations, 1989, International Joint Conference on Neural Networks.
[28] Vladimir Vapnik, et al. Inductive principles of the search for empirical dependences (methods based on weak convergence of probability measures), 1989, COLT '89.
[29] David Haussler, et al. Learnability and the Vapnik-Chervonenkis dimension, 1989, JACM.
[30] John Moody, et al. Fast Learning in Networks of Locally-Tuned Processing Units, 1989, Neural Computation.
[31] Kumpati S. Narendra, et al. Learning Automata: An Introduction, 1989.
[32] David E. Rumelhart, et al. Product Units: A Computationally Powerful and Biologically Plausible Extension to Backpropagation Networks, 1989, Neural Computation.
[33] B. K. Natarajan, et al. Some results on learning, 1989.
[34] S. Kulkarni, et al. On metric entropy, Vapnik-Chervonenkis dimension, and learnability for a class of distributions, 1989.
[35] Steven J. Nowlan, et al. Maximum Likelihood Competitive Learning, 1989, NIPS.
[36] David Haussler, et al. What Size Net Gives Valid Generalization?, 1989, Neural Computation.
[37] Yann LeCun, et al. Optimal Brain Damage, 1989, NIPS.
[38] Alon Itai, et al. Learnability by fixed distributions, 1988, COLT '88.
[39] David Haussler, et al. Predicting {0,1}-functions on randomly drawn points, 1988, COLT '88.
[40] George Shackelford, et al. Learning k-DNF with noise in the attributes, 1988, COLT '88.
[41] David Haussler, et al. Equivalence of models for polynomial learnability, 1988, COLT '88.
[42] Leslie G. Valiant, et al. A general lower bound on the number of examples needed for learning, 1988, COLT '88.
[43] Nathan Linial, et al. Results on learnability and the Vapnik-Chervonenkis dimension, 1988, Proceedings of the 29th Annual Symposium on Foundations of Computer Science.
[44] David Haussler, et al. Quantifying Inductive Bias: AI Learning Algorithms and Valiant's Learning Framework, 1988, Artif. Intell.
[45] Luc Devroye, et al. Automatic Pattern Recognition: A Study of the Probability of Error, 1988, IEEE Trans. Pattern Anal. Mach. Intell.
[46] J. Berger. Statistical Decision Theory and Bayesian Analysis, 1988.
[47] Emo Welzl, et al. Partition trees for triangle counting and other range searching problems, 1988, SCG '88.
[48] Prasad Tadepalli, et al. Two New Frameworks for Learning, 1988, ML.
[49] David Haussler, et al. ε-nets and simplex range queries, 1987, Discret. Comput. Geom.
[50] N. Littlestone. Learning Quickly When Irrelevant Attributes Abound: A New Linear-Threshold Algorithm, 1987, 28th Annual Symposium on Foundations of Computer Science.
[51] R. Dudley. Universal Donsker Classes and Metric Entropy, 1987.
[52] K. Alexander, et al. Rates of growth and sample moduli for weighted empirical processes indexed by sets, 1987.
[53] D. Pollard, et al. U-Processes: Rates of Convergence, 1987.
[54] Leslie G. Valiant, et al. On the learnability of Boolean formulae, 1987, STOC.
[55] Herbert Edelsbrunner, et al. Algorithms in Combinatorial Geometry, 1987, EATCS Monographs in Theoretical Computer Science.
[56] Lawrence D. Jackel, et al. Large Automatic Learning, Rule Extraction, and Generalization, 1987, Complex Syst.
[57] J. Rissanen. Stochastic Complexity and Modeling, 1986.
[58] David Haussler, et al. Epsilon-nets and simplex range queries, 1986, SCG '86.
[59] James L. McClelland, et al. Parallel distributed processing: explorations in the microstructure of cognition, vol. 1: foundations, 1986.
[60] P. Anandan, et al. Pattern-recognizing stochastic learning automata, 1985, IEEE Transactions on Systems, Man, and Cybernetics.
[61] Leslie G. Valiant, et al. A theory of the learnable, 1984, STOC '84.
[62] B. Mandelbrot. The Fractal Geometry of Nature, 1982.
[63] D. Pollard. Convergence of Stochastic Processes, 1984.
[64] R. Dudley. A course on empirical processes, 1984.
[65] P. Assouad. Densité et dimension, 1983.
[66] J. D. Farmer, et al. Information Dimension and the Probabilistic Structure of Chaos, 1982.
[67] J. Yorke, et al. Dimension of chaotic attractors, 1982.
[68] Richard M. Dudley, et al. Some special Vapnik-Chervonenkis classes, 1981, Discret. Math.
[69] R. Dudley. Central Limit Theorems for Empirical Measures, 1978.
[70] Leslie G. Valiant, et al. Fast probabilistic algorithms for Hamiltonian circuits and matchings, 1977, STOC '77.
[71] Richard O. Duda, et al. Pattern classification and scene analysis, 1974, A Wiley-Interscience publication.
[72] Norbert Sauer, et al. On the Density of Families of Sets, 1972, J. Comb. Theory A.
[73] Vladimir Vapnik, Alexey Chervonenkis. On the uniform convergence of relative frequencies of events to their probabilities, 1971.
[74] Thomas S. Ferguson. Mathematical Statistics: A Decision Theoretic Approach, 1967.
[75] George Finlay Simmons, et al. Introduction to Topology and Modern Analysis, 1963.
[76] A. Kolmogorov, et al. ε-entropy and ε-capacity of sets in functional spaces, 1961.
[77] L. Breiman, et al. Classification and Regression Trees, 1984.