Chapter 24 - Information Based Learning
Badong Chen | José C. Príncipe | Luis G. Sanchez Giraldo
[1] Xiaohong Jiang, et al. Generalized Two-Hop Relay for Flexible Delay Control in MANETs, 2012, IEEE/ACM Transactions on Networking.
[2] Badong Chen, et al. Survival Information Potential: A New Criterion for Adaptive System Training, 2012, IEEE Transactions on Signal Processing.
[3] Gavin Brown, et al. Conditional Likelihood Maximisation: A Unifying Framework for Information Theoretic Feature Selection, 2012, J. Mach. Learn. Res.
[4] José Carlos Príncipe, et al. A reproducing kernel Hilbert space formulation of the principle of relevant information, 2011, IEEE International Workshop on Machine Learning for Signal Processing.
[5] José Carlos Príncipe, et al. An efficient rank-deficient computation of the Principle of Relevant Information, 2011, IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP).
[6] Weifeng Liu, et al. An Information Theoretic Approach of Designing Sparse Kernel Adaptive Filters, 2009, IEEE Transactions on Neural Networks.
[7] Deniz Erdogmus, et al. Information Theoretic Learning, 2005, Encyclopedia of Artificial Intelligence.
[8] José Carlos Príncipe, et al. A Reproducing Kernel Hilbert Space Framework for Information-Theoretic Learning, 2008, IEEE Transactions on Signal Processing.
[9] Robert Jenssen, et al. Information cut for clustering using a gradient descent approach, 2007, Pattern Recognit.
[10] Robert Jenssen, et al. Kernel Maximum Entropy Data Transformation and an Enhanced Spectral Clustering Algorithm, 2006, NIPS.
[11] J. C. Principe, et al. From linear adaptive filtering to nonlinear information processing - The design and analysis of information processing systems, 2006, IEEE Signal Processing Magazine.
[12] Robert Jenssen, et al. Some Equivalences between Kernel Methods and Information Theoretic Methods, 2006, J. VLSI Signal Process.
[13] Rabab K. Ward, et al. From Linear Adaptive Filtering to Nonlinear Information Processing, 2006.
[14] Erwin Lutwak, et al. Cramér-Rao and moment-entropy inequalities for Renyi entropy and generalized Fisher information, 2005, IEEE Transactions on Information Theory.
[15] Yunmei Chen, et al. Cumulative residual entropy: a new measure of information, 2004, IEEE Transactions on Information Theory.
[16] Deniz Erdogmus, et al. Beyond second-order statistics for learning: A pairwise interaction model for entropy estimation, 2002, Natural Computing.
[17] Deniz Erdogmus, et al. Blind source separation using Renyi's α-marginal entropies, 2002, Neurocomputing.
[18] Deniz Erdogmus, et al. Generalized information potential criterion for adaptive system training, 2002, IEEE Trans. Neural Networks.
[19] Deniz Erdogmus, et al. An error-entropy minimization algorithm for supervised training of nonlinear adaptive systems, 2002, IEEE Trans. Signal Process.
[20] Deniz Erdogmus, et al. Blind source separation using Renyi's mutual information, 2001, IEEE Signal Processing Letters.
[21] John W. Fisher, et al. Learning from Examples with Information Theoretic Criteria, 2000, J. VLSI Signal Process.
[22] Naftali Tishby, et al. The information bottleneck method, 2000, ArXiv.
[23] Naftali Tishby, et al. Data Clustering by Markovian Relaxation and the Information Bottleneck Method, 2000, NIPS.
[24] B. Schölkopf, et al. Fisher discriminant analysis with kernels, 1999, Neural Networks for Signal Processing IX: Proceedings of the 1999 IEEE Signal Processing Society Workshop.
[25] L. Györfi, et al. Nonparametric entropy estimation: An overview, 1997.
[26] Terrence J. Sejnowski, et al. An Information-Maximization Approach to Blind Separation and Blind Deconvolution, 1995, Neural Computation.
[27] H. B. Barlow, et al. Unsupervised Learning, 1989, Neural Computation.
[28] Ralph Linsker, et al. Self-organization in a perceptual network, 1988, Computer.
[29] Paul Kalata, et al. Linear prediction, filtering, and smoothing: An information-theoretic approach, 1979, Inf. Sci.
[30] E. Pfaffelhuber. Learning and information theory, 1972, The International Journal of Neuroscience.
[31] Edwin B. Stear, et al. Entropy analysis of estimating systems, 1970, IEEE Trans. Inf. Theory.
[32] E. Nadaraya. On Estimating Regression, 1964.
[33] E. Parzen. On Estimation of a Probability Density Function and Mode, 1962.
[34] E. Jaynes. Information Theory and Statistical Mechanics, 1957.
[35] N. Aronszajn. La théorie des noyaux reproduisants et ses applications, Première Partie, 1943, Mathematical Proceedings of the Cambridge Philosophical Society.
[36] J. Mercer. Functions of Positive and Negative Type, and their Connection with the Theory of Integral Equations, 1909.