An introduction to information theoretic learning
[1] J. Príncipe, et al. Nonlinear extensions to the minimum average correlation energy filter, 1997.
[2] Shun-ichi Amari, et al. Adaptive Online Learning Algorithms for Blind Separation: Maximum Entropy and Minimum Mutual Information, 1997, Neural Computation.
[3] A. Rényi. On Measures of Entropy and Information, 1961.
[4] Jagat Narain Kapur, et al. Measures of information and their applications, 1994.
[5] Geoffrey E. Hinton, et al. Learning representations by back-propagating errors, 1986, Nature.
[6] John W. Fisher, et al. Pose estimation in SAR using an information theoretic criterion, 1998, Defense, Security, and Sensing.
[7] J. Cardoso. Infomax and maximum likelihood for blind source separation, 1997, IEEE Signal Processing Letters.
[8] Terrence J. Sejnowski, et al. Blind separation and blind deconvolution: an information-theoretic approach, 1995, 1995 International Conference on Acoustics, Speech, and Signal Processing.
[9] E. Parzen. On Estimation of a Probability Density Function and Mode, 1962.
[10] Paul A. Viola, et al. Empirical Entropy Manipulation for Real-World Problems, 1995, NIPS.
[11] Ralph Linsker, et al. Self-organization in a perceptual network, 1988, Computer.
[12] R. Hartley. Transmission of information, 1928.