An introduction to information theoretic learning

Learning from examples has traditionally been done with correlation or with the mean square error (MSE) criterion, in spite of the fact that learning is intrinsically related to the extraction of information from examples. The difficulty is that Shannon's (1948) information entropy, while resting on a sound theoretical foundation, is not easy to implement in a learning-from-examples scenario. In this paper Renyi's definition of entropy (1976) is used instead and integrated with a nonparametric estimator of the probability density function (the Parzen window). Experimental results on blind source separation confirm the theory. Although the work is preliminary, the "information potential" method is rather general and should have many applications.
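
To make the idea concrete, the sketch below estimates Renyi's quadratic entropy, H2(X) = -log V(X), where the "information potential" V(X) follows from a Parzen estimate with Gaussian kernels: V(X) = (1/N^2) sum_ij G(x_i - x_j; 2*sigma^2), since the convolution of two Gaussian kernels of variance sigma^2 is a Gaussian of variance 2*sigma^2. This is a minimal illustration, not the paper's implementation; the function names, the 1-D restriction, and the kernel width sigma are assumptions made for the example.

    import numpy as np

    def information_potential(x, sigma=1.0):
        """Parzen-window estimate of the information potential
        V(X) = (1/N^2) * sum_ij G(x_i - x_j; 2*sigma^2)
        for 1-D samples x with a Gaussian kernel of width sigma."""
        x = np.asarray(x, dtype=float)
        n = x.size
        diffs = x[:, None] - x[None, :]   # all pairwise differences x_i - x_j
        var = 2.0 * sigma ** 2            # convolving two kernels doubles the variance
        g = np.exp(-diffs ** 2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)
        return g.sum() / n ** 2

    def renyi_quadratic_entropy(x, sigma=1.0):
        """Renyi's quadratic entropy H2(X) = -log V(X)."""
        return -np.log(information_potential(x, sigma))

    # Usage: entropy estimate for 500 standard-normal samples
    rng = np.random.default_rng(0)
    print(renyi_quadratic_entropy(rng.normal(size=500), sigma=0.5))

Because V(X) is a sum of pairwise kernel evaluations, it is differentiable in the samples, which is what makes it usable as a training criterion in place of MSE.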