Informational Energy Kernel for LVQ

We describe a kernel method that uses the maximization of Onicescu's informational energy as a criterion for computing the relevances of input features. This adaptive relevance determination is used in combination with the neural-gas and generalized relevance LVQ algorithms. Because the optimization function is quadratic (an L2-type criterion), its gradient is linear and therefore cheaper to compute. We obtain an approximation formula similar to that of the mutual-information-based method, but derived in a simpler way.

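As a rough sketch of the quantities involved (not a reconstruction of the paper's exact algorithm), the Python below illustrates a Parzen-window estimate of Onicescu's informational energy together with the relevance-weighted squared distance used in GRLVQ-style learning. The function names, the kernel width `sigma`, and the relevance vector `lam` are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def informational_energy(samples, sigma=1.0):
    """Parzen-window estimate of Onicescu's informational energy
    E[f] = integral of f(x)^2 dx for the density underlying `samples`.

    With Gaussian kernels of width `sigma`, the plug-in estimate reduces to
    an average of pairwise Gaussians whose variance is 2*sigma^2, so no
    numerical integration is needed (illustrative sketch, not the paper's
    exact estimator).
    """
    X = np.asarray(samples, dtype=float)
    if X.ndim == 1:
        X = X[:, None]                              # treat as N samples in 1-D
    n, d = X.shape
    diff = X[:, None, :] - X[None, :, :]            # pairwise differences (N, N, d)
    sq = np.sum(diff ** 2, axis=-1)                 # squared Euclidean distances
    var = 2.0 * sigma ** 2                          # variance of the convolved kernel
    norm = (2.0 * np.pi * var) ** (-d / 2.0)
    return norm * np.mean(np.exp(-sq / (2.0 * var)))

def weighted_distance(x, w, lam):
    """Relevance-weighted squared distance of GRLVQ-style algorithms:
    d_lambda(x, w) = sum_k lam_k * (x_k - w_k)^2, with lam_k >= 0, sum(lam) = 1."""
    return float(np.dot(lam, (np.asarray(x) - np.asarray(w)) ** 2))

# Hypothetical usage on random data:
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
print(informational_energy(X, sigma=0.5))
```

Because the informational energy is a quadratic functional of the density, the estimator reduces to a simple double sum, and the resulting gradient stays linear, which is the computational advantage noted in the abstract.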