Auto-SOM: Recursive Parameter Estimation for Guidance of Self-Organizing Feature Maps

An important technique for exploratory data analysis is to form a mapping from a high-dimensional data space to a low-dimensional representation space such that neighborhoods are preserved. A popular method for achieving this is Kohonen's self-organizing map (SOM) algorithm. In its original form, however, the algorithm requires the user to choose the values of several learning parameters heuristically to achieve good performance. Here we present the Auto-SOM, an algorithm that estimates the learning parameters automatically during SOM training. Auto-SOM makes it possible to limit neighborhood violations to a user-defined degree in either mapping direction. It consists of a Kalman filter implementation of the SOM coupled with a recursive parameter estimation method. The Kalman filter trains the neurons' weights with estimated learning coefficients so as to minimize the variance of the estimation error. The recursive parameter estimation method estimates the width of the neighborhood function by minimizing the prediction error variance of the Kalman filter. In addition, the topographic function is incorporated to measure neighborhood violations and to prevent the map from converging to configurations that violate neighborhood relations. We demonstrate that neighborhoods can be preserved in both mapping directions, as desired for dimension-reducing applications. The development of neighborhood-preserving maps and their convergence behavior is illustrated by three examples covering the basic applications of self-organizing feature maps.
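To make the setup concrete, the sketch below (Python with NumPy) shows a plain Kohonen SOM training loop in which the learning coefficient and neighborhood width are reduced only while the map's quantization error keeps decreasing and a simple topographic-error check stays below a user-defined tolerance. This is only an illustrative stand-in for the scheme described above: the actual Auto-SOM replaces the scalar learning rate with per-neuron Kalman gains and estimates the neighborhood width by minimizing the Kalman filter's prediction error variance. The function names (train_som, topographic_error), the Gaussian neighborhood, and all numeric constants here are assumptions made for the example.

```python
import numpy as np

def topographic_error(weights, grid, data):
    """Fraction of samples whose best and second-best matching units are not
    grid neighbors -- a simple proxy for the topographic function."""
    errors = 0
    for x in data:
        d = np.linalg.norm(weights - x, axis=1)
        b1, b2 = np.argsort(d)[:2]
        if np.linalg.norm(grid[b1] - grid[b2]) > 1.0 + 1e-9:
            errors += 1
    return errors / len(data)

def train_som(data, grid_shape=(8, 8), epochs=20, tol=0.05, seed=0):
    rng = np.random.default_rng(seed)
    rows, cols = grid_shape
    # Grid coordinates of the neurons in the low-dimensional map space.
    grid = np.array([(i, j) for i in range(rows) for j in range(cols)], dtype=float)
    n_units = rows * cols
    weights = rng.uniform(data.min(0), data.max(0), size=(n_units, data.shape[1]))

    sigma = max(rows, cols) / 2.0   # initial neighborhood width (assumed value)
    eta = 0.5                       # initial learning coefficient (assumed value)
    prev_q_error = np.inf

    for _ in range(epochs):
        for x in rng.permutation(data):
            # Best matching unit in data space.
            bmu = np.argmin(np.linalg.norm(weights - x, axis=1))
            # Gaussian neighborhood function evaluated in map space.
            h = np.exp(-np.linalg.norm(grid - grid[bmu], axis=1) ** 2 / (2 * sigma ** 2))
            # Standard Kohonen update; Auto-SOM would use per-neuron Kalman
            # gains in place of the scalar eta.
            weights += eta * h[:, None] * (x - weights)

        # Crude stand-in for the recursive width estimation: shrink sigma and
        # eta only while the quantization error still decreases and the
        # topographic error stays within the user-defined tolerance.
        q_error = np.mean([np.min(np.linalg.norm(weights - x, axis=1)) for x in data])
        if q_error < prev_q_error and topographic_error(weights, grid, data) < tol:
            sigma = max(0.5, sigma * 0.9)
            eta = max(0.01, eta * 0.9)
        prev_q_error = q_error

    return weights, grid

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    data = rng.uniform(0, 1, size=(500, 2))   # toy 2-D data on the unit square
    weights, grid = train_som(data)
    print("trained", len(weights), "units")
```

Gating the schedule on the topographic error mirrors the role the topographic function plays in the abstract, namely keeping the map from settling into configurations with neighborhood violations, although the full algorithm derives its schedule from the Kalman filter's error statistics rather than from a fixed decay factor.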
