We report on two new improvements to the “Parameterised Self-Organizing Map” (PSOM). Both achieve a significant increase in mapping accuracy and computational efficiency. As the number of training points grows, the use of higher-order polynomials to construct the PSOM “mapping manifold” in [7] can suffer from an increasing tendency to oscillate between the support points. We propose here to confine the algorithm to a subset of the training knots, resulting in what we call the “local-PSOM” algorithm. This allows us to avoid high-degree polynomials without sacrificing accuracy. At the same time, the new approach offers a significant saving in required computations. A second way to improve the mapping precision makes use of the superior approximation properties of Chebyshev polynomials for the PSOM mapping manifold. The benefits of the two new approaches are demonstrated on two benchmark problems: (i) approximating a Gaussian bell function and (ii) learning the (forward and inverse) kinematics of a 3-DOF robot finger. In both cases the PSOM algorithm exhibits excellent generalisation ability, already for a very small training set of 3×3×3 points.
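To make the oscillation problem and the two remedies concrete, the following is a small self-contained Python sketch of our own (an illustration, not the paper's PSOM implementation): it interpolates the one-dimensional Gaussian bell benchmark with (a) one global high-degree polynomial on equidistant knots, (b) a local low-degree polynomial built from only the few nearest knots, in the spirit of the local-PSOM idea, and (c) a global polynomial on Chebyshev-spaced knots. All names (`lagrange_eval`, `local_interp`) and the neighbourhood size m = 4 are illustrative assumptions, not quantities from the paper.

```python
import numpy as np

def gaussian(x):
    # 1-D Gaussian bell function, standing in for benchmark (i)
    return np.exp(-x ** 2)

def lagrange_eval(knots, values, x):
    """Evaluate the interpolation polynomial through (knots, values)
    at the points x, using the Lagrange basis."""
    x = np.atleast_1d(np.asarray(x, dtype=float))
    result = np.zeros_like(x)
    for i, xi in enumerate(knots):
        basis = np.ones_like(x)
        for j, xj in enumerate(knots):
            if j != i:
                basis *= (x - xj) / (xi - xj)
        result += values[i] * basis
    return result

def local_interp(knots, values, x, m=4):
    """Local variant: each query point uses only its m nearest knots,
    so the polynomial degree stays at m - 1 regardless of how many
    training points there are."""
    x = np.atleast_1d(np.asarray(x, dtype=float))
    out = np.empty_like(x)
    for k, xq in enumerate(x):
        idx = np.argsort(np.abs(knots - xq))[:m]   # m nearest knots
        out[k] = lagrange_eval(knots[idx], values[idx], xq)[0]
    return out

n = 11
x_test = np.linspace(-4.0, 4.0, 801)
equi = np.linspace(-4.0, 4.0, n)                               # equidistant knots
cheb = 4.0 * np.cos(np.pi * (2 * np.arange(n) + 1) / (2 * n))  # Chebyshev knots

for name, approx in [
    ("global polynomial, equidistant knots", lagrange_eval(equi, gaussian(equi), x_test)),
    ("local polynomial (m=4), equidistant", local_interp(equi, gaussian(equi), x_test)),
    ("global polynomial, Chebyshev knots", lagrange_eval(cheb, gaussian(cheb), x_test)),
]:
    err = np.max(np.abs(approx - gaussian(x_test)))
    print(f"{name:40s} max |error| = {err:.3f}")
```

In this sketch the global equidistant interpolant develops large Runge-type overshoots between the outer support points, while both the local and the Chebyshev variants keep the maximum error far smaller; these are the same two effects that motivate the local-PSOM and Chebyshev constructions above.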
[1] H. Ritter. Investment Learning with Hierarchical PSOM, 1995.
[2] Helge J. Ritter et al. Rapid learning with parametrized self-organizing maps. Neurocomputing, 1996.
[3] Teuvo Kohonen et al. Self-Organization and Associative Memory, 1988.
[4] C. Gielen et al. Neural computation and self-organizing maps, an introduction, 1993.
[5] Helge Ritter et al. Parametrized Self-Organizing Maps, 1993.
[6] Helge Ritter. Parametrized Self-Organizing Maps for Vision Learning Tasks, 1994.