Evolutionary Optimization of Echo State Networks: Multiple Motor Pattern Learning

Echo State Networks are a special class of recurrent neural networks that are well suited for attractor-based learning of motor patterns. Using structural multiobjective optimization, the tradeoff between network size and accuracy can be identified. This makes it possible to choose a feasible model capacity for a follow-up full weight optimization. Both optimization steps can be combined into a nested, hierarchical optimization procedure. It is shown to produce small and efficient networks that are capable of storing multiple motor patterns in a single net. The smaller networks in particular can interpolate between learned patterns using bifurcation inputs.
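
Since the abstract only summarizes the approach, a minimal sketch of the underlying echo state network technique may help: a fixed random reservoir is scaled to a spectral radius below one, driven by an input signal, and only a linear readout is trained (here with ridge regression). All names, the teacher pattern, and the hyper-parameters below are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

# Minimal echo state network sketch (illustrative assumptions only;
# not the configuration used in the paper).
rng = np.random.default_rng(0)

n_in, n_res = 1, 100                 # reservoir size is a free design parameter
spectral_radius = 0.9                # < 1 helps satisfy the echo state property

W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= spectral_radius / max(abs(np.linalg.eigvals(W)))  # rescale recurrent weights

def run_reservoir(u_seq):
    """Drive the fixed reservoir with an input sequence and collect states."""
    x = np.zeros(n_res)
    states = []
    for u in u_seq:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u))
        states.append(x.copy())
    return np.array(states)

# Teacher signal: a simple periodic, motor-like pattern (a stand-in for the
# multiple motor patterns discussed in the abstract).
t = np.arange(1000)
u_seq = np.sin(2 * np.pi * t / 50)
y_target = np.sin(2 * np.pi * (t + 1) / 50)   # predict one step ahead

X = run_reservoir(u_seq)
washout = 100                                  # discard the initial transient
X, Y = X[washout:], y_target[washout:]

# Only the linear readout is trained, via ridge regression.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ Y)
print("training MSE:", np.mean((X @ W_out - Y) ** 2))
```

In the procedure described above, an outer evolutionary search would vary structural choices such as the reservoir size, while an inner optimization adapts the weights; the sketch shows only the basic ESN training step that such a search would build on.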
