Two-layer dynamic neural field learning law based on controlled Lyapunov functions

The aim of this study was to develop a dynamic neural field (DNF) model that captures the essential non-linear characteristics of neural activity along several millimeters of visual cortex in response to local flashed stimuli. A two-layer DNF model was proposed to describe the responses of both excitatory and inhibitory layers of neurons. This particular interconnection structure was analyzed as a coupled system of non-linear integro-differential equations, a representation that transformed the usual distributed form of the DNF into an interconnected non-linear model. A non-parametric modeling strategy was used to design the adjustment laws for the DNF weights. The weight-adjustment algorithm took into account the self-interconnections of each layer as well as the external stimulus. The concept of a controlled Lyapunov function served as the main tool for designing a stable learning method for the DNF. This algorithm was implemented in a class of hybrid computational model that served to reproduce the physiological response associated with external visual stimuli. The DNF model designed in this study can also represent only the excitatory response of specific neural circuits, without requiring the presence of an inhibitory response, a condition that extends the number of electrophysiological trials on which the adjusted DNF model can be evaluated. The learning method was evaluated with data from a selective visual attention experiment in which the external stimuli appeared briefly in any of five squares arrayed horizontally above a central fixation cross. The degree of correlation (above 0.95) between the signals measured at the brain cortex and the response of the DNF justified the application of the method proposed in this study.
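The abstract does not reproduce the model equations or the learning law itself. For orientation, a two-layer excitatory/inhibitory field of the kind described here is typically written in the Amari/Wilson-Cowan form

\[ \tau_e \frac{\partial u_e(x,t)}{\partial t} = -u_e(x,t) + \int w_{ee}(x-y)\, f\big(u_e(y,t)\big)\, dy - \int w_{ei}(x-y)\, f\big(u_i(y,t)\big)\, dy + s(x,t), \]
\[ \tau_i \frac{\partial u_i(x,t)}{\partial t} = -u_i(x,t) + \int w_{ie}(x-y)\, f\big(u_e(y,t)\big)\, dy, \]

where u_e and u_i are the excitatory and inhibitory activations, f is a saturating firing-rate non-linearity, the kernels w_{ee}, w_{ei}, w_{ie} are the weights adjusted by the learning law, and s(x,t) is the external (flashed) stimulus. The Python sketch below is a minimal, hypothetical discretisation of this generic form; the correlation-driven gain update inside the loop is only a placeholder for the controlled-Lyapunov learning law developed in the paper, and all parameter values are illustrative assumptions.

    import numpy as np

    # Hypothetical discretisation of a generic two-layer (excitatory/inhibitory)
    # dynamic neural field; parameter values and the gain-update rule are
    # illustrative placeholders, not the learning law proposed in the paper.

    def sigmoid(u, beta=4.0):
        """Saturating firing-rate non-linearity f(u)."""
        return 1.0 / (1.0 + np.exp(-beta * u))

    def gaussian_kernel(n, dx, sigma, amplitude):
        """Lateral interaction kernel sampled on the spatial grid."""
        x = (np.arange(n) - n // 2) * dx
        return amplitude * np.exp(-x**2 / (2.0 * sigma**2))

    def simulate(stimulus, measured=None, dt=1.0, tau_e=10.0, tau_i=20.0,
                 dx=0.1, lr=1e-4):
        """Euler integration of the two-layer field driven by an external stimulus.

        stimulus : (n_steps, n) external input to the excitatory layer.
        measured : optional (n_steps, n) cortical trace used by the placeholder
                   gain update; None disables any adaptation.
        Returns the excitatory activity over time as an (n_steps, n) array.
        """
        n_steps, n = stimulus.shape
        u_e = np.zeros(n)                                         # excitatory potential
        u_i = np.zeros(n)                                         # inhibitory potential
        w_ee = gaussian_kernel(n, dx, sigma=0.5, amplitude=1.0)   # E -> E kernel
        w_ei = gaussian_kernel(n, dx, sigma=1.0, amplitude=0.9)   # I -> E kernel
        w_ie = gaussian_kernel(n, dx, sigma=1.0, amplitude=0.8)   # E -> I kernel
        g_ee = 1.0                                                # adaptable E -> E gain
        history = np.zeros((n_steps, n))

        for t in range(n_steps):
            f_e, f_i = sigmoid(u_e), sigmoid(u_i)
            # Spatial convolutions approximate the integral interaction terms.
            exc_drive = dx * np.convolve(f_e, w_ee, mode="same")
            inh_drive = dx * np.convolve(f_i, w_ei, mode="same")
            cross_drive = dx * np.convolve(f_e, w_ie, mode="same")
            u_e += dt * (-u_e + g_ee * exc_drive - inh_drive + stimulus[t]) / tau_e
            u_i += dt * (-u_i + cross_drive) / tau_i
            if measured is not None:
                # Placeholder for the controlled-Lyapunov learning law: nudge the
                # excitatory gain so the field tracks the measured cortical trace.
                err = measured[t] - u_e
                g_ee += lr * float(np.dot(err, exc_drive))
            history[t] = u_e
        return history

A brief flashed stimulus of the kind used in the reported experiment could then be simulated, for instance, as

    n, n_steps = 101, 500
    x = np.linspace(-5.0, 5.0, n)
    stim = np.zeros((n_steps, n))
    stim[50:80] = 2.0 * np.exp(-x**2 / 0.5)   # Gaussian bump flashed for 30 time steps
    activity = simulate(stim)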
