Tracking the states of a nonlinear and nonstationary system in the weight-space of artificial neural networks

We propose a novel interpretation and use of neural networks (NNs) for modeling physiological signals, which are allowed to be nonlinear and/or nonstationary. The method consists of training an NN for k-step prediction of a physiological signal and then examining the connection-weight space (CWS) of the NN to extract information about the signal-generating mechanism. We define a novel feature, the Normalized Vector Separation (γij), to measure the separation of two arbitrary states "i" and "j" in the CWS, and we use it to track state changes in the generating system. The performance of the method is examined on synthetic signals and clinical EEG. Synthetic data indicate that γij can track the system down to an SNR of 3.5 dB. Clinical data from three patients undergoing carotid endarterectomy showed that EEG could be modeled (within a root-mean-squared error of 0.01) by the proposed method, and that the blood-perfusion state of the brain could be monitored via γij using small NNs with no more than 21 connection weights altogether.
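The abstract does not reproduce the exact formula for γij, so the following is only a plausible sketch of a normalized separation measure between two connection-weight vectors. The function name and the particular normalization (Euclidean distance divided by the summed vector magnitudes, which bounds the result in [0, 1]) are assumptions for illustration, not the paper's definition.

```python
import numpy as np

def normalized_vector_separation(w_i, w_j):
    """Hypothetical sketch of gamma_ij: a scale-free separation between
    the connection-weight vectors of two network states "i" and "j".

    0 means the two weight states coincide; values near 1 indicate the
    states point in nearly opposite directions in the weight space.
    """
    w_i = np.asarray(w_i, dtype=float).ravel()
    w_j = np.asarray(w_j, dtype=float).ravel()
    # Euclidean separation, normalized by the summed magnitudes so the
    # measure is invariant to a common rescaling of both weight vectors.
    return np.linalg.norm(w_i - w_j) / (np.linalg.norm(w_i) + np.linalg.norm(w_j))
```

Under this assumed form, tracking the generating system would amount to retraining (or continuing to train) the predictor on successive signal segments and evaluating the separation of each segment's weight vector from a reference state.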
