A Novel Tool for Sequential Fusion of Nonlinear Features: A Sleep Psychology Application

A framework for automated scoring of sleep stages during afternoon naps of healthy humans is introduced. This is achieved by sequential fusion of nonlinear features extracted from three physiological channels: the electroencephalogram (EEG), the electrooculogram (EOG), and the respiratory trace (RES). These features are generated by means of the recently introduced "delay vector variance" (DVV) method, which examines the local predictability of a signal in phase space. The analysis is accompanied by a set of comprehensive simulations supporting the approach.
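
To illustrate the DVV feature-extraction step described above, the following minimal Python sketch computes a DVV curve (normalized target variance as a function of standardized neighbourhood distance) for a single signal epoch. This is a sketch under stated assumptions, not the authors' implementation: the embedding dimension, span parameter, and minimum set size are illustrative defaults, and the names (dvv, embedding_dim, n_d, n_spans, min_set_size) are hypothetical.

    # Minimal DVV sketch: embed the signal into delay vectors, then measure how
    # the variance of one-step-ahead targets shrinks within neighbourhoods of
    # increasing radius. Assumed, illustrative parameter choices throughout.
    import numpy as np

    def dvv(x, embedding_dim=3, n_d=2.0, n_spans=25, min_set_size=30):
        """Return (standardized distances, normalized target variances)."""
        x = np.asarray(x, dtype=float)
        n = len(x)
        # Delay vectors x(k) = [x_{k-m}, ..., x_{k-1}] and their targets x_k.
        dvs = np.array([x[k - embedding_dim:k] for k in range(embedding_dim, n)])
        targets = x[embedding_dim:]

        # Pairwise Euclidean distances between all delay vectors.
        diffs = dvs[:, None, :] - dvs[None, :, :]
        dist = np.sqrt((diffs ** 2).sum(axis=-1))
        iu = np.triu_indices(len(dvs), k=1)
        mu_d, sigma_d = dist[iu].mean(), dist[iu].std()

        # Sweep the neighbourhood radius over mu_d +/- n_d * sigma_d.
        radii = np.linspace(max(mu_d - n_d * sigma_d, 0.0),
                            mu_d + n_d * sigma_d, n_spans)
        sigma_x2 = targets.var()
        target_var = np.full(n_spans, np.nan)
        for i, r in enumerate(radii):
            set_vars = []
            for k in range(len(dvs)):
                mask = dist[k] <= r
                mask[k] = False                 # exclude the reference DV itself
                if mask.sum() >= min_set_size:  # only sufficiently populated sets
                    set_vars.append(targets[mask].var())
            if set_vars:
                target_var[i] = np.mean(set_vars) / sigma_x2
        return (radii - mu_d) / sigma_d, target_var

    # Usage on a toy AR(1) signal; in the application these would be EEG/EOG/RES epochs.
    rng = np.random.default_rng(0)
    sig = np.zeros(1000)
    for t in range(1, 1000):
        sig[t] = 0.8 * sig[t - 1] + rng.standard_normal()
    std_dist, tv = dvv(sig)
    print(np.round(tv, 3))

The resulting DVV curve (and its comparison against surrogate data) yields the nonlinear features that are then fused across the three channels.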
