A Wearable Multi-Modal Bio-Sensing System Towards Real-World Applications

Multi-modal bio-sensing has recently been used as an effective research tool in affective computing, autism, clinical disorders, and virtual reality, among other areas. However, none of the existing bio-sensing systems supports multiple modalities in a wearable form factor outside well-controlled laboratory environments while providing research-grade measurements. This paper attempts to bridge this gap by developing a wearable multi-modal bio-sensing system capable of collecting, synchronizing, recording, and transmitting data from multiple bio-sensors (PPG, EEG, eye-gaze headset, body motion capture, GSR, etc.) while also providing task-modulation features, including visual-stimulus tagging. This study describes the development and integration of the various components of our system. We evaluate the developed sensors by comparing their measurements with those obtained by standard research-grade bio-sensors. We first evaluate the individual sensor modalities of our headset, namely the earlobe-based PPG module with motion-noise canceling, by comparing its heart-rate estimates with those derived from ECG. We also compare the steady-state visually evoked potentials measured by our shielded dry EEG sensors with the potentials obtained by commercially available dry EEG sensors. We further investigate the effect of head movements on the accuracy and precision of our wearable eye-gaze system. Finally, we carry out two practical tasks to demonstrate how multiple sensor modalities can be used to explore previously unanswerable questions in bio-sensing. Specifically, using bio-sensing we show which strategy works best for playing the “Where is Waldo?” visual-search game, how EEG responses differ between true and false target fixations in this game, and how loss/draw/win outcomes in a “Rock-Paper-Scissors” game can be predicted from bio-sensing modalities, while also identifying the limitations of these modalities.
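The PPG-versus-ECG heart-rate comparison mentioned above can be illustrated with a minimal sketch. This is not the authors' pipeline: the file names (ppg.npy, ecg.npy), the 250 Hz sampling rate, the 10-second analysis windows, and the peak-detection settings are all assumptions for illustration, and the agreement statistic shown is a generic Bland-Altman-style bias with 95% limits rather than the paper's evaluation procedure.

```python
# Minimal sketch (not the authors' pipeline): compare heart rate estimated
# from an earlobe PPG channel against a reference ECG channel.
# Assumes both signals are pre-filtered 1-D arrays sampled at `fs` Hz;
# peak-detection thresholds below are illustrative, not the paper's settings.
import numpy as np
from scipy.signal import find_peaks

def heart_rate_bpm(signal, fs, min_rr_s=0.4):
    """Estimate mean heart rate (beats per minute) from beat-to-beat peaks."""
    # Enforce a refractory period so one beat is not counted twice.
    peaks, _ = find_peaks(signal, distance=int(min_rr_s * fs),
                          prominence=np.std(signal))
    rr_intervals = np.diff(peaks) / fs          # seconds between successive beats
    return 60.0 / np.mean(rr_intervals)

def bland_altman(a, b):
    """Agreement between two per-window HR series: bias and 95% limits."""
    diff = np.asarray(a) - np.asarray(b)
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)
    return bias, (bias - loa, bias + loa)

# Example: window both recordings and compare per-window heart rates.
fs = 250                                            # assumed sampling rate (Hz)
win = 10 * fs                                       # 10-second analysis windows
ppg, ecg = np.load("ppg.npy"), np.load("ecg.npy")   # hypothetical recordings
hr_ppg = [heart_rate_bpm(ppg[i:i + win], fs) for i in range(0, len(ppg) - win, win)]
hr_ecg = [heart_rate_bpm(ecg[i:i + win], fs) for i in range(0, len(ecg) - win, win)]
print("bias (bpm), 95% limits of agreement:", bland_altman(hr_ppg, hr_ecg))
```

The sketch only shows the general comparison procedure; in the paper the developed sensors are evaluated against research-grade bio-sensors, and the motion-noise canceling happens upstream of any such heart-rate computation.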
