Design, Implementation and Evaluation of a Multimodal Sensor System Integrated Into an Airplane Seat

Air travel has become the preferred mode of long-distance transportation for most of the world's travelers. People of every age group and health status travel by airplane, and thus the airplane has become part of our environment, in which passengers could benefit from assistive support. In this regard, the European research project SEAT has investigated sensor technologies to provide assistive support related to the health and well-being of airplane passengers. Since the main interaction point between a passenger and the airplane is the seat, a seat-integrated sensor system was developed to measure health- and affect-related signals of a passenger. The measured signals include the electrocardiogram (ECG), electrodermal activity (EDA), skin temperature, respiration, and movement of the passenger.

In this chapter we describe the design, implementation, and evaluation of the seat-integrated sensor system. In particular, we highlight two sensor fusion approaches: one to appraise the signal quality in an airplane scenario and one to identify the passenger's affective state.

In the first part, we show how the design of the seat-integrated sensor system is influenced by the trade-off between sensor comfort and signal quality: to achieve acceptance, and hence use, of the system, the sensors need to be attached in a comfortable and unobtrusive way or even be fully integrated into the seat. On the other hand, a comfort-optimized sensor placement usually limits the signal quality. We argue that not only the development of comfortable and reliable sensor technology but also the quality appraisal of the data generated by the sensors needs to be addressed. Artifact detection through sensor fusion is presented, and its working principle is demonstrated in a feasibility study in which normal passenger activities were performed. Based on the presented method, we are able to identify signal regions in which the accuracy of heart-rate detection is 88%, compared to 40% without any artifact removal [Schumm et al., 2010].

In the second part, we explain another sensor fusion approach in the context of emotion recognition. Previous work on emotion recognition from physiology has rarely addressed the problem of missing data. However, data loss due to artifacts is a frequent phenomenon in practical applications, and discarding a whole data instance when only a part of it is corrupted results in a substantial loss of data. To address this problem, we investigated two methods.
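To make the artifact-detection idea concrete, the following is a minimal sketch of movement-gated heart-rate estimation: a seat-mounted accelerometer flags analysis windows with strong passenger movement, and the heart rate is computed only in the remaining ECG regions. All function names, thresholds, and sampling rates here are illustrative assumptions, not the exact system described in the chapter.

```python
import numpy as np
from scipy.signal import find_peaks

FS_ECG = 256   # assumed ECG sampling rate in Hz
FS_ACC = 50    # assumed accelerometer sampling rate in Hz
WINDOW_S = 5   # analysis window length in seconds

def movement_flags(acc, fs=FS_ACC, window_s=WINDOW_S, threshold=0.15):
    """Flag a window as artifact-contaminated when the standard deviation
    of the acceleration magnitude exceeds a threshold (in g, assumed)."""
    mag = np.linalg.norm(acc, axis=1)            # combine x/y/z axes
    n = int(fs * window_s)
    windows = mag[: len(mag) // n * n].reshape(-1, n)
    return windows.std(axis=1) > threshold       # True = movement artifact

def heart_rate_clean_windows(ecg, artifact, fs=FS_ECG, window_s=WINDOW_S):
    """Estimate heart rate per window, skipping artifact-flagged windows.
    `artifact` holds one boolean per window, aligned with the ECG windows."""
    n = int(fs * window_s)
    rates = []
    for i, corrupted in enumerate(artifact):
        if corrupted:
            rates.append(np.nan)                 # discard unreliable region
            continue
        seg = ecg[i * n:(i + 1) * n]
        # Crude R-peak detection: peaks above 60% of the segment maximum,
        # at least 0.3 s apart (a simple stand-in for a real QRS detector).
        peaks, _ = find_peaks(seg, height=0.6 * seg.max(), distance=fs * 0.3)
        if len(peaks) > 1:
            rr = np.diff(peaks) / fs             # R-R intervals in seconds
            rates.append(60.0 / rr.mean())       # beats per minute
        else:
            rates.append(np.nan)
    return np.array(rates)
```

The design choice this illustrates is the one argued for in the text: instead of trying to clean corrupted ECG, a second modality (movement) is fused in to decide where the ECG can be trusted at all.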
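For the missing-data problem, one plausible realization, sketched here under assumed interfaces rather than as the chapter's exact method, is an ensemble with one classifier per physiological modality. At prediction time, only the classifiers whose modality is artifact-free contribute, so a corrupted channel does not force discarding the whole instance.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

class ModalityEnsemble:
    """One classifier per modality (e.g. 'ecg', 'eda', 'respiration');
    predictions average the class probabilities of the modalities that
    are actually available for a given instance."""

    def __init__(self, modalities):
        self.clfs = {m: RandomForestClassifier(n_estimators=100)
                     for m in modalities}

    def fit(self, features, labels):
        # features: dict mapping modality name -> (n_samples, n_features) array
        for m, clf in self.clfs.items():
            clf.fit(features[m], labels)
        return self

    def predict(self, features, available):
        # Fuse only the artifact-free modalities listed in `available`.
        probs = [self.clfs[m].predict_proba(features[m]) for m in available]
        avg = np.mean(probs, axis=0)
        classes = self.clfs[next(iter(available))].classes_
        return classes[avg.argmax(axis=1)]

# Example usage (hypothetical feature arrays):
# ens = ModalityEnsemble(["ecg", "eda", "respiration"]).fit(train_feats, y)
# y_hat = ens.predict(test_feats, available=["eda", "respiration"])  # ECG corrupted
```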
