Towards in situ affect detection in mobile devices: a multimodal approach

Most research on multimodal affect detection has been conducted in laboratory environments; little work has addressed in situ affect detection. In this paper, we investigate affect detection in natural environments using the sensors available in smartphones. We classify a person's affective state from two modalities: facial expression, captured with the camera, and energy expenditure, estimated from continuously sampled fine-grained accelerometer data, and we measure the performance of the resulting system. We deployed the system in a natural environment and paid special attention to annotating the training data in order to validate the 'ground truth'. We found a notable correlation between facial expression and energy expenditure, consistent with Russell's two-dimensional model of emotion in valence-arousal space. This paper presents our initial findings in multimodal affect detection.
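The approach described above can be sketched in code. The following is a minimal illustration, not the paper's actual implementation: it assumes accelerometer samples in m/s², uses the mean gravity-removed magnitude as a simple proxy for energy expenditure (arousal), takes a valence sign from a facial-expression classifier as given, and fuses the two at decision level into a quadrant of Russell's circumplex. The function names and the threshold are hypothetical.

```python
import math

def movement_energy(samples, g=9.81):
    """Estimate movement intensity from triaxial accelerometer samples.

    samples: list of (ax, ay, az) tuples in m/s^2. Returns the mean
    magnitude of the dynamic component after subtracting gravity --
    a common, simple proxy for energy expenditure.
    """
    if not samples:
        return 0.0
    mags = [abs(math.sqrt(ax * ax + ay * ay + az * az) - g)
            for ax, ay, az in samples]
    return sum(mags) / len(mags)

def fuse_affect(face_valence, energy, arousal_threshold=1.0):
    """Decision-level fusion of the two modalities.

    face_valence: +1 (positive expression) or -1 (negative), assumed
    to come from a facial-expression classifier. energy: output of
    movement_energy(), thresholded into high/low arousal. The pair is
    mapped to a quadrant label in valence-arousal space.
    """
    arousal = "high" if energy >= arousal_threshold else "low"
    quadrants = {
        (+1, "high"): "excited",     # positive valence, high arousal
        (+1, "low"):  "calm",        # positive valence, low arousal
        (-1, "high"): "distressed",  # negative valence, high arousal
        (-1, "low"):  "depressed",   # negative valence, low arousal
    }
    return quadrants[(face_valence, arousal)]
```

A device lying still reads roughly (0, 0, 9.81), so its movement energy is near zero and any valence estimate is fused with low arousal; vigorous motion pushes the energy above the threshold and selects the high-arousal quadrants instead.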

[1]  Jennifer Healey,et al.  Recording Affect in the Field: Towards Methods and Metrics for Improving Ground Truth Labels , 2011, ACII.

[2]  E. Vesterinen,et al.  Affective Computing , 2009, Encyclopedia of Biometrics.

[3]  Yuichi Fujiki,et al.  iPhone as a physical activity measurement platform , 2010, CHI Extended Abstracts.

[4]  M. Bradley,et al.  The pupil as a measure of emotional arousal and autonomic activation. , 2008, Psychophysiology.

[5]  Nicu Sebe,et al.  Multimodal Human Computer Interaction: A Survey , 2005, ICCV-HCI.

[6]  Johannes Wagner,et al.  From Physiological Signals to Emotions: Implementing and Comparing Selected Methods for Feature Extraction and Classification , 2005, 2005 IEEE International Conference on Multimedia and Expo.

[7]  J. Russell  A circumplex model of affect. , 1980, Journal of Personality and Social Psychology.

[8]  Bernard Rimé,et al.  Long-lasting Cognitive and Social Consequences of Emotion: Social Sharing and Rumination , 1992 .

[9]  Rafael A. Calvo,et al.  Affect Detection: An Interdisciplinary Review of Models, Methods, and Their Applications , 2010, IEEE Transactions on Affective Computing.

[11]  Takeo Kanade,et al.  Facial Expression Analysis , 2011, AMFG.

[12]  Takeo Kanade,et al.  Comprehensive database for facial expression analysis , 2000, Proceedings Fourth IEEE International Conference on Automatic Face and Gesture Recognition (Cat. No. PR00580).

[13]  Flemming Hansen,et al.  Emotions, Advertising and Consumer Choice , 2007 .

[14]  Vladimir Pavlovic,et al.  Toward multimodal human-computer interface , 1998, Proc. IEEE.

[15]  Thomas Serre,et al.  Error weighted classifier combination for multi-modal human identification , 2004 .

[16]  Tanzeem Choudhury,et al.  Passive and In-Situ assessment of mental and physical well-being using mobile sensors , 2011, UbiComp '11.

[17]  Hatice Gunes,et al.  Continuous Prediction of Spontaneous Affect from Multiple Cues and Modalities in Valence-Arousal Space , 2011, IEEE Transactions on Affective Computing.

[18]  Rosalind W. Picard Affective computing: challenges , 2003, Int. J. Hum. Comput. Stud..

[19]  Fernando De la Torre,et al.  Facial Expression Analysis , 2011, Visual Analysis of Humans.

[20]  Angeliki Metallinou,et al.  Decision level combination of multiple modalities for recognition and analysis of emotional expression , 2010, 2010 IEEE International Conference on Acoustics, Speech and Signal Processing.

[21]  Zhihong Zeng,et al.  A Survey of Affect Recognition Methods: Audio, Visual, and Spontaneous Expressions , 2007, IEEE Transactions on Pattern Analysis and Machine Intelligence.

[22]  Jennifer Healey,et al.  Out of the Lab and into the Fray: Towards Modeling Emotion in Everyday Life , 2010, Pervasive.

[24]  Ashish Kapoor,et al.  Multimodal affect recognition in learning environments , 2005, ACM Multimedia.

[25]  Konrad Paul Körding,et al.  Bayesian decision theory in sensorimotor control , 2006, Trends in Cognitive Sciences.

[26]  K. Prkachin,et al.  Eigenimage Based Pain Expression Recognition , 2007 .

[27]  Mark Weiser,et al.  Some computer science issues in ubiquitous computing , 1993, CACM.

[28]  P. Philippot,et al.  Social sharing of emotion : new evidence and new questions , 1998 .