On assisting a visual-facial affect recognition system with keyboard-stroke pattern information

Towards realizing a multimodal affect recognition system, we consider the advantages of assisting a visual-facial expression recognition system with keyboard-stroke pattern information. Our work is based on the assumption that the visual-facial and keyboard modalities are complementary to each other and that their combination can significantly improve the accuracy of affective user models. Specifically, we present and discuss the development and evaluation of two corresponding affect recognition subsystems, with emphasis on the recognition of six basic emotional states: happiness, sadness, surprise, anger, and disgust, as well as the emotionless state, which we refer to as neutral. We find that emotion recognition by the visual-facial modality can be aided greatly by keyboard-stroke pattern information, and that combining the two modalities leads to better results towards building a multimodal affect recognition system.
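The abstract does not specify how the two subsystems' outputs are combined; purely as an illustrative sketch, assuming each subsystem emits a confidence score for each of the six states, a weighted late-fusion combiner over the two modalities might look like the following (the function name, weights, and probability values are all hypothetical, not taken from the paper):

# Illustrative late-fusion sketch (hypothetical; the paper's actual
# combination scheme is not specified in this abstract).

EMOTIONS = ["happiness", "sadness", "surprise", "anger", "disgust", "neutral"]

def fuse(visual_probs, keyboard_probs, w_visual=0.6, w_keyboard=0.4):
    """Weighted late fusion of two per-emotion confidence vectors.

    visual_probs / keyboard_probs: dicts mapping each emotion label to a
    confidence in [0, 1] produced by the respective subsystem.
    The modality weights are hypothetical and would be tuned empirically.
    """
    fused = {e: w_visual * visual_probs[e] + w_keyboard * keyboard_probs[e]
             for e in EMOTIONS}
    total = sum(fused.values()) or 1.0          # renormalize to sum to 1
    fused = {e: p / total for e, p in fused.items()}
    return max(fused, key=fused.get), fused     # predicted label + scores

# Example: the keyboard modality resolves a visual tie between anger/sadness.
visual   = {"happiness": 0.05, "sadness": 0.35, "surprise": 0.05,
            "anger": 0.35, "disgust": 0.10, "neutral": 0.10}
keyboard = {"happiness": 0.05, "sadness": 0.10, "surprise": 0.05,
            "anger": 0.60, "disgust": 0.10, "neutral": 0.10}
label, scores = fuse(visual, keyboard)
print(label)  # -> "anger"

In a scheme of this kind, the keyboard modality can break ties that the visual-facial classifier alone cannot resolve, which is the kind of complementarity between the two modalities that the abstract argues for.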
