Multimodal object oriented user interfaces in mobile affective interaction

In this paper, we investigate an object-oriented (OO) architecture for multimodal emotion recognition in interactive applications running on mobile phones or handheld devices. Unlike desktop computers, mobile phones do not perform the emotion-recognition processing themselves; in our approach, they transmit all collected interaction data to a server, which then performs the recognition. The OO architecture we have created combines evidence from multiple modalities of interaction, namely the mobile device's keyboard and the mobile device's microphone, as well as data from emotion stereotypes, and classifies this evidence into well-structured objects with their own properties and methods. The resulting emotion detection server is capable of using and handling information transmitted from different mobile sources of multimodal data during human-computer interaction. As a test bed for affective mobile interaction, we have used an educational application incorporated into the mobile system.
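The client-server split and the object-oriented combination of modality evidence described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the class names (`ModalityEvidence`, `StereotypeProfile`, `EmotionDetectionServer`) and the simple additive fusion weighted by stereotype priors are assumptions introduced here for clarity.

```python
from dataclasses import dataclass

@dataclass
class ModalityEvidence:
    """Evidence collected on the mobile device from one interaction
    modality (e.g. keyboard strokes or microphone audio) and sent to
    the server as-is, since the phone does no recognition itself."""
    modality: str
    scores: dict  # hypothesized emotion -> likelihood in [0, 1]

@dataclass
class StereotypeProfile:
    """Prior weights over emotions drawn from a user stereotype."""
    weights: dict  # emotion -> multiplicative prior weight

class EmotionDetectionServer:
    """Server-side object that combines evidence transmitted from
    multiple mobile modalities with stereotype data."""

    def __init__(self, stereotype: StereotypeProfile):
        self.stereotype = stereotype

    def classify(self, evidences: list) -> str:
        # Sum per-emotion scores across all received modalities.
        combined = {}
        for ev in evidences:
            for emotion, score in ev.scores.items():
                combined[emotion] = combined.get(emotion, 0.0) + score
        # Weight the fused scores by the stereotype priors.
        for emotion in combined:
            combined[emotion] *= self.stereotype.weights.get(emotion, 1.0)
        # Return the most likely emotion.
        return max(combined, key=combined.get)

server = EmotionDetectionServer(StereotypeProfile({"anger": 1.2, "neutral": 1.0}))
keyboard = ModalityEvidence("keyboard", {"anger": 0.6, "neutral": 0.4})
microphone = ModalityEvidence("microphone", {"anger": 0.3, "neutral": 0.5})
result = server.classify([keyboard, microphone])  # "anger": (0.6+0.3)*1.2 > (0.4+0.5)*1.0
```

In this sketch, each modality and stereotype is its own object with its own properties, mirroring the OO classification of evidence described in the paper; the actual system combines the modalities with a more elaborate decision mechanism.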
