ABE: An Agent-Based Software Architecture for a Multimodal Emotion Recognition Framework

A computer's ability to recognize human emotional states from physiological signals is increasingly being used to build empathetic systems such as learning environments, health-care systems, and videogames. Despite this interest, there are few frameworks, libraries, architectures, or software tools that allow system developers to easily integrate emotion recognition into their software projects. The work reported here offers a first step toward filling this gap by addressing: (a) the modeling of an agent-driven, component-based architecture for multimodal emotion recognition, called ABE, and (b) the use of ABE to implement a multimodal emotion recognition framework that supports third-party systems in becoming empathetic systems.
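The abstract does not spell out ABE's interfaces, so the following is only a minimal sketch of the idea it names: agents wrapping individual sensing modalities, plus a fusion agent that combines their estimates into one emotion label. All class, method, and feature names here (`SensorAgent`, `FusionAgent`, `valence`, `arousal`) are hypothetical illustrations, not ABE's actual API.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Reading:
    """One observation from a single modality, as named feature scores in [0, 1]."""
    modality: str
    features: Dict[str, float]

class SensorAgent:
    """Wraps one input channel (e.g. facial expression, skin conductance, posture)."""
    def __init__(self, modality: str):
        self.modality = modality

    def sense(self, raw: Dict[str, float]) -> Reading:
        # A real agent would extract features from device data; here we pass them through.
        return Reading(self.modality, raw)

class FusionAgent:
    """Combines per-modality estimates into a single coarse emotion label."""
    @staticmethod
    def _avg(readings: List[Reading], key: str) -> float:
        # Average only over modalities that actually report this feature.
        vals = [r.features[key] for r in readings if key in r.features]
        return sum(vals) / len(vals) if vals else 0.0

    def fuse(self, readings: List[Reading]) -> str:
        valence = self._avg(readings, "valence")
        arousal = self._avg(readings, "arousal")
        if valence >= 0.5:
            return "engaged" if arousal >= 0.5 else "relaxed"
        return "frustrated" if arousal >= 0.5 else "bored"

# Usage: two sensor agents feed the fusion agent.
face = SensorAgent("face").sense({"valence": 0.8, "arousal": 0.6})
skin = SensorAgent("skin_conductance").sense({"arousal": 0.7})
print(FusionAgent().fuse([face, skin]))  # → engaged
```

The design choice this sketch mirrors is the one the abstract implies: each modality is an autonomous agent with a uniform interface, so a third-party system can add or remove channels without changing the fusion logic.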
