Advanced Human Affect Visualization

Affective computing research has traditionally focused on the automatic extraction of human emotions and on improving success rates in the emotion recognition task. However, there is a lack of automatic tools that intuitively visualize users' emotional information. This paper presents the development of a novel tool that allows contents, user emotions, and gaze to be visualized at a glance. The tool, called Emotracker, combines eye tracking and facial emotion recognition technologies and offers a wide range of visualization options, including emotional saccade maps and emotional heat maps. The simultaneous consideration of gaze and emotion opens the door to evaluating user engagement in a wide range of applications.
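As a rough illustration of the idea behind an emotional heat map (a minimal sketch only; the function name, data format, and valence-weighting scheme below are assumptions, not Emotracker's actual implementation), one could weight each gaze fixation by the valence of the emotion detected at the same instant and accumulate the result over the stimulus image:

```python
# Hypothetical sketch of an "emotional heat map": gaze fixations weighted
# by the valence of the facial expression detected at the same moment.
# Data structures here are illustrative assumptions, not Emotracker's API.
import numpy as np

def emotional_heatmap(fixations, valences, width, height, sigma=30.0):
    """fixations: list of (x, y) gaze points in image coordinates;
    valences: matching list of emotion valence scores in [-1, 1]
    (e.g. from a facial expression classifier). Returns a
    (height, width) map where positive regions attracted positive
    affect while looked at, and negative regions negative affect."""
    heat = np.zeros((height, width), dtype=np.float64)
    ys, xs = np.mgrid[0:height, 0:width]
    for (x, y), v in zip(fixations, valences):
        # Spread each fixation as a Gaussian blob scaled by its valence.
        heat += v * np.exp(-((xs - x) ** 2 + (ys - y) ** 2) / (2 * sigma ** 2))
    return heat

# Example: two fixations, one during a smile (+0.8), one during a frown (-0.6).
hmap = emotional_heatmap([(120, 80), (300, 200)], [0.8, -0.6], 400, 300)
```

Rendering such a map over the viewed content would show, at a glance, not only where users looked but how they felt while looking there, which is the combination the tool exploits for engagement evaluation.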
