A linear regression model to detect user emotion for touch input interactive systems

Human emotion plays a significant role in reasoning, learning, cognition, and decision making, which in turn can affect the usability of interactive systems. Detecting the emotion of interactive system users is therefore important, as it can inform designs that improve the user experience. In this work, we propose a model to detect the emotional state of users of touch screen devices. Although a number of methods have been developed to detect human emotion, they are computationally intensive and incur setup costs. The model we propose aims to avoid these limitations and make the detection process viable on mobile platforms. We assume three emotional states of a user: positive, negative, and neutral. The touch interaction is characterized by a set of seven features derived from finger strokes and taps. Our proposed model is a linear combination of these features. The model is developed and validated with empirical data from 57 participants performing 7 touch input tasks. The validation study demonstrates a high prediction accuracy of 90.47%.
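The core idea of the abstract, a linear combination of seven touch features mapped to three emotional states, can be sketched as follows. This is a minimal illustration, not the paper's actual model: the feature names, weights, and thresholds below are hypothetical placeholders, since the abstract does not specify them.

```python
def predict_emotion(features, weights, bias=0.0,
                    neg_threshold=-0.5, pos_threshold=0.5):
    """Score a touch-feature vector with a linear model and bucket
    the score into negative / neutral / positive.

    The thresholds are illustrative; the paper fits its model to
    empirical data rather than using fixed cut-offs like these.
    """
    score = bias + sum(w * x for w, x in zip(weights, features))
    if score < neg_threshold:
        return "negative"
    if score > pos_threshold:
        return "positive"
    return "neutral"

# Seven illustrative stroke/tap features (assumed normalized), e.g.
# stroke speed, stroke length, tap pressure, tap duration,
# inter-tap time, stroke curvature, touch area. These names are
# assumptions for the sketch, not the paper's feature set.
weights = [0.4, -0.2, 0.3, -0.1, 0.2, 0.1, -0.3]
sample = [0.8, 0.1, 0.9, 0.2, 0.7, 0.5, 0.1]

print(predict_emotion(sample, weights))  # → positive
```

In practice the weights would be estimated from labeled interaction data (the paper uses 57 participants across 7 tasks), and the thresholds tuned on a validation split.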
