A Novel Method to Build and Validate an Affective State Prediction Model from Touch-Typing

Affective systems aim to improve user satisfaction, and hence usability, by identifying and responding to the affective state of a user at the time of interaction. The first and most important challenge in building such systems is identifying the affective state in a systematic way, which is generally done with computational models; building such models requires affective data. Despite extensive growth in this research area, a number of challenges remain, both in affect-induction methods for collecting affective data and in building models for real-time prediction of affective states. In this article, we report a novel method for inducing particular affective states so that affective data can be collected unobtrusively, along with a minimalist model that predicts the affective state of a user from their typing pattern on a smartphone touchscreen. The prediction accuracy of our model was 86.60%. Both the induction method and the prediction model were validated through empirical studies using EEG signals from twenty-two participants.
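To make the idea concrete, the sketch below shows one plausible shape such a prediction pipeline could take: each touch-typing session is collapsed into a small feature vector (inter-key latency, typing speed, error-correction rate, touch pressure) and an off-the-shelf classifier is trained on labeled sessions. The feature set, the random-forest classifier, and the synthetic data are illustrative assumptions, not the model reported in the article.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def typing_features(timestamps, pressures, n_backspaces, n_chars):
    """Collapse one typing session into a fixed-length feature vector."""
    gaps = np.diff(timestamps)              # inter-key intervals (seconds)
    duration = timestamps[-1] - timestamps[0]
    return np.array([
        gaps.mean(),                        # mean inter-key latency
        gaps.std(),                         # latency variability
        n_chars / duration,                 # typing speed (chars/s)
        n_backspaces / n_chars,             # error-correction rate
        np.mean(pressures),                 # mean touch pressure
    ])

def synthetic_session(rng):
    """Stand-in for a logged typing session (hypothetical data)."""
    n_chars = rng.integers(40, 120)
    timestamps = np.cumsum(rng.exponential(0.25, size=n_chars))
    pressures = rng.uniform(0.2, 1.0, size=n_chars)
    n_backspaces = rng.integers(0, 10)
    return typing_features(timestamps, pressures, n_backspaces, n_chars)

rng = np.random.default_rng(0)
X = np.stack([synthetic_session(rng) for _ in range(200)])  # 200 sessions
y = rng.integers(0, 4, size=200)            # e.g., four induced affective states

clf = RandomForestClassifier(random_state=0)
print("CV accuracy: %.3f" % cross_val_score(clf, X, y, cv=5).mean())

One caveat when estimating accuracy this way: if sessions from the same participant fall in both training and test folds, cross-validated accuracy tends to be optimistic, so grouping folds by participant is a common safeguard.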
