What Does Touch Tell Us about Emotions in Touchscreen-Based Gameplay?

Nowadays, more and more people play games on touchscreen mobile phones. This raises an interesting question: does touch behaviour reflect the player's emotional state? If so, touch behaviour would be a valuable evaluation indicator for game designers, as well as a basis for real-time personalization of the game experience. Psychology studies on acted touch behaviour show the existence of discriminative affective profiles. In this paper, finger-stroke features during gameplay on an iPod were extracted and their discriminative power analysed. A system was built and tested to recognize four emotional states (Excited, Relaxed, Frustrated and Bored) as well as two levels of arousal and two levels of valence. Discriminant Analysis of the collected data shows that pressure features discriminate frustration from the other three states; stroke speed and directness features discriminate between levels of arousal, whilst stroke length features mainly discriminate boredom from a relaxed state. Three machine learning algorithms were used to build a person-independent automatic emotion recognition system based on touch behaviour. All three algorithms discriminated between the four emotional states, reaching correct recognition rates between 69% and 77%. Higher rates (~89%) were obtained for discriminating between two levels of arousal and between two levels of valence. These results highlight the potential of touch behaviour as an unobtrusive way to measure users' emotional states in contexts where touch-based devices are used.
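The paper itself does not include code, but the per-stroke features it names (length, speed, directness and pressure) can be illustrated with a short sketch. The Python snippet below is a minimal illustration, assuming a hypothetical TouchSample record of (x, y, pressure, timestamp) per touch event; the field names, units and data layout are assumptions for illustration, not the authors' implementation.

```python
from dataclasses import dataclass
from math import hypot
from typing import Dict, List


@dataclass
class TouchSample:
    # Hypothetical per-event record; real touch APIs expose similar fields.
    x: float          # screen x coordinate (pixels)
    y: float          # screen y coordinate (pixels)
    pressure: float   # normalised contact pressure or touch size, 0..1
    t: float          # timestamp in seconds


def stroke_features(stroke: List[TouchSample]) -> Dict[str, float]:
    """Compute the kinds of per-stroke features the paper analyses:
    length, speed, directness and pressure statistics."""
    # Path length: sum of distances between consecutive samples.
    path_len = sum(
        hypot(b.x - a.x, b.y - a.y) for a, b in zip(stroke, stroke[1:])
    )
    duration = max(stroke[-1].t - stroke[0].t, 1e-6)
    # Directness: straight-line distance divided by path length
    # (1.0 for a perfectly straight stroke, lower for curved ones).
    straight = hypot(stroke[-1].x - stroke[0].x, stroke[-1].y - stroke[0].y)
    directness = straight / max(path_len, 1e-6)
    pressures = [s.pressure for s in stroke]
    return {
        "length": path_len,
        "speed": path_len / duration,
        "directness": directness,
        "pressure_mean": sum(pressures) / len(pressures),
        "pressure_max": max(pressures),
    }
```

Feature vectors of this kind, computed over labelled strokes, could then be fed to any standard classifier to reproduce the person-independent recognition step; the paper evaluates three such algorithms, which are not specified in this abstract.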
