Utilizing Emoticons on Mobile Devices within ESM Studies to Measure Emotions in the Field

Assessing emotions in situ while people are using technology is a difficult endeavor. Several conceptions of emotion exist in HCI research, and with them several methodological approaches to measuring it. In this position paper we propose using emoticons on mobile devices within experience sampling method (ESM) studies to measure emotions in situ while the mobile device is in use. Because ESM studies typically demand considerable effort from participants, who are interrupted several times a day, it is especially important to have a means of capturing emotional states and responses quickly. We present a set of five emoticons that cover two dimensions of emotion (strength and arousal) within a single scale. We believe these emoticons give users an intuitive way to report their emotions on a mobile device during an ESM study. In a case study, we investigated the feasibility of emoticons as answer categories for questions about emotional states and feelings. We found that, in addition to the space-saving benefit of the emoticons, which matters when conducting mobile studies on small displays, the findings were not biased and participants reported a positive user experience with these question types. Furthermore, the usability of the emoticons was evaluated.
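
To make the idea concrete, the following is a minimal sketch (not the authors' implementation) of how a five-point emoticon scale that folds two emotion dimensions into one ordered set of answer options might be represented in an ESM questionnaire item. The glyphs and the strength/arousal values attached to each anchor are illustrative assumptions, not the scale from the paper.

```python
# Minimal sketch, assuming a five-anchor emoticon scale used as the
# answer options of a single ESM prompt. Glyphs and dimension values
# are hypothetical placeholders, not the scale described in the paper.
from dataclasses import dataclass
from datetime import datetime


@dataclass(frozen=True)
class EmoticonAnchor:
    glyph: str      # emoticon shown on the mobile display
    strength: int   # assumed position on the strength dimension (-2..+2)
    arousal: int    # assumed position on the arousal dimension (0..2)


# Hypothetical anchor set: one ordered scale, each anchor carrying
# values on both dimensions so a single tap answers the question.
SCALE = [
    EmoticonAnchor(":-((", -2, 2),
    EmoticonAnchor(":-(",  -1, 1),
    EmoticonAnchor(":-|",   0, 0),
    EmoticonAnchor(":-)",   1, 1),
    EmoticonAnchor(":-))",  2, 2),
]


@dataclass
class EsmResponse:
    question: str
    chosen_index: int
    timestamp: datetime


def record_response(question: str, chosen_index: int) -> EsmResponse:
    """Store one in-situ answer given by tapping one of the five emoticons."""
    if not 0 <= chosen_index < len(SCALE):
        raise ValueError("index must address one of the five emoticons")
    return EsmResponse(question, chosen_index, datetime.now())


if __name__ == "__main__":
    answer = record_response("How do you feel right now?", 3)
    anchor = SCALE[answer.chosen_index]
    print(anchor.glyph, anchor.strength, anchor.arousal)
```

The design point the sketch tries to capture is that a single tap on one emoticon yields a data point on both dimensions at once, which keeps the interruption short and the answer options small enough for a mobile screen.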
