Assessing User Experience via Biometric Sensor Affect Detection

Traditional user experience assessments rely on self-report, human-system performance, and observational data, all of which capture users' psychological demands, processing, and affect only incompletely. Self-report measures require users to identify and articulate subjective responses to product features, yet users may lack accurate awareness of those responses or be unwilling or unable to articulate them. Similarly, human-system performance and observational measures require analysts to infer hidden psychological states from observed external behavior. This chapter discusses how biometric sensor-based affect detection technologies (e.g., eye tracking and EEG) may supplement traditional methods. By measuring biometric indicators of psychological states, researchers can gain potentially richer and more accurate insights into user experience. These technologies are already gaining traction in educational technology research and development, so extending them to usability and user experience evaluation is highly feasible.
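
To make the EEG side of this concrete, the sketch below computes the classic beta / (alpha + theta) band-power ratio, a widely used task-engagement index, over successive epochs of a single-channel recording. This is a minimal illustration under stated assumptions, not the chapter's method: the sampling rate, epoch length, band edges, and the simulated stand-in signal are all choices made for the example.

```python
# Minimal sketch: a beta / (alpha + theta) EEG engagement index.
# All parameters (sampling rate, epoch length, band edges) are
# illustrative assumptions, and the "recording" is simulated noise.
import numpy as np
from scipy.signal import welch

FS = 256  # assumed sampling rate in Hz

def band_power(signal, fs, low, high):
    """Average power of `signal` within the [low, high] Hz band,
    estimated from a Welch power spectral density."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= low) & (freqs <= high)
    return np.trapz(psd[mask], freqs[mask])

def engagement_index(signal, fs=FS):
    """Classic beta / (alpha + theta) engagement ratio for one epoch."""
    theta = band_power(signal, fs, 4, 8)
    alpha = band_power(signal, fs, 8, 13)
    beta = band_power(signal, fs, 13, 30)
    return beta / (alpha + theta)

# Example: score consecutive 10-second epochs of a one-minute recording.
rng = np.random.default_rng(0)
recording = rng.standard_normal(FS * 60)   # stand-in for real EEG data
epochs = recording.reshape(-1, FS * 10)    # six 10-second epochs
scores = [engagement_index(epoch) for epoch in epochs]
print([f"{s:.2f}" for s in scores])
```

In practice the raw signal would first be cleaned of blink and movement artifacts, and the resulting scores would be interpreted alongside, not instead of, self-report and performance data.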
