A Multimodal Deep Log-Based User Experience (UX) Platform for UX Evaluation

User experience (UX) is an emerging field in user research and design, and developing UX evaluation methods remains a challenge for both researchers and practitioners. Various evaluation methods have been developed to capture accurate UX data; among them, the mixed-method approach of triangulation has gained prominence because it yields more accurate and precise information about the user while interacting with the product. However, triangulation requires skilled UX researchers and developers to integrate multiple devices, synchronize them, analyze the resulting data, and ultimately reach an informed decision. In this paper, a method and system for measuring the overall UX over time using a triangulation method are proposed. The proposed platform incorporates observational and physiological measurements in addition to traditional ones. It reduces subjective bias and validates the user's perceptions, as measured by different sensors, through objectification of the subjective nature of the user in the UX assessment. The platform additionally offers plug-and-play support for different devices and powerful analytics for gaining insight into the UX across multiple participants.
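A core step the abstract describes is synchronizing readings from multiple devices before analysis. The paper does not specify its synchronization algorithm; the following is a minimal hypothetical sketch of one common approach, grouping each sensor's timestamped readings into fixed time windows and averaging within each window so that heterogeneous streams (e.g., GSR, EEG) can be compared on a shared timeline. The function and stream names are illustrative, not from the paper.

```python
from statistics import mean


def align_streams(streams, window_s=1.0):
    """Align multiple sensor streams onto a shared timeline.

    streams: dict mapping a sensor name to a list of (timestamp, value)
             pairs, where timestamps are seconds (floats).
    window_s: width of each alignment window in seconds.

    Returns a list of dicts, one per window; each dict maps a sensor
    name to the mean of that sensor's readings inside the window.
    Sensors with no reading in a window are simply absent from it.
    """
    # Span the timeline from the earliest to the latest reading overall.
    start = min(t for readings in streams.values() for t, _ in readings)
    end = max(t for readings in streams.values() for t, _ in readings)
    n_windows = int((end - start) // window_s) + 1
    windows = [{} for _ in range(n_windows)]

    for name, readings in streams.items():
        # Bucket this sensor's readings by window index, then average.
        buckets = [[] for _ in range(n_windows)]
        for t, v in readings:
            buckets[int((t - start) // window_s)].append(v)
        for i, values in enumerate(buckets):
            if values:
                windows[i][name] = mean(values)
    return windows


# Example: a GSR stream sampled faster than an EEG-derived feature stream.
fused = align_streams(
    {
        "gsr": [(0.1, 2.0), (0.4, 4.0), (1.2, 6.0)],
        "eeg": [(0.2, 10.0), (1.5, 20.0)],
    },
    window_s=1.0,
)
# fused[0] holds both sensors' first-second averages; fused[1] the next.
```

In practice, a production platform would also handle clock drift between devices and differing start times (e.g., via a shared reference clock or cross-correlation), which this sketch deliberately omits.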
