UXmood - A Tool to Investigate the User Experience (UX) Based on Multimodal Sentiment Analysis and Information Visualization (InfoVis)

Evaluating User Experience (UX) is not a trivial task, and UX specialists have used a variety of tools to analyze data collected from user tests, which makes synchronizing the data difficult. This paper presents UXmood, a tool that condenses multiple distinct data types (audio, video, text, and eye-tracking) into a dashboard of coordinated visualizations to ease the analysis process and lets analysts manage several projects, each holding several logs of user interaction. The tool replays test sessions and combines different sentiment analysis techniques to suggest the user's sentiment at any given time during the tasks. The visualizations support brushing and details-on-demand interactions and are synchronized with a temporal slider, allowing analysts to freely inspect specific moments of the tests. In addition, applying sentiment analysis to the collected data may improve the qualitative analysis of UX.
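The abstract does not specify how the per-modality sentiment results are combined into a single suggestion at a given point of the session timeline. The sketch below is one minimal, hypothetical way to do it in Python: a confidence-weighted vote over the text, audio, and video readings that fall near the timestamp selected on the temporal slider. All names (ModalityReading, fuse_sentiment, WEIGHTS) are assumptions for illustration, not the tool's actual API.

```python
# Illustrative sketch (not from the paper): fusing per-modality sentiment
# labels into one suggested sentiment at a given test-session timestamp.
from dataclasses import dataclass
from collections import defaultdict
from typing import List

@dataclass
class ModalityReading:
    modality: str      # "text", "audio", or "video"
    timestamp: float   # seconds since the start of the session
    sentiment: str     # "positive", "neutral", or "negative"
    confidence: float  # classifier confidence in [0, 1]

# Hypothetical per-modality weights; the paper does not state how the
# individual classifiers are weighted against each other.
WEIGHTS = {"text": 1.0, "audio": 1.0, "video": 1.0}

def fuse_sentiment(readings: List[ModalityReading],
                   t: float, window: float = 2.0) -> str:
    """Suggest a sentiment at time t by a confidence-weighted vote over
    all modality readings that fall inside a small time window."""
    scores = defaultdict(float)
    for r in readings:
        if abs(r.timestamp - t) <= window:
            scores[r.sentiment] += WEIGHTS.get(r.modality, 1.0) * r.confidence
    if not scores:
        return "neutral"  # no evidence near t
    return max(scores, key=scores.get)

if __name__ == "__main__":
    session = [
        ModalityReading("text", 11.8, "negative", 0.7),
        ModalityReading("audio", 12.1, "negative", 0.6),
        ModalityReading("video", 12.4, "neutral", 0.9),
    ]
    # With the temporal slider set to t = 12 s, the dashboard would show
    # this fused suggestion next to the synchronized visualizations.
    print(fuse_sentiment(session, t=12.0))
```

A coordinated dashboard would re-run such a fusion whenever the slider moves, so every linked view (video replay, transcript, gaze plot) reflects the same suggested sentiment for the selected moment.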
