SubjectBook: Hypothesis-Driven Ubiquitous Visualization for Affective Studies

Analyzing affective studies is challenging because they feature multimodal data, such as psychometric scores, imaging sequences, and signals from wearable sensors, with the latter streaming continuously for hours on end. Meaningful visual representations of such data can greatly facilitate insights and qualitative analysis. Various tools have been proposed to tackle this problem, but they visualize only the original data; they do not support higher-level abstractions. In this paper, we introduce SubjectBook, an interactive web-based tool for synchronizing, visualizing, exploring, and analyzing affective datasets. Uniquely, SubjectBook operates at three levels of abstraction, mirroring the stages of quantitative analysis in hypothesis-driven research. The top level uses a grid visualization to show the study's significant outcomes across subjects. The middle level summarizes, for each subject, context information along with the explanatory and response measurements in a construct reminiscent of an ID card, enabling the analyst to appreciate within-subject phenomena. Finally, the bottom level brings together detailed information about the inner and outer state of human subjects along with their real-world interactions, a visualization fusion that supports cause-and-effect reasoning at the experimental-session level. SubjectBook was evaluated in a case study focused on driving behaviors.
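To make the three-level design concrete, the following is a minimal sketch of the data structures such a tool might use, together with a helper that resamples heterogeneous sensor streams onto a common clock for synchronized display. All names here (SessionView, SubjectCard, StudyGrid, resample) are hypothetical illustrations; the paper does not prescribe an implementation. TypeScript is used because SubjectBook is web-based.

```typescript
// Hypothetical data model for the three levels of abstraction.
// All names are illustrative; the paper does not specify an API.

/** Bottom level: a synchronized, timestamped multimodal session record. */
interface SessionView {
  subjectId: string;
  // Each channel is a series of (epoch-ms timestamp, value) samples,
  // e.g. a wearable-sensor signal or driving speed.
  channels: Map<string, Array<[number, number]>>;
  annotations: Array<{ start: number; end: number; label: string }>;
}

/** Middle level: an "ID card" summarizing one subject. */
interface SubjectCard {
  subjectId: string;
  context: Record<string, string>;     // e.g. demographics, condition order
  explanatory: Record<string, number>; // summarized explanatory measurements
  response: Record<string, number>;    // summarized response measurements
}

/** Top level: grid of significant outcomes across all subjects. */
interface StudyGrid {
  subjects: string[];
  measures: string[];
  // cells[i][j] = effect for subjects[i] on measures[j], or null if not significant.
  cells: Array<Array<number | null>>;
}

/**
 * Resample one channel onto a common clock via linear interpolation,
 * so heterogeneous sensor streams can share a single timeline.
 * Assumes `series` is non-empty and sorted by timestamp.
 */
function resample(series: Array<[number, number]>, clock: number[]): number[] {
  return clock.map((t) => {
    const i = series.findIndex(([ts]) => ts >= t);
    if (i === -1) return series[series.length - 1][1]; // past the end: hold last value
    if (i === 0) return series[0][1];                  // before the start: hold first value
    const [t0, v0] = series[i - 1];
    const [t1, v1] = series[i];
    return v0 + ((v1 - v0) * (t - t0)) / (t1 - t0);
  });
}
```

Holding the boundary values rather than extrapolating avoids spurious artifacts at the edges of a recording, which matters when sensor streams of different lengths are aligned on one timeline.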
