Affective Computing Out-of-The-Lab: The Cost of Low Cost

Affective computing, with its potential to enhance human-computer interaction, is seeing expanding use in areas such as health care and the gaming industry. One obstacle to its widespread adoption is the high cost of biofeedback equipment: typical laboratory setups are expensive, putting them out of reach for many. This paper explores lower-cost alternatives to high-end laboratory solutions. Data from several recent studies, totalling over 200 hours of physiological recordings, are leveraged to compare higher- and lower-cost sensors for heart rate, electrodermal activity, facial action units, head movement, and eye movement, five of the most widely used bio-behavioural signals. The comparison shows that lower-cost solutions are not drop-in replacements. Electrodermal activity readings correlated at 0.62, but notable differences between reported heart rate readings were observed over short timescales. Head-tracking recordings shared moderate similarity (0.51), whereas eye-tracking recordings did not (0.18). Among facial action units, only those linked to smiling correlated significantly (around 0.48). These results should broaden the range of contexts in which biofeedback can be exploited, by informing the reader of the extent to which lower-cost solutions apply.
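The comparison described above hinges on aligning two sensor streams that run at different sampling rates before correlating them. The sketch below illustrates one plausible way to do this; the sampling rates, signal shapes, and the `pearson_similarity` helper are illustrative assumptions, not the paper's actual pipeline.

```python
import numpy as np

def pearson_similarity(signal_a, signal_b, fs_a, fs_b, duration_s):
    """Resample two sensor streams onto a common 1 Hz timeline and
    return their Pearson correlation coefficient.

    fs_a / fs_b are each stream's sampling rate in Hz; duration_s is
    the shared recording length in seconds. (Illustrative helper.)
    """
    t_common = np.arange(0.0, duration_s, 1.0)          # common 1 Hz grid
    t_a = np.arange(len(signal_a)) / fs_a               # per-stream timestamps
    t_b = np.arange(len(signal_b)) / fs_b
    a = np.interp(t_common, t_a, signal_a)              # linear resampling
    b = np.interp(t_common, t_b, signal_b)
    return float(np.corrcoef(a, b)[0, 1])

# Hypothetical example: a 4 Hz "laboratory" trace vs. a noisier 2 Hz
# "low-cost wearable" trace of the same underlying slow drift.
rng = np.random.default_rng(0)
t_lab = np.linspace(0, 60, 240)                         # 60 s at 4 Hz
lab = np.sin(t_lab / 10) + 0.05 * rng.standard_normal(t_lab.size)
t_wear = np.linspace(0, 60, 120)                        # 60 s at 2 Hz
wear = np.sin(t_wear / 10) + 0.3 * rng.standard_normal(t_wear.size)

r = pearson_similarity(lab, wear, fs_a=4.0, fs_b=2.0, duration_s=60)
```

Linear interpolation onto a coarse common grid is a simple choice; a real analysis would also need to correct clock offsets between devices before any correlation is meaningful.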
