In many affective computing paradigms, a user’s internal state serves as an implicit control signal in an interaction. In the work presented here, we explore the use of two measurement techniques commonly employed to assess a user’s affective state as an explicit control signal for a navigation task in a virtual environment. Concretely, we investigate the feasibility of combining a real-time emotional biometric sensing system with a computer vision system for characterizing human emotion and controlling a computer game. A user’s “happiness” and “sadness” levels are assessed by combining information from a camera-based computer vision system with electromyogram (EMG) signals from the facial corrugator muscle. Using a purpose-designed 3D flight simulation game, users control their simulated up-down motion with their facial expressions. To assess whether combining visual and EMG data improves facial tracking performance, we conducted a user study in which users navigated the 3D virtual environment using the two control systems, trying to collect as many tokens as possible. We compared two conditions: the computer vision system alone, and the computer vision system combined with the EMG signal. The results show that combining both signals significantly increases users’ performance and reduces task difficulty. However, this performance gain comes at the cost of reduced usability, owing to the need to wear EMG sensors on the forehead. We hope these results can inform future game designs, aid the development of more immersive virtual environments, and offer alternative input methods where traditional methods are insufficient or unfeasible.
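The paper does not specify how the two signals are fused, but the idea of mapping a camera-based expression score and corrugator EMG amplitude onto a single up-down control value can be sketched as follows. This is a minimal illustration under assumed conventions: the function name, weights, and signal ranges (`vision_score` in [-1, 1] for smile vs. frown, `emg_rms` in [0, 1] after per-user calibration) are hypothetical, not the authors’ implementation.

```python
def fuse_control_signal(vision_score: float, emg_rms: float,
                        w_vision: float = 0.5, w_emg: float = 0.5) -> float:
    """Fuse a vision-based expression score with a normalized EMG
    amplitude into one vertical control value in [-1, 1].

    vision_score : float in [-1, 1]; positive for a smile ("happy",
                   climb), negative for a frown ("sad", descend),
                   as estimated by the camera-based tracker.
    emg_rms      : float in [0, 1]; RMS amplitude of the corrugator
                   EMG, normalized to the user's calibration range.
                   Corrugator activity accompanies frowning, so it
                   only pushes the control value downward here.
    """
    emg_score = -emg_rms                      # frown activity -> descend
    fused = w_vision * vision_score + w_emg * emg_score
    return max(-1.0, min(1.0, fused))         # clamp to [-1, 1]
```

A strong smile with a quiet corrugator (`fuse_control_signal(1.0, 0.0)`) yields a positive (climb) command, while a frown detected by both channels saturates at -1.0; a weighted sum keeps either sensor usable alone, which matches the study's vision-only versus vision-plus-EMG comparison.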