Multimodal Sensing in Affective Gaming

A typical gaming scenario, as it has developed over the past 20 years, involves a player interacting with a game through a specialized input device, such as a joystick, a mouse, a keyboard or a proprietary game controller. Recent technological advances have enabled more elaborate approaches in which the player interacts with the game using body pose, facial expressions, actions and even physiological signals. The future lies in ‘affective gaming’, that is, games ‘intelligent’ enough not only to extract the player’s commands from speech and gestures, but also to detect behavioural cues and emotional states, and to adjust the game narrative accordingly in order to provide a more realistic and satisfying player experience. In this chapter, we review the area of affective gaming by describing existing approaches and discussing recent technological advances. More precisely, we first elaborate on the different sources of affect information in games and then proceed to issues such as the affective evaluation of players and affective interaction in games. We summarize existing commercial affective gaming applications and introduce new gaming scenarios. Finally, we outline some of the most important problems that must be tackled to create more realistic and efficient interactions between players and games, and conclude by highlighting the challenges such systems must overcome.
