Affect Recognition during Active Game Playing Based on Posture Skeleton Data

The affective state of a player during game playing has a significant effect on the player's motivation and engagement. Recognising a player's emotions during a game can help game designers improve the user experience by endowing the game characters and the system itself with more sophisticated behaviours. This paper presents work in progress towards the recognition of a player's emotions using posture skeleton data captured from non-intrusive interfaces. A database of non-acted posture skeleton samples was recorded during active game playing using the Microsoft Kinect sensor. Four observers were asked to annotate the selected postures with an emotion label from a given emotion set. Based on Cohen's kappa, the agreement level of the observers was 'good' or better, with overall agreement levels that outperform existing benchmarks. The data was used in a series of experiments to train the system to recognise emotions. The results indicate that the compiled database of emotion-labelled postures supports recognition of emotions considerably above chance level and raises interesting research questions for improvements and future directions in the area.
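As a rough illustration of the inter-observer agreement step described above, the sketch below computes pairwise Cohen's kappa over emotion annotations using scikit-learn. The observer names, emotion labels, and data shown are hypothetical placeholders, not values taken from the paper's database.

```python
# Illustrative sketch only: pairwise Cohen's kappa between observers'
# emotion labels for the same set of postures (hypothetical data).
from itertools import combinations
from sklearn.metrics import cohen_kappa_score

# Each observer's annotations, aligned by posture index (placeholder values).
annotations = {
    "observer_1": ["happy", "frustrated", "happy", "concentrating"],
    "observer_2": ["happy", "frustrated", "excited", "concentrating"],
    "observer_3": ["happy", "bored", "happy", "concentrating"],
    "observer_4": ["excited", "frustrated", "happy", "concentrating"],
}

# Report kappa for every pair of observers, then the overall mean.
scores = []
for (name_a, labels_a), (name_b, labels_b) in combinations(annotations.items(), 2):
    kappa = cohen_kappa_score(labels_a, labels_b)
    scores.append(kappa)
    print(f"{name_a} vs {name_b}: kappa = {kappa:.2f}")

print(f"mean pairwise kappa = {sum(scores) / len(scores):.2f}")
```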