Boredom Recognition Based on Users' Spontaneous Behaviors in Multiparty Human-Robot Interactions

Recognizing boredom in users who interact with machines is valuable for improving the user experience in long-term human-machine interaction, especially in intelligent tutoring systems, health-care systems, and social assistants. This paper proposes a two-stage framework and feature design for boredom recognition in multiparty human-robot interactions. In the first stage, the framework detects boredom-indicating user behaviors from skeletal data obtained by motion capture; in the second stage, it recognizes boredom by combining the detection results with two types of multiparty information: gaze direction toward other participants and the entry and exit of participants. Experiments confirmed the effectiveness of both the proposed framework and the multiparty information: compared with a simple baseline method, the proposed framework improved the F1 score by 35 percentage points.
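The two-stage pipeline described above can be illustrated with a minimal sketch. The code below is a hypothetical reconstruction, not the paper's implementation: the feature dimensions, the choice of classifiers (a random forest for stage 1, logistic regression for stage 2), and all variable names are assumptions, and synthetic data stands in for the motion-capture and multiparty features.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# --- Synthetic stand-ins for the paper's data (shapes are assumptions) ---
# Stage 1 input: per-frame skeletal features from motion capture
# (e.g., joint positions/angles flattened into a vector).
n_frames, n_skeletal_feats = 1000, 30
X_skeletal = rng.normal(size=(n_frames, n_skeletal_feats))
y_behavior = rng.integers(0, 2, size=n_frames)   # 1 = boredom-indicating behavior

# Stage 2 extra input: multiparty information per frame.
# Column 0: gaze directed toward another participant (binary).
# Column 1: a participant recently entered or left (binary).
X_multiparty = rng.integers(0, 2, size=(n_frames, 2)).astype(float)
y_boredom = rng.integers(0, 2, size=n_frames)    # ground-truth boredom label

# --- Stage 1: detect boredom-indicating behaviors from skeletal data ---
stage1 = RandomForestClassifier(n_estimators=100, random_state=0)
stage1.fit(X_skeletal, y_behavior)
behavior_score = stage1.predict_proba(X_skeletal)[:, [1]]  # detection confidence

# --- Stage 2: recognize boredom from detections + multiparty information ---
X_stage2 = np.hstack([behavior_score, X_multiparty])
stage2 = LogisticRegression()
stage2.fit(X_stage2, y_boredom)
boredom_pred = stage2.predict(X_stage2)
```

The design point the sketch captures is that stage 2 consumes stage 1's detection confidence rather than the raw skeletal features, so the multiparty cues (gaze and participant entry/exit) can reinforce or override the behavior-level evidence.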
