Investigating Implicit Cues for User State Estimation in Human-Robot Interaction Using Physiological Measurements

Achieving and maintaining user engagement is a key goal of human-robot interaction. This paper presents a method for estimating a user's engagement state from physiological data, including galvanic skin response and skin temperature. In the reported study, physiological data were recorded while participants played a wire puzzle game moderated by either a simulated or an embodied robot, each exhibiting varying personalities. The resulting physiological signals were segmented and classified by their position within the trial using the K-Nearest Neighbors algorithm. We found it was possible to estimate the user's engagement state for trials of variable length with an accuracy of 84.73%. In future experiments, this ability would allow assistive robot moderators to estimate the user's likelihood of ending an interaction at any given point during the interaction. This knowledge could then be used to adapt the robot's behavior in an attempt to re-engage the user.
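The pipeline the abstract describes can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the window count, (mean, std) features, synthetic signal trends, and the use of scikit-learn's `KNeighborsClassifier` are all assumptions made for the example.

```python
# Hypothetical sketch of the described pipeline: segment physiological
# time series (GSR, skin temperature) into windows, label each window
# by its relative position within the trial, and classify with KNN.
# Segment count, features, and signal shapes are illustrative assumptions.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def segment_trial(gsr, temp, n_segments=4):
    """Split one trial's signals into equal segments; each segment's
    feature vector is (mean, std) per channel, labeled by position
    (0 = start of trial ... n_segments - 1 = end of trial)."""
    X, y = [], []
    for pos, (g, t) in enumerate(zip(np.array_split(gsr, n_segments),
                                     np.array_split(temp, n_segments))):
        X.append([g.mean(), g.std(), t.mean(), t.std()])
        y.append(pos)
    return X, y

# Synthetic variable-length trials: GSR drifts up and skin temperature
# drifts down over a trial, a stand-in for engagement-related trends.
X, y = [], []
for _ in range(200):
    n = int(rng.integers(80, 120))
    drift = np.linspace(0.0, 1.0, n)
    gsr = drift + 0.1 * rng.standard_normal(n)
    temp = 34.0 - drift + 0.1 * rng.standard_normal(n)
    Xi, yi = segment_trial(gsr, temp)
    X.extend(Xi)
    y.extend(yi)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
print(f"position-within-trial accuracy: {accuracy:.2f}")
```

Classifying a segment's position within the trial is what lets the method handle trials of variable length: each window is mapped to a relative stage of the interaction rather than an absolute timestamp.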
