PredGaze: An Incongruity Prediction Model for User's Gaze Movement

With the spread of digital signage and communication robots, digital agents have gradually become popular and are expected to become even more widespread. It is important for humans to notice the intentions of agents throughout their interactions. This paper focuses on the gaze behavior of an agent and on the phenomenon whereby, if an agent's gaze behavior differs from a human's expectations, the human feels a sense of incongruity and instinctively senses an intention behind the behavioral change. We propose PredGaze, a model that estimates this incongruity based on how far the agent's gaze behavior shifts from the human's expectations. In particular, PredGaze uses the variance in the agent behavior model to express how well the human has grasped the agent's behavioral tendency, and we expect this variance to improve the estimation of incongruity. PredGaze uses three variables to estimate the internal state of how much a human senses the agent's intention: error, confidence, and incongruity. To evaluate the effectiveness of PredGaze with these three variables, we conducted an experiment investigating how the timing of a change in gaze behavior affects perceived incongruity. The experimental results indicated significant differences in subjective scores of the agent's naturalness and of incongruity with the agent depending on the timing of the agent's change in its gaze behavior.
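
To make the abstract's three internal variables concrete, the following is a minimal illustrative sketch, not the paper's actual formulation: it assumes the human tracks the agent's gaze with a running mean and variance, treats error as the deviation from the expected gaze, confidence as the inverse of the observed variance, and incongruity as confidence-weighted error. The class name, the gaze feature (a single angle), and these specific formulas are all hypothetical choices made for illustration.

```python
class PredGazeSketch:
    """Hypothetical incongruity estimator in the spirit of PredGaze.

    The observer maintains a running model of the agent's gaze behavior
    (mean and variance of a gaze-direction angle). Internal variables:
      - error: deviation of the observed gaze from the expected (mean) gaze
      - confidence: how well the behavioral tendency has been grasped,
        expressed here as the inverse of the running variance (assumption)
      - incongruity: error weighted by confidence (assumption)
    """

    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # sum of squared deviations (Welford's online algorithm)

    def update(self, observed_gaze: float) -> dict:
        # Prediction error relative to the current expectation.
        error = abs(observed_gaze - self.mean) if self.n > 0 else 0.0

        # Variance of the agent's behavior observed so far.
        variance = self.m2 / (self.n - 1) if self.n > 1 else 1.0

        # Low variance -> the tendency is well grasped -> high confidence.
        confidence = 1.0 / (1.0 + variance)

        # Incongruity grows when a confident expectation is violated.
        incongruity = confidence * error

        # Update the running behavior model (Welford's update).
        self.n += 1
        delta = observed_gaze - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (observed_gaze - self.mean)

        return {"error": error, "confidence": confidence, "incongruity": incongruity}


if __name__ == "__main__":
    model = PredGazeSketch()
    # A stable gaze pattern followed by a sudden change in gaze direction.
    for angle in [10.0, 11.0, 9.5, 10.5, 10.0, 45.0]:
        state = model.update(angle)
        print(f"gaze={angle:5.1f}  error={state['error']:5.2f}  "
              f"confidence={state['confidence']:4.2f}  incongruity={state['incongruity']:5.2f}")
```

In this sketch the final, unexpected gaze angle produces a large incongruity precisely because the preceding behavior was stable (low variance, high confidence), which mirrors the intuition described in the abstract.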
