Expressing Certainty of a Mobile Robot with Artificial Subtle Expressions

In this paper, we describe h-ASE (hesitation-based Artificial Subtle Expression), a novel implementation of ASE in which a robot expresses its confidence in the advice it gives to a human. Confidence in advice is one of a robot's useful internal states, and developing a practical and inexpensive methodology to express it correctly is an important goal. To achieve this goal, we propose h-ASE, in which a robot hesitates by turning slowly toward a human before giving low-confidence advice. We conducted experiments with participants to verify the effectiveness of h-ASE and to investigate the influence of two independent variables: time-delay and slow-motion. We obtained promising results showing that the time-delay factor contributed significantly more to h-ASE than the slow-motion factor did.
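The h-ASE behavior described above can be sketched as a simple decision rule: when the robot's confidence falls below a threshold, it prepends a slow turn (the slow-motion factor) and a pause (the time-delay factor) to its utterance. The following is a minimal, hypothetical sketch; the function name, threshold, speeds, and delay values are illustrative assumptions, not parameters from the paper.

```python
def advise(confidence, advice, threshold=0.5,
           turn_speed_normal=1.0, turn_speed_slow=0.3, delay_s=2.0):
    """Sketch of h-ASE: hesitate before giving low-confidence advice.

    Returns a list of (action, argument) tuples a robot controller
    might execute in order. All numeric defaults are assumptions.
    """
    actions = []
    if confidence < threshold:
        # Low confidence: turn slowly (slow-motion factor), then pause
        # (time-delay factor) before speaking.
        actions.append(("turn_to_human", turn_speed_slow))
        actions.append(("wait", delay_s))
    else:
        # High confidence: turn at normal speed and answer immediately.
        actions.append(("turn_to_human", turn_speed_normal))
    actions.append(("say", advice))
    return actions
```

In this sketch the two experimental factors map onto separate action parameters, so each could be manipulated independently, as in the experiments the paper reports.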
