Development of character robots for human-robot mutual communication

This paper describes a robot-head system that serves as a communication device for human-robot interaction. Most robotic systems with natural modalities include facial expression functions, since facial expressiveness, along with prosodic expressiveness, is regarded as a key component in developing personal attachment. Most conventional facial robots have adopted Ekman's Facial Action Coding System (FACS). However, mechanical constraints make it difficult to implement the FACS model completely in a facial robot. In the first part of this paper, we introduce a character robot, CRF2, which has richer facial expressions than the first prototype character robot, CRF1, because its facial design adds an eyelid mechanism and is based on a 3D deformation model. However, because the recognition rates of some expressions, such as disgust and fear, did not improve much, we concluded that a character robot needs other communication channels, such as voice, to convey its emotional state properly to the user, in addition to a redesigned mechanical face. We therefore developed a renovated character robot, CRF3. In addition to facial expressiveness, CRF3 provides speech synthesis and neck motions. Furthermore, CRF3 has visual, auditory, and tactile sensors, and an expandable configuration that allows additional sensor or actuator modules to be connected to the system. To evaluate applications of the CRF series, we applied CRF3 to a home environment as a home network manager that selects tasks based on the robot's mood.
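
The abstract states that the home network manager selects tasks based on the robot's mood. Below is a minimal, hypothetical sketch of such mood-weighted task arbitration; the task names, mood scale, and scoring formula are assumptions for illustration, not the authors' method.

```python
# Minimal sketch (not the authors' implementation): a mood-weighted task
# selector of the kind the abstract describes for the home network manager.
# Task names, the mood scale, and the weights are hypothetical.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    urgency: float          # 0..1, how pressing the task is
    mood_affinity: float    # 0..1, how well the task suits a positive mood

def select_task(tasks, mood):
    """Pick the task with the highest score given the robot's mood (0..1)."""
    def score(t):
        # Blend task urgency with how well the task matches the current mood.
        return t.urgency * (1.0 - mood) + t.mood_affinity * mood
    return max(tasks, key=score)

tasks = [
    Task("report intruder alert", urgency=0.9, mood_affinity=0.2),
    Task("chat with the user",    urgency=0.2, mood_affinity=0.9),
    Task("adjust room lighting",  urgency=0.5, mood_affinity=0.5),
]

print(select_task(tasks, mood=0.8).name)  # a cheerful mood favors interaction
```

Under these assumptions, a high mood value biases the robot toward social, interaction-oriented tasks, while a low mood value lets urgent housekeeping tasks dominate.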
