Facial expressive robotic head system for human-robot communication and its application in home environment

This paper describes a robotic-head system as a multimodal communication device for human-robot interaction and its potential application in home environments. Most robotic systems for natural user interaction have facial expressions, since facial expressiveness, along with prosodic expressiveness, is regarded as a key component in developing personal attachment. The first part of the paper describes our robotic head system, the Character Robot Face (CRF). A deformation approach and a parametric normalization scheme are proposed to produce facial expressions on nonhuman face models with high recognition rates. In the second half of the paper, the CRF is endowed with artificial emotions and assigned tasks conceivable in home environments. A coordination mechanism between the robot's mood (an activated emotion) and its task is proposed so that, by referring to an emotion-task history, the robot can select a task suited to its current mood when there is no explicit task command from the user. When the robot performs a task, the emotion value associated with that task in the same history is boosted, making that emotion more likely to be activated in the future.
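
As an illustration only, the sketch below outlines one way such a mood-task coordination loop could be organized. It is not taken from the paper: the class name MoodTaskCoordinator, the boost and decay parameters, and the example emotion and task lists are all hypothetical assumptions standing in for the paper's unspecified emotion model and update rules.

```python
import random
from collections import defaultdict

# Hypothetical emotion and task sets; the paper's actual sets may differ.
EMOTIONS = ["joy", "sadness", "anger", "surprise", "fear", "disgust"]
TASKS = ["greet_user", "fetch_item", "play_music", "patrol_room"]


class MoodTaskCoordinator:
    """Minimal sketch of mood-task coordination (assumed structure)."""

    def __init__(self, boost=0.2, decay=0.95):
        self.emotion = {e: 0.0 for e in EMOTIONS}  # current emotion values
        self.history = defaultdict(float)          # (emotion, task) -> association strength
        self.boost = boost                         # emotion boost applied when a task runs
        self.decay = decay                         # per-step decay toward a neutral state

    def mood(self):
        """The activated emotion: the one with the largest current value."""
        return max(self.emotion, key=self.emotion.get)

    def select_task(self, user_command=None):
        """Obey an explicit command; otherwise pick the task most strongly
        associated with the current mood in the emotion-task history."""
        if user_command is not None:
            return user_command
        m = self.mood()
        scores = {t: self.history[(m, t)] for t in TASKS}
        best = max(scores.values())
        candidates = [t for t, s in scores.items() if s == best]
        return random.choice(candidates)

    def perform_task(self, task):
        """Strengthen the (mood, task) association and boost the emotion most
        associated with this task, so that emotion is more likely to activate."""
        m = self.mood()
        self.history[(m, task)] += 1.0
        linked = max(EMOTIONS, key=lambda e: self.history[(e, task)])
        self.emotion[linked] += self.boost
        for e in EMOTIONS:                         # gradual decay of all emotions
            self.emotion[e] *= self.decay


if __name__ == "__main__":
    robot = MoodTaskCoordinator()
    for _ in range(5):
        task = robot.select_task()                 # no explicit user command
        robot.perform_task(task)
        print(robot.mood(), task)
```

The design choice here is that mood selection and emotion boosting share a single emotion-task history, mirroring the coupling described in the abstract between task selection under a given mood and the reinforcement of that mood when the task is performed.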
