EMYS—Emotive Head of a Social Robot

This paper presents the design, control, and emotion expression capabilities of the robotic head EMYS. A motion control system based on the Facial Action Coding System (FACS) is proposed. Building on this control system, six basic emotions are designed for the EMYS head. The proposed head expressions are verified in experiments with the participation of children aged 8–12. The results of the experiments, the perception of the proposed design, and the control system are discussed.
