The Effects of Eye Design on the Perception of Social Robots

Engagement with social robots is influenced by their appearance and shape. While robots are designed with many different features, almost all designs include some form of eyes. In this paper, we evaluate eye design variations for tabletop robots in a lab study, with the goal of learning how they influence participants' perception of the robots' personality and functionality. The evaluation is conducted with non-working “paper prototypes”, a common design methodology that enables quick evaluation of a variety of designs. By comparing sixteen eye designs, we found that: (1) the more lifelike the eye design, the higher the robot was rated on personable qualities, and the more suitable it was perceived to be for the home; and (2) eye design did not affect how professional the robot was perceived to be, or how suitable it was perceived to be for the office. We suggest that designers can use paper prototypes as a design methodology to quickly evaluate variations of a particular feature for social robots.
