Zoomorphic Gestures for Communicating Cobot States

Communicating the robot state is vital to creating an efficient and trustworthy collaboration between humans and collaborative robots (cobots). Standard approaches to robot-to-human communication face difficulties in industrial settings, e.g., due to high noise levels or visibility requirements. Therefore, this letter presents zoomorphic gestures based on dog body language as a possible alternative for communicating the state of appearance-constrained cobots. For this purpose, we conduct a visual communication benchmark comparing zoomorphic gestures, abstract gestures, and light displays. We investigate these modalities with regard to intuitive understanding, user experience, and user preference. In a first user study (n = 93), we evaluate our proposed design guidelines for all visual modalities. A second user study (n = 214), constituting the benchmark, indicates that intuitive understanding and user experience are highest for both gesture-based modalities. Furthermore, zoomorphic gestures are considerably preferred over the other modalities. These findings indicate that zoomorphic gestures, with their playful nature, are especially suitable for novice users and may decrease initial inhibitions.
