Learning from Users: an Elicitation Study and Taxonomy for Communicating Small Unmanned Aerial System States Through Gestures

This paper presents a gesture set for communicating states from a small Unmanned Aerial System (sUAS) to novice users, derived from an elicitation study comparing gestures created by participants recruited from the general public with varying levels of sUAS experience. Previous work on sUAS flight paths sought to communicate intent, destination, or emotion without focusing on concrete states such as Low Battery or Landing. This elicitation study uses a participatory design approach from human-computer interaction to understand how novice users would expect an sUAS to communicate states, and ultimately suggests flight paths and characteristics to indicate those states. We asked members of the general public (N = 20) to create gestures for seven distinct sUAS states, providing insights for human-drone interaction and intuitive flight paths and characteristics, with the expectation that the sUAS would have general commercial application for inexperienced users. The results indicate relatively strong agreement scores for three sUAS states: Landing (0.455), Area of Interest (0.265), and Low Battery (0.245). The remaining four states have lower agreement scores, yet participants still showed some consensus for all seven states. The agreement scores and the associated gestures offer guidance for engineers developing a common set of flight paths and characteristics for an sUAS to communicate states to novice users.
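The reported agreement scores are consistent with the guessability agreement measure standard in gesture-elicitation work: for each referent (state), group identical gesture proposals and sum the squared fraction of participants in each group. A minimal sketch, assuming that formula; the gesture labels and grouping below are hypothetical, chosen only to illustrate how a Landing score of 0.455 could arise from 20 participants:

```python
from collections import Counter

def agreement_score(proposals):
    """Agreement score for one referent: for each group of identical
    proposals, add (group size / total proposals) squared."""
    total = len(proposals)
    return sum((count / total) ** 2 for count in Counter(proposals).values())

# Hypothetical grouping of 20 participants' Landing gestures.
# Group sizes 13, 2, 2, 2, 1 yield (169 + 4 + 4 + 4 + 1) / 400 = 0.455.
landing = (["steady_descent"] * 13 + ["hover_then_drop"] * 2
           + ["spiral_down"] * 2 + ["tilt_forward"] * 2 + ["other"])
print(round(agreement_score(landing), 3))  # → 0.455
```

A score of 1.0 means every participant proposed the same gesture; scores near 1/total indicate no consensus, which is why even the "low" scores here can still reflect partial agreement.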
