Exploring 3D gesture metaphors for interaction with unmanned aerial vehicles

We present a study exploring upper-body 3D spatial interaction metaphors for control of and communication with unmanned aerial vehicles (UAVs) such as the Parrot AR Drone. We discuss the design and implementation of five interaction techniques using the Microsoft Kinect, based on metaphors inspired by UAVs, to support a variety of flying operations a UAV can perform. Techniques include a first-person interaction metaphor, in which the user takes a pose like a winged aircraft; a game controller metaphor, in which the user's hands mimic the control movements of console joysticks; "proxy" manipulation, in which the user imagines manipulating the UAV as if it were in their grasp; and a pointing metaphor, in which the user assumes the identity of a monarch and commands the UAV as such. We examine qualitative metrics such as perceived intuitiveness, usability, and satisfaction, among others. Our results indicate that novice users prefer certain 3D spatial techniques over the smartphone application bundled with the AR Drone. We also discuss trade-offs among the techniques' design metrics based on results from our study.
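To give a concrete sense of how such a metaphor can be implemented, the sketch below shows one plausible mapping for the first-person "winged aircraft" pose: the height difference between the user's hands drives a roll command, and the average hand height relative to the shoulders drives pitch. This is a minimal illustration, not the paper's actual implementation; the joint positions, deadzone, and gain values are assumptions standing in for Kinect skeleton data.

```python
# Hypothetical sketch of the first-person metaphor: outstretched arms
# act like aircraft wings. Joints are plain (x, y, z) tuples standing
# in for Microsoft Kinect skeleton-tracking output; deadzone and gain
# are illustrative tuning parameters, not values from the study.

def roll_command(left_hand, right_hand, deadzone=0.05, gain=2.0):
    """Map the height difference between hands to a roll in [-1, 1]."""
    dy = left_hand[1] - right_hand[1]   # positive -> left hand higher
    if abs(dy) < deadzone:              # ignore small postural jitter
        return 0.0
    return max(-1.0, min(1.0, gain * dy))

def pitch_command(left_hand, right_hand, shoulder_center,
                  deadzone=0.05, gain=2.0):
    """Map average hand height relative to the shoulders to pitch in [-1, 1]."""
    avg_y = (left_hand[1] + right_hand[1]) / 2.0
    dy = avg_y - shoulder_center[1]     # positive -> hands above shoulders
    if abs(dy) < deadzone:
        return 0.0
    return max(-1.0, min(1.0, gain * dy))
```

In a real system these normalized commands would be sent each frame to the vehicle's control API, and the same pattern (joint geometry, deadzone, clamped gain) extends naturally to the game-controller and proxy metaphors with different joint pairs.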
