Interactions and Training with Unmanned Systems and the Nintendo Wiimote

As unmanned systems continue to evolve and their presence becomes more prevalent, new methods are needed for training people to interact with these systems. Likewise, new interfaces must be developed to take advantage of the increasing capabilities of these platforms. However, the complexity of such interfaces must not grow in parallel with advancements in unmanned systems technology. A common form of human communication is the use of arm and hand gestures. Applying gesture-based communication to human-robot interaction may increase interface capabilities while yielding less complex, more natural, and more intuitive interfaces. In the context of military operations, hand and arm gestures (such as those listed in the Army Field Manual on Visual Signals, FM 21-60) may be used to communicate tactical information and instructions to robotic team members. We believe that a gesture-based interface provides a natural method for controlling unmanned systems and, by reusing standard gestures, reduces training time and costs for military personnel. The research presented here explores these hypotheses through interactions with unmanned systems using computer-mediated gesture recognition. The methodology employs the Nintendo Wii Remote Controller (Wiimote) to capture and classify one- and two-handed gestures that are mapped to an unmanned system command set. To ensure interoperability across multiple types of unmanned systems, our technology uses the Joint Architecture for Unmanned Systems (JAUS), an emerging standard that provides a hardware- and software-independent communication framework. In this paper, a system is presented that uses inexpensive, commercial off-the-shelf (COTS) technology for gesture input to control multiple types of unmanned systems. A detailed discussion of the technology is provided, with a focus on operator usability and training. Finally, to explore the efficacy of the interface, a usability study is presented in which participants perform a series of tasks to control an unmanned system using arm and hand gestures.
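The pipeline described above, accelerometer-based gesture classification mapped to an unmanned-system command set, can be sketched as follows. This is a minimal illustrative example, not the paper's implementation: the gesture names, thresholds, and command mnemonics (`COMMAND_MAP`, `classify_gesture`) are hypothetical stand-ins for the trained classifier and JAUS message set the paper actually uses.

```python
# Hypothetical sketch: classify a single normalized Wiimote accelerometer
# sample (in g) into a coarse static arm pose by axis thresholding, then
# map the pose to a command mnemonic. A real system would classify motion
# over a window of samples and emit JAUS messages instead of strings.

COMMAND_MAP = {
    "arm_raised": "HALT",          # illustrative gesture-to-command mapping
    "arm_forward": "MOVE_FORWARD",
    "arm_down": "RESUME",
}

def classify_gesture(ax, ay, az, threshold=0.6):
    """Return a coarse pose label for one 3-axis reading, or None
    when no axis dominates strongly enough to classify."""
    if az > threshold:       # dominant +z acceleration
        return "arm_raised"
    if ay > threshold:       # dominant +y acceleration
        return "arm_forward"
    if az < -threshold:      # dominant -z acceleration
        return "arm_down"
    return None              # no confident classification

def gesture_to_command(sample):
    """Map one accelerometer sample to a command, or None if the
    gesture was not recognized."""
    gesture = classify_gesture(*sample)
    return COMMAND_MAP.get(gesture) if gesture else None
```

Returning `None` for unrecognized input, rather than guessing a command, reflects a safety-oriented design choice for teleoperation: the unmanned system should act only on confidently classified gestures.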
