Spatial Reasoning for Human-Robot Teams

This chapter presents research designed to study and improve an operator's ability to navigate or teleoperate a distant robot through the use of a robot intelligence architecture and a virtual 3D interface. To validate the architecture and the 3D interface, four user studies are presented that compare intelligence modes and interface designs in navigation and exploration tasks. Results from the user studies suggest that performance improves when the robot assumes some of the navigational responsibilities or when the interface presents spatial information relative to the robot's pose in the remote environment. The authors hope that understanding the roles of intelligence and interface design in remote robot operation will lead to improved human-robot teams that are useful in a variety of tasks.

This chapter by Bruemmer, Few, and Nielsen appears in Emerging Spatial Information Systems and Applications, edited by B. Hilton (copyright © 2007, Idea Group Inc.).

Introduction

Robots have been used in a variety of settings where human access is difficult, impractical, or dangerous. These settings include search and rescue, space exploration, toxic site cleanup, reconnaissance, patrols, and many others (Murphy, 2004). Often, when a robot is used in one of these conditions, the robot is distant from the operator; this is referred to as teleoperation. Ideally, robots could be useful members of a team because they can accomplish tasks that might be too difficult or impractical for a human to perform.
The potential for humans and robots to work as an effective team, however, is limited by the lack of an appropriate means for the operator to visualize the remote environment and how the robot fits within it. For example, several recent research efforts have investigated the human-robot interaction challenges of real-world operations, including search and rescue and remote characterization of high-radiation environments (Burke, Murphy, Coovert, & Riddle, 2004; Casper & Murphy, 2003; Murphy, 2004; Yanco, Drury, & Scholtz, 2004a). Across these disparate domains, researchers have noted that operators find it difficult to navigate a remote robot because they misunderstand, often with error, the robot's position and perspective within the remote environment. A primary reason for this difficulty is that, for the overwhelming majority of robotic operations, video remains the primary means of conveying information from the remote environment to the operator (Burke, Murphy, Rogers, Lumelsky, & Scholtz, 2004a). Woods, Tittle, Feil, and Roesler (2004) describe navigating a robot by video as attempting to drive while looking through a "soda straw" because of the camera's limited angular view. This limited view presents problems for robot teleoperation because obstacles outside the camera's field of view still pose navigational threats to the robot even though they are not visible to the operator. To alleviate such threats, current research at the Idaho National Laboratory (INL) aims to provide tools that support mixed-initiative control, in which both humans and robots are able to make decisions and take initiative to accomplish a task. The goal is to create a set of capabilities that permits robots to be viewed as trusted teammates rather than passive tools.
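The "soda straw" effect can be quantified with basic trigonometry. The sketch below, which assumes a pinhole camera model and an illustrative 45-degree horizontal field of view (the original text does not specify camera parameters), shows how little of the scene such a camera captures compared with a human's roughly 200-degree effective visual field:

```python
import math

def visible_width(fov_deg: float, distance_m: float) -> float:
    """Width of scene visible at a given distance for a camera with
    the given horizontal field of view (pinhole model, no distortion)."""
    return 2.0 * distance_m * math.tan(math.radians(fov_deg) / 2.0)

# A 45-degree camera sees only about 0.83 m of the scene 1 m ahead;
# obstacles just outside that strip remain invisible to the operator
# yet can still collide with the robot's body.
print(round(visible_width(45.0, 1.0), 2))  # → 0.83
```

This is only a geometric illustration of why off-camera obstacles threaten the robot, not a model taken from the chapter.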
If this is to happen, the robot as well as the human must be enabled to reason spatially about the task and environment. Furthermore, true teamwork requires a shared understanding of the environment and task between team members in order for them to understand each other's intentions (Dennett, 1981). The lack of an effective shared understanding has been a significant impediment to having humans and intelligent robots work together. In response to this challenge, the INL has developed a mixed-initiative robot control architecture that provides a framework for robot intelligence, environment mod…
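The idea of mixed-initiative control described above can be illustrated with a toy arbitration scheme. This is a minimal sketch, not the INL architecture: the `Command` fields, the single `autonomy` blending weight, and the linear blend are all assumptions made for illustration.

```python
from dataclasses import dataclass

@dataclass
class Command:
    speed: float  # forward speed, m/s
    turn: float   # turn rate, rad/s

def arbitrate(operator: Command, robot: Command, autonomy: float) -> Command:
    """Blend the operator's input with the robot's own (e.g. obstacle-
    avoidance) command. autonomy = 0.0 is pure teleoperation;
    autonomy = 1.0 is fully autonomous."""
    a = max(0.0, min(1.0, autonomy))  # clamp to [0, 1]
    return Command(
        speed=(1 - a) * operator.speed + a * robot.speed,
        turn=(1 - a) * operator.turn + a * robot.turn,
    )

# In a shared mode the robot can damp an unsafe forward speed and
# inject an avoidance turn while the operator still steers.
cmd = arbitrate(Command(0.8, 0.2), Command(0.0, -0.5), autonomy=0.5)
print(cmd)  # → Command(speed=0.4, turn=-0.15)
```

The design point is that initiative is continuous rather than binary: as `autonomy` rises, the robot's spatial reasoning contributes more to the final motion command without ever fully removing the operator from the loop.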

[1] D. A. Few, et al. Lessons learned from usability tests with a collaborative cognitive workspace for human-robot teams. Proceedings of the 2003 IEEE International Conference on Systems, Man and Cybernetics, 2003.

[2] K. Konolige, et al. Large-scale map-making. AAAI, 2004.

[3] J. Scholtz, et al. Beyond usability evaluation: Analysis of human-robot interaction at a major robotics competition, 2004.

[4] M. A. Goodrich, et al. Experiments in adjustable autonomy. Proceedings of the 2001 IEEE International Conference on Systems, Man and Cybernetics, 2001.

[5] J. Scholtz, et al. Human-robot interactions: Creating synergistic cyber forces, 2002.

[6] R. R. Murphy, et al. Human-robot interactions during the robot-assisted urban search and rescue response at the World Trade Center. IEEE Transactions on Systems, Man, and Cybernetics, Part B, 2003.

[7] R. L. Boring, et al. Shared understanding for collaborative control. IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans, 2005.

[8] R. R. Murphy, et al. Moonlight in Miami: A field study of human-robot interaction in the context of an urban search and rescue disaster response training exercise, 2004.

[9] H. Pashler. Coordinate frame for symmetry detection and object recognition. Journal of Experimental Psychology: Human Perception and Performance, 1990.

[10] H. R. Everett, et al. Enhancing functionality and autonomy in man-portable robots. SPIE Defense + Commercial Sensing, 2004.

[11] K. Konolige, et al. Incremental mapping of large cyclic environments. Proceedings of the 1999 IEEE International Symposium on Computational Intelligence in Robotics and Automation (CIRA), 1999.

[12] D. D. Dudenhoeffer, et al. Evaluation of supervisory vs. peer-peer interaction with human-robot teams. Proceedings of the 37th Annual Hawaii International Conference on System Sciences, 2004.

[13] H. P. Moravec. Sensor fusion in certainty grids for mobile robots. AI Magazine, 1988.

[14] A. Elfes. Sonar-based real-world mapping and navigation. IEEE Journal of Robotics and Automation, 1987.

[15] M. A. Goodrich, et al. Ecological displays for robot interaction: A new perspective. Proceedings of the 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2004.

[16] D. D. Woods, et al. Envisioning human-robot coordination in future operations. IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), 2004.

[17] J. Yen, et al. CAST: Collaborative Agents for Simulating Teamwork. IJCAI, 2001.

[18] H. A. Yanco, et al. Improved interfaces for human-robot interaction in urban search and rescue. Proceedings of the 2004 IEEE International Conference on Systems, Man and Cybernetics, 2004.

[19] H. A. Yanco, et al. "Where am I?" Acquiring situation awareness using a remote robot platform. Proceedings of the 2004 IEEE International Conference on Systems, Man and Cybernetics, 2004.

[20] D. C. Dennett. True believers: The intentional strategy and why it works, 1987.

[21] R. R. Murphy, et al. Human-robot interaction in rescue robotics. IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), 2004.

[22] N. J. Cooke, et al. Measuring team knowledge. Human Factors, 2000.

[23] V. J. Lumelsky, et al. Final report for the DARPA/NSF interdisciplinary study on human-robot interaction. IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), 2004.

[24] M. R. Endsley. Design and evaluation for situation awareness enhancement, 1988.