Collaborative Micro Aerial Vehicle Exploration of Outdoor Environments

Field personnel, such as soldiers, police SWAT teams, and first responders, face challenging, dangerous environments, often with little advance knowledge of their surroundings. Currently, this Intelligence, Surveillance & Reconnaissance (ISR) information is provided by satellite imagery and prior or second-hand experience. Although satellite imagery is the preferred method for gaining Situational Awareness (SA) of an outdoor environment, it has many shortcomings. The unclassified satellite imagery available to these field personnel consists of flat images with no elevation information and a fixed point of view. These maps are often outdated and, due to shadows and shading, give false impressions of the elevations and details of the environment. Critical features of buildings, such as doorways and windows, are hidden from view. Combined, these flaws often give field personnel a false mental model of their environment. Given that these personnel must simultaneously perform a primary task, such as finding a Person of Interest (POI), while also exploring the environment, an autonomous robot would allow them to better perform ISR and improve their SA in real time.

Recent efforts have led to the creation of Micro Aerial Vehicles (MAVs), a class of small Unmanned Aerial Vehicles (UAVs) with autonomous capabilities. At most a few feet in size, a MAV can hover in place, perform Vertical Take-Off and Landing, and easily rotate with a small sensor payload. The compact size and maneuverability of these vehicles make them well suited to highly localized ISR missions in which the MAV operator works within the same environment as the vehicle. Unfortunately, existing interfaces for MAVs ignore the needs of field operators, requiring bulky equipment and the operator's full attention. To collaboratively explore an environment with a MAV, an operator needs a mobile interface that supports divided attention.

To address this need, a Cognitive Task Analysis (CTA) was performed with the intended users of the interface to assess their needs, as well as the roles and functions a MAV could fulfill. From this CTA, a set of functional and information requirements was derived, outlining what an interface for exploring an environment with a MAV must provide. Based on these requirements, the Micro Aerial Vehicle Exploration of an Unknown Environment (MAV-VUE) interface was designed and implemented. Using MAV-VUE, operators can navigate the MAV via waypoints, which requires little attention. When an operator needs finer-grained control over the MAV's location and orientation, in order to obtain imagery or learn more about the environment, he or she can use the Nudge Control mode. Nudge Control uses Perceived First Order (PFO) control to let an operator effectively "fly" a MAV with no risk to the vehicle. PFO control, which was invented for MAV-VUE, employs a zeroth-order (position) feedback control loop to fly the MAV while presenting first-order (velocity) controls to the operator.

A usability study was conducted to evaluate MAV-VUE. Participants were shown a demonstration of the interface and given only three minutes of training before performing the primary task.
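The abstract describes PFO control only at a high level. As a concrete illustration, the following Python sketch shows one plausible realization of the idea: the operator's stick or tilt input is read as a first-order (velocity) command, but on each control tick it is converted into a small, bounded displacement of a position setpoint, which the MAV's own zeroth-order position-hold loop then tracks. Every name here (Setpoint, pfo_update, STEP_M, MAX_OFFSET_M) is an illustrative assumption, not part of MAV-VUE itself.

```python
# Illustrative sketch of Perceived First Order (PFO) control (names are
# assumptions, not from the MAV-VUE implementation). The operator "sees"
# a first-order (velocity) control: holding the stick moves the vehicle.
# Internally, input only ever nudges a zeroth-order position setpoint,
# which the MAV's position-hold loop tracks, so no unsafe rate command
# can ever reach the vehicle.
from dataclasses import dataclass


@dataclass
class Setpoint:
    x: float = 0.0        # metres east of the hover origin
    y: float = 0.0        # metres north of the hover origin
    heading: float = 0.0  # radians

# Hypothetical tuning constants.
STEP_M = 0.25        # setpoint travel per tick at full stick deflection
STEP_RAD = 0.10      # heading change per tick at full yaw deflection
MAX_OFFSET_M = 5.0   # setpoint may never leave this box around the origin


def clamp(v: float, lo: float, hi: float) -> float:
    return max(lo, min(hi, v))


def pfo_update(sp: Setpoint, stick_x: float, stick_y: float,
               yaw: float) -> Setpoint:
    """Interpret operator input (each axis in [-1, 1]) as a velocity,
    but realize it as a bounded displacement of the position setpoint."""
    return Setpoint(
        clamp(sp.x + stick_x * STEP_M, -MAX_OFFSET_M, MAX_OFFSET_M),
        clamp(sp.y + stick_y * STEP_M, -MAX_OFFSET_M, MAX_OFFSET_M),
        sp.heading + yaw * STEP_RAD,
    )


# Example: the operator holds the stick half-forward for three ticks.
sp = Setpoint()
for _ in range(3):
    sp = pfo_update(sp, stick_x=0.0, stick_y=0.5, yaw=0.0)
print(sp)  # Setpoint(x=0.0, y=0.375, heading=0.0)
```

Because the operator can only ever move a clamped position setpoint, rather than command velocities directly, over-control and communication lag cannot destabilize the vehicle, which is consistent with the abstract's claim of "no risk to the vehicle."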
