Development and Evaluation of a Chase View for UAV Operations in Cluttered Environments

Civilian applications for UAVs will bring these vehicles into low-altitude areas cluttered with obstacles such as buildings, trees, power lines, and, more importantly, civilians. Given the high accident rate of UAVs, civilian use will carry substantial risk unless systems and protocols are designed that can prevent UAV accidents, better train operators, and augment pilot performance. This paper presents two methods for generating a chase view for the pilot during UAV operations in cluttered environments. The chase view gives the operator a virtual view from behind the UAV during flight, produced by generating a virtual representation of the vehicle and its surrounding environment and integrating it with the real-time onboard camera images. Method I uses a real-time mapping approach to generate the surrounding environment, while Method II uses a prior model of the operating environment. Experimental results are presented from tests in which subjects flew in an HO-scale environment using a 6-DOF gantry system. Results showed that the chase view improved UAV operator performance over the traditional onboard camera view.
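The core of any chase view is placing a virtual camera behind and above the vehicle along its heading and aiming it at the vehicle, so the rendered environment and the onboard imagery share a common frame. As a minimal sketch of that idea (the offsets `back` and `up` and the function name are illustrative assumptions, not values from the paper):

```python
import math

def chase_camera_pose(uav_pos, uav_yaw, back=5.0, up=2.0):
    """Compute a chase-view camera pose from a UAV pose.

    uav_pos : (x, y, z) position of the UAV in world coordinates.
    uav_yaw : heading angle in radians (0 = +x axis).
    back    : distance behind the UAV along its heading (illustrative).
    up      : height of the camera above the UAV (illustrative).

    Returns the camera position and a unit look-at vector pointing
    from the camera toward the UAV.
    """
    x, y, z = uav_pos
    # Step back along the heading and raise the viewpoint.
    cam = (x - back * math.cos(uav_yaw),
           y - back * math.sin(uav_yaw),
           z + up)
    # Unit vector from the camera toward the vehicle.
    dx, dy, dz = x - cam[0], y - cam[1], z - cam[2]
    n = math.sqrt(dx * dx + dy * dy + dz * dz)
    return cam, (dx / n, dy / n, dz / n)
```

In a full system this pose would feed the renderer of the virtual environment (built online in Method I, or loaded from a prior model in Method II), with the live onboard camera image composited into the same view.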
