Use of 3D laser radar for navigation of unmanned aerial and ground vehicles in urban and indoor environments

This paper discusses the integration of inertial measurements with measurements from a three-dimensional (3D) imaging sensor for position and attitude determination of unmanned aerial vehicles (UAVs) and autonomous ground vehicles (AGVs) in urban or indoor environments. To enable operation of UAVs and AGVs at any time and in any environment, a Precision Navigation, Attitude, and Time (PNAT) capability is required that is robust and not solely dependent on the Global Positioning System (GPS). In urban and indoor environments, a GPS position capability may be unavailable not only due to shadowing, significant signal attenuation, or multipath, but also due to intentional denial or deception. Although deep integration of GPS and Inertial Measurement Unit (IMU) data may prove to be a viable solution, an alternative method is discussed in this paper. The alternative solution is based on 3D imaging sensor technologies such as flash ladar (laser radar). Flash ladar technology consists of a modulated laser emitter coupled with a focal plane array detector and the required optics. Like a conventional camera, this sensor creates an "image" of the environment; however, instead of producing a 2D image in which each pixel has an associated intensity value, the flash ladar generates an image in which each pixel has associated range and intensity values. Integration of the flash ladar with the attitude from the IMU allows creation of a 3D scene. Current low-cost flash ladar technology is capable of greater than 100 x 100 pixel resolution with 5 mm depth resolution at a 30 Hz frame rate. The proposed algorithm first converts the 3D imaging sensor measurements to a 3D point cloud; next, significant environmental features such as planar features (walls), line features, or point features (corners) are extracted and associated from one 3D imaging sensor frame to the next. Finally, characteristics of these features, such as their normal or direction vectors, are used to compute the platform position and attitude changes. These "delta" positions and attitudes are then used to calibrate the IMU. Note that the IMU is not only required to form the point cloud of the environment expressed in the navigation frame, but also to perform association of the features from one flash ladar frame to the next. This paper discusses the performance of the proposed 3D imaging sensor feature extraction, position change estimator, and attitude change estimator using both simulator data and data collected from a moving platform in an indoor environment. The former consists of data from a simulated IMU and flash ladar installed on an aerial vehicle for various trajectories through an urban environment. The latter consists of measurements from a CSEM SwissRanger 3D imaging sensor and a MicroStrain low-cost IMU. Data were collected on a manually operated aerial vehicle inside the Ohio University School of Electrical Engineering and Computer Science building.
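The abstract does not reproduce the estimator equations, so the sketch below is only an illustration of the kind of closed-form delta-pose computation it describes, assuming each associated planar feature (wall) is parameterized by a unit normal n_i and offset d_i with n_i . x = d_i, and that coordinates in consecutive frames are related by x_curr = R x_prev + t (the platform's position and attitude change follow by inverting this transform). The rotation step uses the SVD-based Kabsch solution of the orthogonal Procrustes problem (Horn's unit-quaternion method gives the same result); the translation step solves the plane-offset equations in a least-squares sense. Function names here are illustrative, not from the paper.

```python
import numpy as np

def attitude_change(n_prev, n_curr):
    """Closed-form rotation R with n_curr[i] ~= R @ n_prev[i].

    n_prev, n_curr: (N, 3) arrays of associated unit plane normals
    from consecutive flash-ladar frames. Kabsch/SVD solution of the
    orthogonal Procrustes problem (illustrative, not the paper's code).
    """
    H = n_prev.T @ n_curr                      # 3x3 correlation matrix
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) in the SVD solution.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    return Vt.T @ D @ U.T

def position_change(n_curr, d_prev, d_curr):
    """Least-squares translation t from plane-offset changes.

    Each associated plane contributes one scalar equation
    n_curr[i] . t = d_curr[i] - d_prev[i]; at least three planes with
    linearly independent normals are needed to observe all of t.
    """
    t, *_ = np.linalg.lstsq(n_curr, d_curr - d_prev, rcond=None)
    return t

# Synthetic check: three orthogonal walls, a 5-degree yaw, and a
# small translation between frames.
yaw = np.deg2rad(5.0)
R_true = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                   [np.sin(yaw),  np.cos(yaw), 0.0],
                   [0.0,          0.0,         1.0]])
t_true = np.array([0.2, -0.1, 0.05])
n_prev = np.eye(3)                             # normals in frame k
n_curr = n_prev @ R_true.T                     # rotated normals in frame k+1
d_prev = np.array([2.0, 3.0, 1.5])
d_curr = d_prev + n_curr @ t_true              # offsets shift along t
assert np.allclose(attitude_change(n_prev, n_curr), R_true)
assert np.allclose(position_change(n_curr, d_prev, d_curr), t_true)
```

In a full pipeline of the kind the abstract outlines, the IMU attitude would first be used to express both frames' normals in a common navigation frame and to gate the frame-to-frame feature association, after which the recovered delta positions and attitudes feed back to calibrate the IMU.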
