Real-time robust motion tracking using 3D point cloud for space debris removal

This paper presents a motion tracking system for space debris removal that uses 3D point cloud data. Real-time, robust motion tracking is essential for successful rendezvous and docking with a tumbling space debris target. Although various motion tracking methods have been developed for such missions, robustness is not guaranteed when a complete image of the target cannot be obtained within a sensor's limited field of view. We propose a real-time robust motion tracking method that focuses on the payload adapter ring, the attachment interface between a satellite and its launch vehicle. The method provides a target pose estimate even when sensor data on the tracking feature are limited, as in the case described above. Moreover, the method is designed to be simple enough for real-time computation on limited computational resources. Its practical capability was experimentally verified using a small satellite mock-up and compared with the conventional Iterative Closest Point (ICP) algorithm. The experimental results show that the proposed method performs real-time motion tracking with high accuracy even when the acquired image is partially missing. In this experiment, the proposed method increased the useful field of view for motion tracking by 66% compared with the ICP algorithm.
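The abstract does not spell out the algorithm, but the core idea of tracking a circular payload adapter ring from a partial point cloud can be illustrated with a short, hypothetical sketch: fit a plane to the ring points and then a circle within that plane, so the ring's center, normal, and radius (and hence a pose estimate) remain recoverable even when only an arc is visible. The function name `fit_ring_pose` and the least-squares plane/circle fit below are illustrative assumptions, not the authors' method.

```python
# Hypothetical sketch: estimate the pose (center + normal) of a circular
# adapter ring from a *partial* 3D point cloud via least-squares plane and
# circle fitting. This is NOT the paper's algorithm, only an illustration of
# why a ring feature constrains pose even when only an arc is visible.
import numpy as np

def fit_ring_pose(points: np.ndarray):
    """points: (N, 3) samples lying near the ring (possibly only an arc).
    Returns (center, normal, radius)."""
    centroid = points.mean(axis=0)
    # Plane normal = right singular vector with the smallest singular value.
    _, _, vt = np.linalg.svd(points - centroid, full_matrices=False)
    normal = vt[-1]
    # Project points onto the plane and fit a 2D circle (Kasa least squares):
    # x^2 + y^2 = 2*cx*x + 2*cy*y + (r^2 - cx^2 - cy^2)
    u, v = vt[0], vt[1]                               # in-plane basis vectors
    p2d = np.c_[(points - centroid) @ u, (points - centroid) @ v]
    A = np.c_[2 * p2d, np.ones(len(p2d))]
    b = (p2d ** 2).sum(axis=1)
    (cx, cy, c0), *_ = np.linalg.lstsq(A, b, rcond=None)
    radius = np.sqrt(c0 + cx ** 2 + cy ** 2)
    center = centroid + cx * u + cy * v
    return center, normal, radius

# Toy usage: a ~120-degree arc of a 0.6 m radius ring with sensor noise,
# i.e. only about one third of the ring falls inside the field of view.
rng = np.random.default_rng(0)
theta = rng.uniform(0.0, 2.1, 500)
arc = np.c_[0.6 * np.cos(theta), 0.6 * np.sin(theta), np.zeros_like(theta)]
arc += rng.normal(scale=0.002, size=arc.shape)        # 2 mm range noise
center, normal, radius = fit_ring_pose(arc)
print("center", center, "normal", normal, "radius", radius)
```

In a full tracking pipeline, such a per-frame ring estimate would presumably be fed to a pose filter and benchmarked against an ICP baseline, which is the comparison the experiments in the paper report.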
