Common Data Fusion Framework: An open-source data fusion framework for space robotics

Multisensor data fusion plays a vital role in providing autonomous systems with the environmental information they need to function reliably. In this article, we summarize the modular structure of the newly developed and released Common Data Fusion Framework and explain how it is used. Within the framework, sensor data are registered and fused to produce comprehensive 3D environment representations and pose estimates. We give a complete overview of the framework and the software components proposed to model this process in a reusable manner, list the data fusion algorithms it provides, and exemplify the Common Data Fusion Framework approach with the case of 3D reconstruction from 2D images. The Common Data Fusion Framework has been deployed and tested in a range of scenarios, including planetary rover exploration and the tracking of orbiting satellites.
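
To make the idea of reusable fusion components more concrete, the following minimal C++ sketch shows what such a modular building block might look like. All names here (FusionNode, configure, process, MonocularReconstruction) are illustrative assumptions for this sketch and do not reflect the actual Common Data Fusion Framework API.

    // Hypothetical sketch of a modular data-fusion component, illustrating
    // the kind of reusable building block described in the abstract.
    // Type and method names are assumptions, not the real CDFF interface.
    #include <array>
    #include <iostream>
    #include <string>
    #include <vector>

    struct Image { int width; int height; std::vector<unsigned char> pixels; };
    struct Pose { double x = 0, y = 0, z = 0, qw = 1, qx = 0, qy = 0, qz = 0; };
    struct PointCloud { std::vector<std::array<float, 3>> points; };

    // A fusion node consumes sensor inputs and produces a fused product.
    // Keeping this interface narrow is what makes nodes reusable.
    class FusionNode {
    public:
        virtual ~FusionNode() = default;
        virtual bool configure(const std::string& configFile) = 0; // load parameters
        virtual void process() = 0;                                // one fusion step
    };

    // Illustrative node: incremental 3D reconstruction from a 2D image stream.
    class MonocularReconstruction : public FusionNode {
    public:
        bool configure(const std::string& configFile) override {
            std::cout << "loading parameters from " << configFile << "\n";
            return true;
        }
        void setInputImage(const Image& img) { current_ = img; }
        void process() override {
            // Placeholder for feature matching and triangulation against the
            // previous frame; here we only record that a frame was consumed.
            ++framesProcessed_;
        }
        const PointCloud& getPointCloud() const { return cloud_; }
        const Pose& getCameraPose() const { return pose_; }
    private:
        Image current_{};
        PointCloud cloud_;
        Pose pose_;
        int framesProcessed_ = 0;
    };

    int main() {
        MonocularReconstruction node;
        node.configure("reconstruction.conf");
        node.setInputImage(Image{640, 480, {}});
        node.process(); // fused outputs would now be available via the getters
    }

The design point illustrated is the narrow configure/process interface: because every node exposes the same two entry points, nodes can be composed into different fusion pipelines (for example, chaining feature extraction, matching, and triangulation) without changing the surrounding framework.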
