Rover Localization for Tube Pickup: Dataset, Methods and Validation for Mars Sample Return Planning

The Mars 2020 rover mission is intended to collect samples that will be stored in metal tubes and left on the surface of Mars, for possible retrieval and return to Earth by a future mission. In the proposed Mars Sample Return (MSR) campaign concept, a follow-up mission would collect the sample tubes and load them into a Mars Ascent Vehicle to be launched into orbit for subsequent transfer and return to Earth. In this work, we study the problem of autonomous tube localization and pickup by a “Fetch” rover during the MSR campaign. This is a challenging problem because, over time, the sample tubes may become partially covered by dust and sand, making it difficult to recover their pose by direct visual observation. We propose an indirect approach, in which the Fetch rover localizes itself relative to a map built from Mars 2020 images. The map encodes the positions of rocks that are sufficiently tall not to be affected by sand drifts. Because we are confident that the tubes will remain immobile until Fetch arrives, their pose within the Mars 2020 map can be used to plan pickup maneuvers without directly observing the tubes in Fetch images. To support this approach, we present a dataset of 4160 images collected from two sets of stereo cameras placed at thirteen different view angles, two heights above the ground, and two distances from a tube, under five different lighting conditions, and ground-truthed with a motion-capture setup. This dataset allows us to quantify the sensitivity of terrain-relative tube localization with respect to lighting conditions and camera pose.
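At its core, the indirect approach reduces to a frame transformation: once Fetch has estimated its own pose in the Mars 2020 map, the stored map-frame tube pose can be re-expressed in the rover frame for pickup planning. The sketch below illustrates this composition with homogeneous 4x4 transforms; all names (`T_map_rover`, `T_map_tube`, etc.) are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def pose_to_matrix(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def tube_pose_in_rover_frame(T_map_rover, T_map_tube):
    """Express the map-frame tube pose in the rover frame:
    T_rover_tube = inv(T_map_rover) @ T_map_tube."""
    return np.linalg.inv(T_map_rover) @ T_map_tube

# Hypothetical example: rover at (2, 0, 0) in the map frame, axes aligned
# with the map; tube lying at (3, 1, 0).
R_identity = np.eye(3)
T_map_rover = pose_to_matrix(R_identity, np.array([2.0, 0.0, 0.0]))
T_map_tube = pose_to_matrix(R_identity, np.array([3.0, 1.0, 0.0]))

T_rover_tube = tube_pose_in_rover_frame(T_map_rover, T_map_tube)
# The tube is 1 m ahead and 1 m to the left of the rover.
print(T_rover_tube[:3, 3])
```

Any error in the terrain-relative localization estimate `T_map_rover` propagates directly into the planned grasp pose, which is why the dataset's sensitivity analysis over lighting and camera pose matters.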
