Vision-guided manipulation for planetary robotics: position control

Manipulation systems for planetary exploration operate under severe restrictions: they must integrate vision and manipulation to achieve the reliability, safety, and predictability required of expensive systems operating on remote planets, must run on very modest hardware shared with many other subsystems, and must operate without human intervention. Such systems typically employ calibrated stereo cameras and calibrated manipulators to achieve precision on the order of one centimeter for instrument placement activities. This paper presents three complementary approaches to vision-guided manipulation designed to robustly achieve high-precision manipulation. The approaches are described and compared, both in simulation and on hardware. In situ estimation and adaptation of the manipulator and/or camera models account for changes in the system configuration, ensuring consistent precision for the life of the mission. All three methods provide several-fold increases in manipulator positioning accuracy over the standard flight approach.
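The core idea behind visual end-effector position error compensation can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: the function name, the single-step correction, and all numeric values are assumptions. It shows only the basic principle that a residual between the calibrated arm model's predicted tool position and the stereo-observed tool position can be used to offset the next commanded target.

```python
def compensate(target, predicted, observed):
    """Shift a commanded Cartesian target by the vision-measured model error.

    target:    desired end-effector position from the planner (metres)
    predicted: position the calibrated kinematic model expects for the
               current joint angles
    observed:  position of the end-effector fiducial as triangulated by
               the calibrated stereo pair
    (All names and values here are illustrative, not flight code.)
    """
    # Residual between the arm model and what the cameras actually see.
    error = tuple(p - o for p, o in zip(predicted, observed))
    # Command an offset goal so the *observed* tool tip lands on target.
    return tuple(t - e for t, e in zip(target, error))

# Example: the model predicts the tool at (1.00, 0.50, 0.30) m, but
# stereo triangulation sees it at (1.01, 0.49, 0.30) m -- a 1 cm model
# error in x and y. The commanded goal is shifted to cancel it.
corrected = compensate(
    target=(1.20, 0.60, 0.25),
    predicted=(1.00, 0.50, 0.30),
    observed=(1.01, 0.49, 0.30),
)
print(tuple(round(c, 3) for c in corrected))  # → (1.21, 0.59, 0.25)
```

In practice the methods compared in the paper go further than this one-shot offset, e.g. by re-estimating camera or manipulator model parameters in situ rather than correcting a single target point.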
