Integrating grasp planning and visual feedback for reliable manipulation

We present a vision-centric manipulation framework for reliably performing reach-and-grasp tasks in everyday environments. By combining grasp planning and visual feedback, and constantly accounting for sensor visibility, the framework can recover from sensor calibration errors and unexpected changes in the environment. Although many current robot systems include both planning and vision components, these components are typically treated independently, which limits the system's ability to make informed decisions. Our framework incorporates information from both modalities, in a data-driven way, during the planning and execution phases of the task. The planning phase generates a plan that moves the robot manipulator as close to the target object as is safely possible, while keeping the target easily detectable by the on-board sensors. The execution phase continuously selects and validates a grasp for the target while updating the environment model with more accurate information. We stress the importance of performing grasp selection during visual-feedback execution, when more precise information about the target's location and its surroundings is available. We evaluate our framework on several robot platforms in simulation.
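The two-phase structure described above can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: all function names, the scalar "pose" representation, and the scoring callbacks are assumptions introduced for clarity. The planning phase picks a pre-grasp pose that is safe and maximizes target visibility; the execution phase repeatedly re-senses the target and re-selects a grasp with the refined estimate before committing.

```python
# Hypothetical sketch of the framework's two phases. Poses are
# represented here as scalar distances to the target for simplicity;
# a real system would use full 6-DOF poses and a motion planner.

def plan_approach(target_estimate, candidates, visibility_fn, safety_fn):
    """Planning phase: among candidate pre-grasp poses, keep only the
    safe ones and return the pose giving the sensors the best view."""
    safe = [p for p in candidates if safety_fn(p)]
    if not safe:
        return None
    return max(safe, key=lambda p: visibility_fn(p, target_estimate))

def execute_with_feedback(sense_fn, grasp_candidates_fn, validate_fn,
                          max_iters=10):
    """Execution phase: repeatedly update the target estimate from the
    sensors, re-select a grasp with the more accurate information, and
    validate it (e.g. collision/reachability checks) before committing."""
    for _ in range(max_iters):
        target = sense_fn()                  # refined target estimate
        for grasp in grasp_candidates_fn(target):
            if validate_fn(grasp, target):   # only commit to a valid grasp
                return grasp
    return None                              # no valid grasp found

if __name__ == "__main__":
    # Toy planning phase: poses closer to the target see it better,
    # but poses under 0.2 m are considered unsafe.
    candidates = [0.1, 0.25, 0.5, 1.0]
    pre_grasp = plan_approach(
        target_estimate=0.0,
        candidates=candidates,
        visibility_fn=lambda p, t: -abs(p - t),  # closer = more visible
        safety_fn=lambda p: p >= 0.2,            # keep a safe standoff
    )
    print("pre-grasp pose:", pre_grasp)          # closest safe pose: 0.25

    # Toy execution phase: the validator rejects the top grasp,
    # so the side grasp is chosen.
    grasp = execute_with_feedback(
        sense_fn=lambda: 0.0,
        grasp_candidates_fn=lambda t: ["top", "side"],
        validate_fn=lambda g, t: g == "side",
    )
    print("selected grasp:", grasp)
```

The key design point mirrored here is that grasp selection happens inside the execution loop, after each sensor update, rather than once during offline planning.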
