Active Affordance Exploration for Robot Grasping

Robotic grasping in complex, unstructured warehouse environments remains a challenging task and has attracted considerable attention from the robot vision and machine learning communities. A popular strategy is to directly detect the graspable region for a specific end-effector, such as a suction cup, a two-fingered gripper, or a multi-fingered hand. However, such approaches usually depend on accurate object detection and precise pose estimation. Very recently, the affordance map, which describes the action possibilities that an environment offers, has begun to be used for grasping tasks. But it often fails in cluttered environments and degrades the efficiency of warehouse automation. In this paper, we establish an active exploration framework for robotic grasping and design a deep reinforcement learning method for it. To verify its effectiveness, we develop a new composite hand that combines a suction cup with fingers, and experimental validation on robotic grasping tasks shows the advantages of the active exploration method. This novel method significantly improves the grasping efficiency of warehouse manipulators.
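The paper does not include code; as a rough illustration of how an affordance-driven grasp policy might pick its next action, the sketch below selects an action location from a 2-D affordance map (per-pixel grasp scores) using an epsilon-greedy rule. The function name, the epsilon-greedy scheme, and the map representation are all hypothetical assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def select_grasp_action(affordance_map, epsilon=0.1, rng=None):
    """Pick a pixel (row, col) to act on from a 2-D affordance map.

    With probability epsilon, explore a random location; otherwise
    exploit the location with the highest predicted affordance score.
    """
    rng = rng if rng is not None else np.random.default_rng(0)
    h, w = affordance_map.shape
    if rng.random() < epsilon:
        # Exploration: sample a random location in the map.
        return int(rng.integers(h)), int(rng.integers(w))
    # Exploitation: take the argmax over the flattened map and
    # convert the flat index back to (row, col) coordinates.
    flat = int(np.argmax(affordance_map))
    return flat // w, flat % w

# Toy example: with epsilon=0, the greedy branch returns the argmax cell.
amap = np.array([[0.1, 0.9],
                 [0.3, 0.2]])
print(select_grasp_action(amap, epsilon=0.0))  # (0, 1)
```

In an active exploration loop, the chosen location would drive a physical action (e.g. a push or a suction attempt), after which the affordance map is re-estimated from new observations and the selection repeats.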
