Failure Handling of Robotic Pick and Place Tasks With Multimodal Cues Under Partial Object Occlusion

The success of a robotic pick-and-place task depends on every phase of the procedure: grasp planning, grasp establishment, lifting and moving, and finally releasing and placing. The ability to detect and recover from grasping failures throughout this process is therefore a critical requirement for both the manipulator and the gripper, especially since the gripper itself almost inevitably occludes the object during the task. With the rapid rise of soft grippers, which rely heavily on under-actuated bodies and compliant, open-loop control, less information is available from the gripper for effective overall system control. To improve the effectiveness of robotic grasping, this work proposes a hybrid policy that combines visual cues with gripper proprioception for failure detection and recovery, using a self-developed proprioceptive soft robotic gripper capable of contact sensing. Addressing failure handling in robotic pick-and-place tasks, we propose (1) more accurate pose estimation of a known object by combining an edge-based cost with the image-based cost; (2) robust object tracking that works even when the object is partially occluded, achieving a mean overlap precision of up to 80%; (3) detection of contact and contact loss between the object and the gripper by analyzing the gripper's internal pressure signals; and (4) robust failure handling that combines visual cues under partial occlusion with proprioceptive cues from the soft gripper to detect and recover from different accidental grasping failures. The proposed system was experimentally validated with the proprioceptive soft gripper mounted on a collaborative manipulator and observed by a consumer-grade RGB camera, showing that combining visual cues and proprioception effectively improves the detection of, and recovery from, the major grasping failures at different stages, enabling compliant and robust grasping.
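
The abstract does not detail how the edge-based and image-based costs are fused; the sketch below shows one plausible formulation in Python with OpenCV, where the edge term is a chamfer-style distance between the rendered model contour and a Canny edge map of the frame, and the image term is one minus the normalized cross-correlation of the rendering against the frame. The function name, weights, and Canny thresholds are illustrative assumptions, not the paper's values.

```python
import cv2
import numpy as np

def combined_pose_cost(frame_gray, rendered_gray, rendered_edge_pts,
                       w_edge=0.5, w_image=0.5):
    """Score a pose hypothesis with an image-based and an edge-based term.

    frame_gray        -- camera image (grayscale, uint8)
    rendered_gray     -- object rendered at the hypothesized pose, cropped
                         to a template no larger than the frame
    rendered_edge_pts -- Nx2 array of (x, y) edge pixels of the rendering,
                         assumed to lie inside the frame
    Lower cost means a better pose. Weights are illustrative.
    """
    # Image-based term: normalized cross-correlation between the rendering
    # and the frame (1 - NCC so that 0 is a perfect match).
    ncc = cv2.matchTemplate(frame_gray, rendered_gray,
                            cv2.TM_CCOEFF_NORMED).max()
    image_cost = 1.0 - float(ncc)

    # Edge-based term: chamfer distance from rendered edge pixels to the
    # nearest image edge, via a distance transform of the inverted Canny map.
    edges = cv2.Canny(frame_gray, 50, 150)
    dist = cv2.distanceTransform(255 - edges, cv2.DIST_L2, 3)
    xs = rendered_edge_pts[:, 0].astype(int)
    ys = rendered_edge_pts[:, 1].astype(int)
    edge_cost = float(dist[ys, xs].mean())

    return w_edge * edge_cost + w_image * image_cost
```

A pose refinement loop would evaluate this cost over perturbed pose hypotheses and keep the minimizer; the edge term rewards contour alignment that an appearance-only cost can miss under lighting changes.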
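For tracking under partial occlusion, correlation-filter trackers commonly flag unreliable frames with the peak-to-sidelobe ratio (PSR) of the response map; the sketch below illustrates that idea, with plain template matching standing in for a learned filter. The `track_step` function, the `PSR_MIN` threshold, and the 11x11 sidelobe exclusion window are hypothetical choices, not the paper's implementation.

```python
import cv2
import numpy as np

PSR_MIN = 8.0  # illustrative threshold; below this we assume occlusion

def track_step(frame_gray, template, search_window):
    """One tracking step with a peak-to-sidelobe ratio (PSR) occlusion check.

    search_window -- (x, y, w, h) region of frame_gray to search in.
    Returns (top_left_in_frame, occluded). The caller should only update
    the template when occluded is False, so a partially hidden object
    does not corrupt the appearance model.
    """
    x, y, w, h = search_window
    roi = frame_gray[y:y + h, x:x + w]
    resp = cv2.matchTemplate(roi, template, cv2.TM_CCOEFF_NORMED)

    _, peak, _, (px, py) = cv2.minMaxLoc(resp)

    # PSR: how much the peak stands out from the rest of the response map,
    # excluding an 11x11 window around the peak location.
    mask = np.ones_like(resp, dtype=bool)
    mask[max(py - 5, 0):py + 6, max(px - 5, 0):px + 6] = False
    sidelobe = resp[mask]
    psr = (peak - sidelobe.mean()) / (sidelobe.std() + 1e-6)

    occluded = psr < PSR_MIN
    return (x + px, y + py), occluded
```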
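A minimal sketch of pressure-based contact and contact-loss detection follows, assuming the controller can compare the measured internal pressure of the soft actuator against a free-space baseline for the same actuation command: a blocked actuator holds higher pressure than it would in free space, and a drop back toward baseline while the grasp command is unchanged suggests the object slipped out. The class name, thresholds, units, and debounce window are illustrative, not taken from the paper.

```python
from collections import deque

class ContactMonitor:
    """Detect contact and contact loss from an internal pressure signal."""

    def __init__(self, contact_delta=5.0, window=5):
        self.contact_delta = contact_delta   # kPa above baseline => contact
        self.history = deque(maxlen=window)  # short window to reject noise
        self.in_contact = False

    def update(self, pressure_kpa, baseline_kpa):
        """Feed one sample; returns 'contact', 'contact_lost', or 'no_change'."""
        self.history.append(pressure_kpa - baseline_kpa)
        mean_delta = sum(self.history) / len(self.history)

        if not self.in_contact and mean_delta > self.contact_delta:
            self.in_contact = True
            return "contact"
        # Hysteresis: require the deviation to fall well below the contact
        # threshold before declaring loss, to avoid chattering.
        if self.in_contact and mean_delta < 0.5 * self.contact_delta:
            self.in_contact = False
            return "contact_lost"
        return "no_change"
```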
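Fusing the visual and proprioceptive cues into a failure-handling policy can be organized as a per-phase check that routes the task into a recovery phase when a cue contradicts the expected state. The sketch below is one hypothetical arrangement; the phase names, cue field names (`object_visible`, `in_contact`, `pose_deviation`), and the slip threshold are invented for illustration.

```python
from enum import Enum, auto

class Phase(Enum):
    PLAN = auto()
    GRASP = auto()
    LIFT_MOVE = auto()
    PLACE = auto()
    RECOVER = auto()

def failure_policy(phase, cues):
    """Decide the next phase from fused cues (hypothetical field names).

    cues is assumed to carry:
      object_visible -- tracker confidence above threshold (visual cue)
      in_contact     -- pressure-based contact flag (proprioceptive cue)
      pose_deviation -- tracked vs. expected object pose error (meters)
    """
    if phase == Phase.GRASP and not cues["in_contact"]:
        return Phase.RECOVER      # grasp never established: re-plan
    if phase == Phase.LIFT_MOVE:
        if not cues["in_contact"]:
            return Phase.RECOVER  # contact lost mid-transport: object dropped
        if cues["object_visible"] and cues["pose_deviation"] > 0.03:
            return Phase.RECOVER  # object slipping inside the gripper
    if phase == Phase.PLACE and cues["in_contact"]:
        return Phase.RECOVER      # release failed: object still held
    return phase                  # no failure detected: continue
```

The point of the combination is complementarity: the pressure cue covers moments when the gripper fully occludes the object, while the visual cue catches in-hand slip that leaves contact intact.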
