GRASPA 1.0: GRASPA is a Robot Arm graSping Performance BenchmArk

The use of benchmarks is a widespread and scientifically meaningful practice to validate the performance of different approaches to the same task. In the context of robot grasping, the use of common object sets has emerged in recent years; however, no dominant protocols and metrics for testing grasping pipelines have taken root yet. In this letter, we present version 1.0 of GRASPA, a benchmark to test the effectiveness of grasping pipelines on physical robot setups. This approach tackles the complexity of such pipelines by proposing different metrics that account for the features and limits of the test platform. As an example application, we deploy GRASPA on the iCub humanoid robot and use it to benchmark our grasping pipeline. As closing remarks, we discuss how the resulting GRASPA indicators can provide insight into how the different steps of the pipeline affect overall grasping performance.
