CapriDB - Capture, Print, Innovate: A Low-Cost Pipeline and Database for Reproducible Manipulation Research

We present a novel approach and database that combine the inexpensive generation of 3D object models from monocular or RGB-D camera images with 3D printing and a state-of-the-art object tracking algorithm. Unlike recent efforts toward 3D object databases for robotics, our approach does not require expensive, controlled 3D scanning setups; it enables anyone with a camera to scan, print, and track complex objects for manipulation research. The approach yields highly detailed mesh models whose 3D printed replicas are at times difficult to distinguish from the originals. A key motivation for using 3D printed objects is the ability to precisely control and vary object properties, such as mass distribution and size, during the printing process, thereby obtaining reproducible conditions for robotic manipulation research. We present CapriDB, an extensible database resulting from this approach, initially containing 40 textured, 3D printable mesh models together with tracking features to facilitate adoption of the proposed approach.
