Classifying and sorting cluttered piles of unknown objects with robots: A learning approach

We consider the problem of sorting a densely cluttered pile of unknown objects using a robot. This still unsolved problem is relevant to robotic waste sorting. Extending previous active learning approaches to grasping, we present a system that learns the task autonomously. Instead of predicting only whether a grasp succeeds, we predict the classes of the objects that end up being picked and thrown onto the target conveyor. Segmenting and identifying objects on the uncluttered target conveyor is easier than in the working area because of the added structure: the thrown objects are the only ones present there. Instead of trying to segment or otherwise understand the cluttered working area, we simply let the controller learn a mapping from an RGBD image in the neighborhood of the grasp to a predicted result; all segmentation and other understanding of the working area is implicit in the learned function. Grasp selection operates in two stages: the first stage is hardcoded and outputs a distribution of possible grasps that sometimes succeed, and the second stage uses a purely learned criterion to choose the grasp to execute from the proposal distribution produced by the first stage. In an experiment, the system quickly learned to make good pickups, to predict correctly in advance which class of object it was going to pick up, and to sort the objects from a densely cluttered pile by color.
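The two-stage selection can be made concrete with a small sketch. The Python snippet below is a minimal illustration under stated assumptions, not the authors' implementation: the proposal heuristic, the patch size, and names such as propose_grasps, extract_patch, and GraspSelector are hypothetical, and scikit-learn's ExtraTreesClassifier stands in for whatever learned criterion is used. Stage one heuristically proposes candidate grasp points; stage two predicts, from the RGBD patch around each candidate, which object class (or failure) the grasp would yield, and picks the candidate matching the class currently being sorted.

```python
# Sketch of the two-stage grasp selection described in the abstract.
# Names and the proposal heuristic are illustrative assumptions, not the
# authors' implementation; ExtraTreesClassifier is one plausible learned model.
import numpy as np
from sklearn.ensemble import ExtraTreesClassifier

PATCH = 32  # side length (pixels) of the RGBD patch cropped around a grasp


def propose_grasps(depth, n=50, rng=None):
    """Stage 1 (hardcoded): sample candidate grasp centers near points that
    stick out of the pile, i.e. the highest values in the depth map."""
    rng = rng or np.random.default_rng()
    ys, xs = np.unravel_index(
        np.argsort(depth, axis=None)[-n * 4:], depth.shape)
    idx = rng.choice(len(xs), size=min(n, len(xs)), replace=False)
    return list(zip(xs[idx], ys[idx]))


def extract_patch(rgbd, x, y):
    """Crop a PATCH x PATCH window of the RGBD image around (x, y), zero-pad
    at the borders, and flatten it into a feature vector."""
    h = PATCH // 2
    win = rgbd[max(0, y - h):y + h, max(0, x - h):x + h, :]
    out = np.zeros((PATCH, PATCH, rgbd.shape[2]))
    out[:win.shape[0], :win.shape[1], :] = win
    return out.ravel()


class GraspSelector:
    """Stage 2 (learned): predict the class of the object a grasp would throw
    onto the target conveyor ("fail" when nothing arrives) and choose the
    proposal whose prediction best matches the class currently wanted."""

    def __init__(self):
        self.model = ExtraTreesClassifier(n_estimators=100)

    def fit(self, patches, outcomes):
        # outcomes are labels observed on the uncluttered target conveyor
        self.model.fit(np.stack(patches), outcomes)

    def select(self, rgbd, wanted_class):
        # rgbd is assumed to be an H x W x 4 array with depth in channel 3
        grasps = propose_grasps(rgbd[..., 3])
        feats = np.stack([extract_patch(rgbd, x, y) for x, y in grasps])
        probs = self.model.predict_proba(feats)
        col = list(self.model.classes_).index(wanted_class)
        return grasps[int(np.argmax(probs[:, col]))]
```

The training labels for the learned stage come from observing the uncluttered target conveyor after each throw, which is what makes the prediction target easy to supervise without any segmentation of the working area.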
