Action Selection for Interactive Object Segmentation in Clutter

Robots operating in human environments are often required to recognise, grasp and manipulate objects. Identifying the locations of objects amongst their complex surroundings is therefore an important capability. However, in unstructured and cluttered scenes, as are typical of indoor human environments, reliable and accurate object segmentation is not always possible because the scene representation is often incomplete or ambiguous. We overcome the limitations of static object segmentation by enabling a robot to interact directly with the scene through non-prehensile actions. Our method does not rely on object models to infer object existence; instead, interaction induces scene motion, which provides an additional cue for associating observed parts with the same object. We use a probabilistic segmentation framework to identify segmentation uncertainty, which in turn guides the robot as it manipulates the scene. The probabilistic segmentation is recursively updated with the motion cues and monitored during interaction, providing online feedback. Experiments on RGB-D data show that the additional motion information resolves segmentations that were otherwise ambiguous. We further show that our uncertainty-driven interaction approach maintains higher-quality segmentation than competing methods as clutter increases.
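To make the idea concrete, the sketch below illustrates the two ingredients the abstract describes: a recursive Bayesian update of the belief that two observed scene parts belong to the same object, driven by whether they move coherently after a push, and an uncertainty measure (binary entropy) for choosing which ambiguous part pair the robot should disambiguate next. All function names, likelihood values and the pairwise-belief representation are illustrative assumptions, not the paper's actual formulation.

```python
import math

def bayes_update(p_same, motion_agrees,
                 p_agree_given_same=0.9, p_agree_given_diff=0.2):
    """Recursively update the probability that two observed parts
    belong to the same object, given one motion observation.
    The likelihoods are placeholder values, not the paper's."""
    if motion_agrees:
        num = p_agree_given_same * p_same
        den = num + p_agree_given_diff * (1.0 - p_same)
    else:
        num = (1.0 - p_agree_given_same) * p_same
        den = num + (1.0 - p_agree_given_diff) * (1.0 - p_same)
    return num / den

def entropy(p):
    """Binary entropy: highest when a pairwise association is undecided."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1.0 - p) * math.log2(1.0 - p))

def most_uncertain_pair(pairwise):
    """Pick the part pair whose association is most ambiguous,
    i.e. the pair a push is most likely to disambiguate."""
    return max(pairwise, key=lambda pair: entropy(pairwise[pair]))

# Start undecided about two part pairs, then observe that parts
# (0, 1) moved coherently twice while (1, 2) moved apart once.
beliefs = {(0, 1): 0.5, (1, 2): 0.5}
beliefs[(0, 1)] = bayes_update(beliefs[(0, 1)], motion_agrees=True)
beliefs[(0, 1)] = bayes_update(beliefs[(0, 1)], motion_agrees=True)
beliefs[(1, 2)] = bayes_update(beliefs[(1, 2)], motion_agrees=False)
print(most_uncertain_pair(beliefs))  # → (1, 2)
```

After two coherent motions the (0, 1) belief is confidently "same object" (low entropy), while the single separating motion leaves (1, 2) closer to undecided, so an uncertainty-driven policy would direct the next push at that pair. This is the sense in which segmentation uncertainty provides online feedback for action selection.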
