Interactive Segmentation of Textured and Textureless Objects

This article describes interactive object segmentation for autonomous service robots operating in human living environments. The proposed system allows a robot to effectively segment textured and textureless objects in cluttered scenes by leveraging its manipulation capabilities. In this interactive perception approach, features from an RGB and depth (RGB-D) camera are tracked while the robot actively induces motion in the scene with its arm. The robot autonomously infers arm movements that effectively separate objects. The resulting tracked feature trajectories are assigned to their corresponding objects by clustering. In the final step, we reconstruct dense models of the objects from the previously clustered sparse RGB-D features. The approach is integrated with robotic grasping and is demonstrated on scenes consisting of various textured and textureless objects, showing the advantages of a tight integration between perception, cognition, and action.
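The core idea of assigning tracked feature trajectories to objects can be illustrated with a minimal sketch. The snippet below is an illustrative assumption, not the paper's actual algorithm: it links two features whenever their mutual distance stays (nearly) constant over time, since features on the same rigid object keep fixed relative positions while the robot pushes the scene, and then takes connected components as object clusters.

```python
import numpy as np

def cluster_trajectories(traj, tol=1e-3):
    """Group feature trajectories that move rigidly together.

    traj: (N, T, D) array of N tracked feature positions over T frames.
    Two features are linked if their pairwise distance stays (nearly)
    constant across frames; clusters are connected components of the
    resulting graph (union-find). This is a simplified stand-in for
    the trajectory clustering described in the article.
    """
    n = traj.shape[0]
    parent = list(range(n))

    def find(i):
        # Find the root of feature i with path compression.
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            # Distance between features i and j in every frame.
            d = np.linalg.norm(traj[i] - traj[j], axis=1)
            if d.std() < tol:  # rigidly co-moving -> same object
                parent[find(i)] = find(j)

    roots = [find(i) for i in range(n)]
    # Relabel roots to compact cluster ids 0..k-1.
    remap = {r: k for k, r in enumerate(dict.fromkeys(roots))}
    return [remap[r] for r in roots]
```

In practice the paper's clustering would have to tolerate tracking noise and non-rigid outliers; the `tol` threshold here is a placeholder for that robustness.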
