Object exploration using vision and active touch

Object exploration using passive vision and active touch has been under investigation for thirty years. We build on recent progress in biomimetic active touch that combines perception via Bayesian evidence accumulation with control of the tactile sensor based on the perceived stimulus location. Here, passive vision is combined with active touch by providing a visual prior for each perceptual decision, with the precision of this prior setting the relative contribution of the two modalities. Performance is examined on an edge-following task using a tactile fingertip (the TacTip) mounted on a robot arm. We find that the quality of exploration is a U-shaped function of the relative contribution of vision and touch; moreover, the multi-modal system is more robust, completing the contour in cases where touch alone fails. The overall system has several parallels with biological theories of perception, and thus plausibly represents a robot model of visuo-tactile exploration in humans.
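To make the fusion scheme concrete, the sketch below shows one plausible reading of it: a recursive Bayesian update over discrete edge-location hypotheses, seeded by a Gaussian visual prior whose width (inverse precision) sets how strongly vision constrains the tactile decision. All names, parameters, and noise models here are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

# Minimal sketch of Bayesian evidence accumulation with a visual prior.
# All constants (N_LOC, sigma values, threshold) are illustrative
# assumptions, not values from the paper.

N_LOC = 50
locations = np.linspace(-10.0, 10.0, N_LOC)  # candidate edge locations (mm)

def visual_prior(mu, sigma):
    """Gaussian prior over edge location from vision.
    Small sigma (high precision) -> vision dominates the decision;
    large sigma (low precision) -> touch dominates."""
    p = np.exp(-0.5 * ((locations - mu) / sigma) ** 2)
    return p / p.sum()

def tactile_likelihood(tap, sigma_touch=1.5):
    """Likelihood of a noisy tactile tap reading under each hypothesis."""
    l = np.exp(-0.5 * ((locations - tap) / sigma_touch) ** 2)
    return l / l.sum()

def accumulate(prior, taps, threshold=0.9):
    """Recursive Bayesian update over successive taps until one
    location hypothesis crosses the decision threshold."""
    posterior = prior.copy()
    for tap in taps:
        posterior *= tactile_likelihood(tap)
        posterior /= posterior.sum()
        if posterior.max() > threshold:  # evidence sufficient: decide
            break
    return locations[np.argmax(posterior)], posterior

# Example: vision suggests the edge is near +2 mm with moderate
# confidence, while noisy taps actually centre on +3 mm.
rng = np.random.default_rng(0)
taps = rng.normal(3.0, 1.5, size=20)
estimate, _ = accumulate(visual_prior(mu=2.0, sigma=2.0), taps)
print(f"perceived edge location: {estimate:.2f} mm")
```

With a tight visual prior (small sigma) the estimate stays close to the visual cue even under noisy taps; with a broad prior the tactile evidence dominates. This mirrors the precision-weighted trade-off between modalities described in the abstract.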
