Elastic Tactile Simulation Towards Tactile-Visual Perception

Tactile sensing plays an important role in robotic perception and manipulation tasks. To overcome the real-world limitations of data collection, simulating tactile responses in a virtual environment has emerged as a desirable direction in robotics research. In this paper, we propose Elastic Interaction of Particles (EIP) for tactile simulation, which is capable of reflecting the elastic property of the tactile sensor as well as characterizing the fine-grained physical interaction during contact. Specifically, EIP models the tactile sensor as a group of coordinated particles, and the elastic property is applied to regulate the deformation of the particles during contact. Building on the tactile simulation by EIP, we further propose a tactile-visual perception network that enables information fusion between tactile data and visual images. The perception network is based on a global-to-local fusion mechanism in which multi-scale tactile features are aggregated into the corresponding local region of the visual modality, guided by the tactile positions and directions. The fusion method demonstrates clear advantages on the 3D geometric reconstruction task. Our code for EIP is available at https://github.com/yikaiw/EIP.
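The global-to-local fusion idea described above can be illustrated with a minimal sketch: a tactile feature vector is aggregated only into the local region of the visual feature map around the projected contact position. All names here are hypothetical; the actual network operates at multiple scales with learned aggregation weights, which this sketch omits.

```python
import numpy as np

def fuse_tactile_into_visual(visual_feat, tactile_feat, contact_rc, window=3):
    """Aggregate a tactile feature vector into a local window of a visual
    feature map, centered at the contact position projected onto the image.

    visual_feat: (H, W, C) visual feature map
    tactile_feat: (C,) tactile feature vector
    contact_rc: (row, col) of the projected contact position
    window: side length of the local region receiving tactile information
    """
    H, W, _ = visual_feat.shape
    r, c = contact_rc
    half = window // 2
    # Clamp the window to the feature-map boundary.
    r0, r1 = max(0, r - half), min(H, r + half + 1)
    c0, c1 = max(0, c - half), min(W, c + half + 1)
    fused = visual_feat.copy()
    # Broadcast the tactile vector over the local spatial window only;
    # the rest of the visual feature map is left untouched (global context).
    fused[r0:r1, c0:c1, :] += tactile_feat
    return fused

# Example: fuse a tactile feature at image position (4, 4) of an 8x8 map.
vis = np.zeros((8, 8, 16))
tac = np.ones(16)
out = fuse_tactile_into_visual(vis, tac, (4, 4))
```

Here simple addition stands in for the fusion operator; the key point is that tactile information is injected locally, anchored by contact position, rather than mixed globally across the image.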
