Towards vision-based robotic skins: a data-driven, multi-camera tactile sensor

This paper describes the design of a multi-camera optical tactile sensor that provides information about the contact force distribution applied to its soft surface. This information is encoded in the motion of spherical particles spread within the surface, which deforms when subjected to force. Small embedded cameras capture images of the resulting particle patterns, which are then mapped to the three-dimensional contact force distribution through a machine learning architecture. The design proposed in this paper exhibits a larger contact surface and a thinner structure than most existing camera-based tactile sensors, without the use of additional reflecting components such as mirrors. A modular implementation of the learning architecture is discussed that facilitates scaling to larger surfaces, such as robotic skins.
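To make the described pipeline concrete, the following is a minimal sketch of the kind of per-camera mapping the abstract refers to: a small convolutional network that takes an image of the particle pattern and regresses a discretized three-dimensional force distribution, with one network instance per embedded camera to reflect the modular design. The use of PyTorch, all layer sizes, bin counts, and the naive tiling of per-camera outputs are illustrative assumptions, not the authors' implementation.

# Minimal sketch (assumptions, not the authors' method): one small CNN per camera
# maps a grayscale particle-pattern image to a discretized 3-D contact force
# distribution (Fx, Fy, Fz per surface cell).
import torch
import torch.nn as nn


class PerCameraForceNet(nn.Module):
    def __init__(self, bins: int = 20):
        super().__init__()
        self.bins = bins
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
        )
        # Regress three force components for each of bins x bins surface cells.
        self.head = nn.Linear(32 * 4 * 4, 3 * bins * bins)

    def forward(self, img: torch.Tensor) -> torch.Tensor:
        x = self.features(img)
        x = self.head(x.flatten(1))
        return x.view(-1, 3, self.bins, self.bins)


# Modular use: one network per embedded camera; per-camera force maps are tiled
# side by side to cover a larger surface (the fusion scheme here is an assumption).
cameras = [PerCameraForceNet() for _ in range(4)]
frames = [torch.rand(1, 1, 64, 64) for _ in cameras]   # placeholder camera images
patches = [net(f) for net, f in zip(cameras, frames)]  # per-camera force estimates
skin_estimate = torch.cat(patches, dim=-1)              # naive side-by-side tiling
print(skin_estimate.shape)                              # torch.Size([1, 3, 20, 80])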
