ShadowSense

This paper proposes and evaluates the use of image classification for detailed, full-body human-robot tactile interaction. A camera positioned below a translucent robot skin captures the shadows cast by human touch, and social gestures are inferred from the captured images. This approach enables rich tactile interaction with robots without the sensor arrays used in traditional social-robot tactile skins. It also supports touch interaction with non-rigid robots, achieves high-resolution sensing on surfaces of varying size and shape, and removes the requirement of direct contact with the robot. We demonstrate the idea with an inflatable robot and a standalone testing device, an algorithm that recognizes touch gestures from shadows using Densely Connected Convolutional Networks, and an algorithm that tracks the positions of touch and hovering shadows. Our experiments show that the system distinguishes six touch gestures under three lighting conditions with 87.5–96.0% accuracy, depending on the lighting, and accurately tracks touch positions and infers motion activities in realistic interaction conditions. Additional applications of this method include interactive screens on inflatable robots and privacy-maintaining robots for the home.
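The shadow-tracking idea described above can be illustrated with a minimal sketch: dark regions in a camera frame are segmented by intensity thresholding, and the centroid of the shadow blob gives an estimated touch (or hover) position. This is a simplified, hypothetical illustration, not the paper's actual pipeline; the `track_shadow` function and the threshold value are assumptions for the example.

```python
import numpy as np

def track_shadow(frame, thresh=60):
    """Return the (row, col) centroid of the dark (shadow) region in a
    grayscale frame, or None if no pixel falls below the threshold."""
    mask = frame < thresh          # shadow pixels are darker than the lit skin
    if not mask.any():
        return None                # no shadow: nothing touching or hovering
    rows, cols = np.nonzero(mask)
    return float(rows.mean()), float(cols.mean())

# Synthetic 100x100 frame: bright translucent skin with one dark blob
frame = np.full((100, 100), 200, dtype=np.uint8)
frame[40:50, 60:70] = 20           # simulated touch shadow
print(track_shadow(frame))         # → (44.5, 64.5)
```

In the full system, cropped shadow images would additionally be fed to a DenseNet classifier to label the gesture; this sketch covers only the position-tracking step.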
