Detecting and Classifying Human Touches in a Social Robot Through Acoustic Sensing and Machine Learning

An important aspect of Human–Robot Interaction is responding to different kinds of touch stimuli. To date, several technologies have been explored to determine how a touch is perceived by a social robot, usually by placing a large number of sensors throughout the robot’s shell. In this work, we introduce a novel approach in which the audio acquired from contact microphones located in the robot’s shell is processed using machine learning techniques to distinguish between different types of touches. The system is able to determine when the robot is touched (touch detection) and to ascertain the kind of touch performed among a set of possibilities: stroke, tap, slap, and tickle (touch classification). This proposal is cost-effective: a few microphones can cover the whole robot’s shell, since a single microphone is enough to cover each solid part of the robot. Moreover, it is easy to install and configure, as it only requires a contact surface to attach the microphone to the robot’s shell and a connection to the robot’s computer. Results show high accuracy in touch gesture recognition. The testing phase revealed that Logistic Model Trees achieved the best performance, with an F-score of 0.81. The dataset was built with information from 25 participants performing a total of 1981 touch gestures.
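The two-stage pipeline described above (touch detection followed by gesture classification) can be sketched as follows. This is a minimal, hypothetical illustration, not the paper's actual implementation: the feature set (frame energy and spectral centroid) and the energy threshold are assumptions made for the example, and the classifier stage is only stubbed.

```python
import numpy as np

def extract_features(frame: np.ndarray) -> np.ndarray:
    """Toy spectral features from one audio frame captured by a contact
    microphone. Hypothetical feature set, chosen only for illustration."""
    spectrum = np.abs(np.fft.rfft(frame))
    energy = float(np.sum(frame ** 2))
    centroid = float(np.sum(np.arange(len(spectrum)) * spectrum)
                     / (np.sum(spectrum) + 1e-12))
    return np.array([energy, centroid, float(spectrum.max())])

def detect_touch(frame: np.ndarray, energy_threshold: float = 0.01) -> bool:
    """Stage 1 (touch detection): flag frames whose energy exceeds a
    threshold. The threshold value here is an assumed placeholder."""
    return float(np.sum(frame ** 2)) > energy_threshold

# Stage 2 (touch classification) would feed extract_features(frame) into a
# trained classifier (e.g. a Logistic Model Tree, as in the paper) that maps
# features to one of: stroke, tap, slap, tickle.

# Usage: a loud synthetic frame is detected as a touch; near-silence is not.
rng = np.random.default_rng(0)
loud = 0.5 * np.sin(2 * np.pi * 440 * np.arange(1024) / 44100)
quiet = 1e-4 * rng.standard_normal(1024)
print(detect_touch(loud), detect_touch(quiet))
```

Splitting detection from classification keeps the classifier's job small: it only runs on frames already known to contain a contact event.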
