Gentle Versus Strong Touch Classification: Preliminary Results, Challenges, and Potentials

Touch plays a crucial role in human nonverbal social and affective communication. It is therefore no surprise that considerable effort has been devoted to devising methodologies for automated touch classification. Such a capability enables the use of smart touch sensors in real-life application domains such as socially assistive robots and embodied telecommunication. The touch classification literature has made undeniable progress. However, existing results are limited in two important ways. First, they are mostly based on the overall (i.e., average) accuracy of different classifiers, and therefore offer little insight into how these approaches perform on each type of touch. Second, they do not consider the same type of touch at different levels of strength (e.g., gentle versus strong touch). This is an important factor that deserves investigation, since the intensity of a touch can utterly transform its meaning (e.g., from an affectionate gesture to a sign of punishment). The current study provides a preliminary investigation of these shortcomings by examining the accuracy of a number of classifiers for both within-touch (i.e., the same type of touch at differing strengths) and between-touch (i.e., different types of touch) classification. Our results help verify the strengths and shortcomings of different machine learning algorithms for touch classification. They also highlight challenges whose solutions can pave the path for the integration of touch sensors in application domains such as human–robot interaction (HRI).
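The point about overall accuracy masking per-touch-type performance can be illustrated with a minimal sketch. The labels and values below are hypothetical, not data from the study; the example only shows why per-class accuracy (recall) should be reported alongside the average:

```python
from collections import defaultdict

def per_class_accuracy(y_true, y_pred):
    """Fraction of each touch class that was correctly classified (per-class recall)."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for t, p in zip(y_true, y_pred):
        total[t] += 1
        if t == p:
            correct[t] += 1
    return {label: correct[label] / total[label] for label in total}

# Hypothetical predictions over an imbalanced set of touch instances.
y_true = ["gentle", "gentle", "gentle", "strong", "strong", "gentle"]
y_pred = ["gentle", "gentle", "gentle", "gentle", "strong", "gentle"]

overall = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
print(overall)                          # 5/6 ≈ 0.83: looks strong on average
print(per_class_accuracy(y_true, y_pred))  # {'gentle': 1.0, 'strong': 0.5}
```

Here the average accuracy of roughly 0.83 hides the fact that half of the "strong" touches are misread as "gentle", which is precisely the kind of within-touch confusion the study argues standard reporting overlooks.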
