Touch Challenge '15: Recognizing Social Touch Gestures

Advances in the field of touch recognition could open up applications for touch-based interaction in areas such as Human-Robot Interaction (HRI). We presented this challenge to the research community working on multimodal interaction with the goal of sparking interest in the touch modality and promoting exploration of data processing techniques from other, more mature modalities for touch recognition. Two data sets were made available containing labeled pressure-sensor recordings of social touch gestures performed by touching a touch-sensitive surface with the hand. Both sets were collected from similar sensor grids, but under conditions reflecting different application orientations: the Corpus of Social Touch (CoST) and the Human-Animal Affective Robot Touch (HAART) gesture set. In this paper we describe the challenge protocol and summarize the results of the touch challenge hosted in conjunction with the 2015 ACM International Conference on Multimodal Interaction (ICMI). The most important outcomes of the challenge were: (1) transferring techniques from other modalities, such as image processing, speech, and human action recognition, provided valuable feature sets; and (2) the gesture classes that were confused with one another were largely the same across the various data processing methods used.
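To make outcome (1) concrete, the sketch below illustrates the general shape of such a pipeline: summary statistics computed over a sequence of pressure frames, fed to a random-forest classifier. This is a minimal sketch, not the challenge's official baseline or any entrant's method; the 8x8 grid size, the synthetic gesture data, and the 14-class label set (the number of gesture classes in CoST) are assumptions made for illustration, using NumPy and scikit-learn.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def gesture_features(frames):
    """Summarize one gesture (array of shape [T, rows, cols]) into a
    fixed-length vector of simple intensity, duration, and layout statistics."""
    per_frame_sum = frames.sum(axis=(1, 2))      # total pressure over time
    per_cell_mean = frames.mean(axis=0).ravel()  # time-averaged "pressure image"
    return np.concatenate([
        [per_frame_sum.mean(), per_frame_sum.max(),
         per_frame_sum.std(), frames.shape[0]],  # intensity and duration cues
        per_cell_mean,                           # coarse spatial distribution
    ])

# Synthetic stand-in for real challenge data: 200 variable-length gestures
# on an assumed 8x8 pressure grid, with 14 classes as in CoST.
rng = np.random.default_rng(0)
gestures = [rng.random((rng.integers(20, 120), 8, 8)) for _ in range(200)]
labels = rng.integers(0, 14, size=200)

X = np.stack([gesture_features(g) for g in gestures])
scores = cross_val_score(RandomForestClassifier(n_estimators=100), X, labels, cv=5)
print("cross-validated accuracy:", scores.mean())
```

Treating the time-averaged pressure frames as an image, as in the per-cell means above, is one simple example of the cross-modal feature transfer the challenge highlighted; actual submissions used considerably richer descriptors.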
