Different Strokes and Different Folks: Economical Dynamic Surface Sensing and Affect-Related Touch Recognition

Social touch is an essential non-verbal channel whose great interactive potential can be realized only if gestures performed on inviting surfaces can be recognized. To assess the impact of sensor motion, substrate, and coverings on recognition performance, we collected gesture data from a low-cost, multitouch fabric pressure-location sensor while varying these factors. For six gestures most relevant in a haptic social-robot context, plus a no-touch control, we conducted two studies, with the sensor (1) stationary, varying substrate and cover (n=10); and (2) attached to a robot under a fur covering, flexing or stationary (n=16). For a stationary sensor, a random-forest model achieved 90.0% recognition accuracy (chance 14.2%) when trained on all data, and up to 94.6% (mean 89.1%) when trained and tested on the same individual. A curved, flexing surface yielded 79.4% accuracy overall, but averaged 85.7% when trained and tested on the same individual. These results suggest that, under realistic conditions, recognition with this type of flexible sensor is sufficient for many applications of interactive social touch. We further found evidence that users exhibit an idiosyncratic 'touch signature', with the potential to identify the toucher. Both findings enable varied contexts of affective or functional touch communication, from physically interactive robots to any touch-sensitive object.
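The pooled-versus-per-individual comparison described above can be sketched as follows. This is an illustrative reconstruction only: the synthetic feature vectors, the per-participant "signature" offsets, and the scikit-learn random forest stand in for the paper's actual sensor data, feature extraction, and classifier pipeline, none of which are specified here.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

GESTURES = 7          # six gestures plus a no-touch control
PARTICIPANTS = 10     # matches the stationary-sensor study (n=10)
SAMPLES_PER_CLASS = 40
N_FEATURES = 12       # hypothetical per-window pressure/location statistics

# Synthetic stand-in for gesture feature vectors. Each participant gets a
# small idiosyncratic offset, mimicking the 'touch signature' effect.
centers = rng.normal(0.0, 2.0, (GESTURES, N_FEATURES))
X, y, pid = [], [], []
for p in range(PARTICIPANTS):
    signature = rng.normal(0.0, 0.5, N_FEATURES)
    for g in range(GESTURES):
        feats = centers[g] + signature + rng.normal(
            0.0, 0.7, (SAMPLES_PER_CLASS, N_FEATURES))
        X.append(feats)
        y += [g] * SAMPLES_PER_CLASS
        pid += [p] * SAMPLES_PER_CLASS
X, y, pid = np.vstack(X), np.array(y), np.array(pid)

# Pooled model: one classifier trained on all participants' data.
Xtr, Xte, ytr, yte = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)
pooled_acc = RandomForestClassifier(
    n_estimators=100, random_state=0).fit(Xtr, ytr).score(Xte, yte)

# Per-individual models: train and test within one participant's own data.
per_user = []
for p in range(PARTICIPANTS):
    m = pid == p
    Xtr, Xte, ytr, yte = train_test_split(
        X[m], y[m], test_size=0.3, stratify=y[m], random_state=0)
    per_user.append(RandomForestClassifier(
        n_estimators=100, random_state=0).fit(Xtr, ytr).score(Xte, yte))

print(f"pooled accuracy: {pooled_acc:.2f} "
      f"(chance {1 / GESTURES:.3f}), "
      f"per-user mean: {np.mean(per_user):.2f}")
```

On this easy synthetic data both conditions score far above the 1/7 chance level; the point of the sketch is the evaluation structure (pooled versus within-individual training), not the numbers.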