Conductive fur sensing for a gesture-aware furry robot

Recent advances in artificial intelligence suggest that machines will soon communicate in ways previously considered out of their reach. For example, humans engage in sophisticated emotional communication through the language of touch. What technical capabilities would enable computers to do the same? As our group examines this question in the context of emotional touch between a person and a furry social robot, we require sensors that can detect and recognize subtle, nuanced touches. To this end, we demonstrate a new type of sensor based on conductive fur, which is sensitive to movements invisible to conventional pressure sensors. The sensor captures motion by measuring the changing current as the fur's conductive threads connect and disconnect during touch interaction. We then use machine learning to classify gestures from this time series. An informal evaluation with seven participants achieved 82% recognition accuracy on a 3-gesture set, showing promise for this approach to gesture recognition and opening a path to emotionally intelligent touch sensing.
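The pipeline described above — a one-dimensional current signal summarized into features, then classified into a small gesture set — can be sketched in miniature. Everything below is illustrative: the synthetic signals, the two features (mean absolute value and mean absolute first difference), and the nearest-centroid classifier are assumptions standing in for the paper's actual sensor data and learning method.

```python
# Hypothetical sketch of gesture classification from a conductive-fur
# current signal. Signals, features, and classifier are illustrative
# assumptions, not the authors' pipeline.
import math
import random

random.seed(0)

def synth_signal(gesture, n=100):
    """Generate a toy current time series for a named gesture."""
    if gesture == "stroke":    # slow, smooth oscillation
        return [math.sin(2 * math.pi * t / n) + random.gauss(0, 0.05)
                for t in range(n)]
    if gesture == "scratch":   # fast, jittery oscillation
        return [math.sin(2 * math.pi * 8 * t / n) + random.gauss(0, 0.2)
                for t in range(n)]
    # "touch": near-flat, low-amplitude noise
    return [random.gauss(0, 0.02) for _ in range(n)]

def features(x):
    """Two summary features: mean |value| and mean |first difference|."""
    mav = sum(abs(v) for v in x) / len(x)
    mad = sum(abs(b - a) for a, b in zip(x, x[1:])) / (len(x) - 1)
    return (mav, mad)

def fit_centroids(train):
    """Average the feature vectors of each gesture's training examples."""
    cents = {}
    for label, examples in train.items():
        feats = [features(e) for e in examples]
        cents[label] = tuple(sum(f[i] for f in feats) / len(feats)
                             for i in range(2))
    return cents

def classify(cents, x):
    """Assign the gesture whose centroid is nearest in feature space."""
    f = features(x)
    return min(cents, key=lambda lab: sum((a - b) ** 2
                                          for a, b in zip(cents[lab], f)))

gestures = ["stroke", "scratch", "touch"]
train = {g: [synth_signal(g) for _ in range(5)] for g in gestures}
cents = fit_centroids(train)
hits = [classify(cents, synth_signal(g)) == g
        for g in gestures for _ in range(10)]
acc = sum(hits) / len(hits)
print(f"synthetic accuracy: {acc:.2f}")
```

On these synthetic signals the three gestures separate cleanly in the two-feature space (a flat touch has low amplitude, a stroke has high amplitude but a slowly changing signal, a scratch changes rapidly), which is the same intuition the fur sensor exploits: different touch gestures produce distinguishable temporal patterns of connection and disconnection.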
