A Survey on Media Interaction in Social Robotics

Social robots have attracted increasing research interest in both the academic and industrial communities. Emerging media technologies have greatly inspired human-robot interaction approaches that aim to tackle important challenges in practical applications. This paper presents a survey of recent work on media interaction in social robotics. We first introduce state-of-the-art social robots and related concepts. We then review visual interaction approaches based on human actions such as facial expressions, hand gestures, and body motion, which are widely regarded as effective means of interacting with robots. Furthermore, we summarize event detection approaches, which are crucial for robots to understand the environment and human intentions. While the emphasis is on vision-based interaction approaches, multimodal interaction work is also briefly summarized for practitioners.
