The Effect of Embodiment in Sign Language Tutoring with Assistive Humanoid Robots

This paper presents interactive games for sign language tutoring assisted by humanoid robots, designed specifically for children with communication impairments. In this study, different robot platforms, a Nao H25 and a Robovie R3 humanoid robot, are used to express a chosen set of signs in Turkish Sign Language through hand and arm movements. Two games are designed, one involving a physically embodied robot and one involving a virtually embodied robot. In the game involving the physically embodied robot, the robot communicates with the participant by recognizing colored flashcards through a camera-based system and, in return, generates a selected subset of signs accompanied by motivational facial gestures. A mobile version of the game is also implemented for use in children's education and therapy for the purpose of teaching signs. In both games, the humanoid robot acts as a social peer and assistant: it motivates the child, teaches a selected set of signs, evaluates the child's effort, and gives appropriate feedback to improve children's learning and recognition rates. This paper presents results from a preliminary study with different test groups, in which children played with the physical robot platform (R3) and with a mobile game incorporating videos of the robot performing the signs; the effect of the assistive robot's embodiment is thus analyzed within these games. The results indicate that physical embodiment plays a significant role in improving children's performance, engagement, and motivation.
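The flashcard-based interaction described above can be illustrated with a minimal sketch of color classification on a cropped card image. This is a hypothetical illustration, not the authors' actual implementation: the card colors, BGR reference values, and nearest-mean rule are all assumptions introduced here for clarity.

```python
import numpy as np

# Illustrative BGR reference colors for the flashcards (assumed, not
# taken from the paper's implementation).
CARD_COLORS = {
    "red":    np.array([0, 0, 255]),
    "green":  np.array([0, 255, 0]),
    "blue":   np.array([255, 0, 0]),
    "yellow": np.array([0, 255, 255]),
}

def classify_card(image: np.ndarray) -> str:
    """Return the card color whose BGR reference is nearest (Euclidean
    distance) to the mean pixel of a pre-cropped card region."""
    mean_bgr = image.reshape(-1, 3).mean(axis=0)
    distances = {name: np.linalg.norm(mean_bgr - ref)
                 for name, ref in CARD_COLORS.items()}
    return min(distances, key=distances.get)

# Example: a synthetic 10x10 mostly-blue card crop.
card = np.full((10, 10, 3), (230, 20, 20), dtype=np.uint8)
print(classify_card(card))  # prints "blue"
```

In a live system, the card region would first be segmented from the camera frame (e.g. by thresholding or contour detection), and the recognized color would then trigger the robot's corresponding sign.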
