Assisted robot navigation based on speech recognition and synthesis

Interactive robots can help people with or without disabilities. In this context, research has been conducted to help children with motor disabilities explore the world around them, which is important for their cognitive development. However, most of these initiatives lack natural and intuitive interfaces, or are prohibitively expensive to adopt on a larger scale. This paper describes an experimental environment that uses speech recognition and synthesis to improve human-robot interaction (HRI) with children. The main goal of the proposed system is to support activities with physically disabled children, although it can also be used with other children. To that end, the environment is implemented with robots that are attractive, small-sized, and relatively low-cost. The system recognizes a set of simple spoken commands, which enables human-assisted navigation.
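The sketch below illustrates the kind of command pipeline such a system could use: a small vocabulary of spoken commands mapped to navigation primitives, with a synthesized confirmation spoken back to the child. It is a minimal illustration only, assuming the open-source Python packages SpeechRecognition and pyttsx3 and a hypothetical Robot motion interface; the command vocabulary, recognizer, and robot API in the actual system may differ.

    # Illustrative sketch only: speech-driven, human-assisted navigation.
    # Assumes "pip install SpeechRecognition pyaudio pyttsx3" and a
    # hypothetical Robot class standing in for the real robot's motion API.
    import speech_recognition as sr
    import pyttsx3


    class Robot:
        """Hypothetical stand-in for the robot's motion interface."""
        def execute(self, action: str) -> None:
            print(f"robot action: {action}")


    # Small vocabulary of simple speech commands -> navigation primitives.
    COMMANDS = {
        "forward": "move_forward",
        "back": "move_backward",
        "left": "turn_left",
        "right": "turn_right",
        "stop": "stop",
    }


    def listen_and_drive(robot: Robot) -> None:
        recognizer = sr.Recognizer()
        tts = pyttsx3.init()  # speech synthesis for spoken feedback
        with sr.Microphone() as source:
            recognizer.adjust_for_ambient_noise(source)  # calibrate to room noise
            while True:
                audio = recognizer.listen(source, phrase_time_limit=3)
                try:
                    text = recognizer.recognize_google(audio, language="en-US").lower()
                except sr.UnknownValueError:
                    continue  # speech not understood; keep listening
                for word, action in COMMANDS.items():
                    if word in text:
                        robot.execute(action)
                        tts.say(f"okay, {word}")  # confirm the command aloud
                        tts.runAndWait()
                        break


    if __name__ == "__main__":
        listen_and_drive(Robot())

In practice the keyword-spotting loop above would be replaced by whatever grammar-constrained recognizer the deployed system uses, but the overall flow (recognize, map to a motion primitive, confirm by synthesis) is the same.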
