Intuitive multimodal interaction for service robots

Domestic service tasks require three main skills from autonomous robots: robust navigation in indoor environments, flexible object manipulation, and intuitive communication with users. In this report, we present the communication skills of our anthropomorphic service and communication robots Dynamaid and Robotinho. Both robots are equipped with an intuitive multimodal communication system, including speech synthesis and recognition, gestures, and facial expressions. We evaluate our systems in the RoboCup@Home league and in a museum tour guide scenario.
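To make the idea of a multimodal output channel concrete, the following minimal Python sketch shows one way such an interface could be organized: a single utterance bundles text for speech synthesis with an optional gesture and a facial expression, and a small manager dispatches it to the three channels. All class, method, and parameter names here are illustrative assumptions and are not taken from the systems described above.

```python
# Hypothetical sketch: coordinating speech, gesture, and facial expression for a
# single robot utterance. Names are illustrative assumptions, not the interfaces
# used on Dynamaid or Robotinho.
from dataclasses import dataclass
from typing import Callable, Optional


@dataclass
class Utterance:
    text: str                       # sentence sent to the speech synthesizer
    gesture: Optional[str] = None   # symbolic gesture, e.g. "point_left"
    expression: str = "neutral"     # facial expression shown while speaking


class CommunicationManager:
    """Dispatches one utterance to the speech, gesture, and face channels."""

    def __init__(self,
                 speak: Callable[[str], None],
                 perform_gesture: Callable[[str], None],
                 set_expression: Callable[[str], None]) -> None:
        # The three callables stand in for the robot's actual output drivers.
        self._speak = speak
        self._perform_gesture = perform_gesture
        self._set_expression = set_expression

    def say(self, u: Utterance) -> None:
        # Show the expression first so it is visible while speaking,
        # trigger the gesture, then synthesize the sentence.
        self._set_expression(u.expression)
        if u.gesture is not None:
            self._perform_gesture(u.gesture)
        self._speak(u.text)


if __name__ == "__main__":
    # Console stand-ins for the real synthesizer and motion interfaces.
    manager = CommunicationManager(
        speak=lambda t: print(f"[speech]  {t}"),
        perform_gesture=lambda g: print(f"[gesture] {g}"),
        set_expression=lambda e: print(f"[face]    {e}"),
    )
    manager.say(Utterance("The kitchen is this way.",
                          gesture="point_left",
                          expression="happy"))
```

Hiding the output drivers behind plain callables keeps the sketch independent of any particular robot middleware; in a real system these would be bound to the speech synthesizer and motion controllers.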
