Impacts of Multimodal Feedback on Efficiency of Proactive Information Retrieval from Task-Related HRI

This work is a first step towards integrating multimodality with the aim of making efficient use of both human-like and non-human-like feedback modalities to optimize proactive information retrieval from task-related Human-Robot Interaction (HRI) in human environments. The presented approach combines the human-like modalities of speech and emotional facial mimicry with non-human-like modalities: a screen displaying the robot's retrieved knowledge to the human, and a pointer mounted above the robot's head for indicating directions and referring to objects in shared visual space as an equivalent to arm and hand gestures. First, pre-interaction feedback is explored in an experiment investigating different approach behaviors, in order to find socially acceptable trajectories that increase the success of interactions and thus the efficiency of information retrieval. Second, pre-evaluated human-like modalities are introduced. First results of a multimodal feedback study are presented in the context of the IURO project, in which a robot asks for directions to a predefined goal location.
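
To make the combination of feedback channels concrete, the sketch below models the four modalities described above as a simple selection rule: speech, facial mimicry, and the screen accompany every utterance, while the pointer is added only when the utterance refers to a direction or object in shared visual space. This is a minimal illustration under those assumptions; the names FeedbackModality, DialogAct, and select_feedback are hypothetical and not part of the IURO software.

```python
from dataclasses import dataclass
from enum import Enum, auto

class FeedbackModality(Enum):
    # Human-like channels
    SPEECH = auto()
    FACIAL_MIMICRY = auto()
    # Non-human-like channels
    SCREEN = auto()    # displays the robot's retrieved knowledge
    POINTER = auto()   # head-mounted pointer replacing arm/hand gestures

@dataclass
class DialogAct:
    """A single robot utterance plus a flag for spatial reference."""
    text: str
    refers_to_location: bool = False  # e.g. "Is the station in that direction?"

def select_feedback(act: DialogAct) -> set[FeedbackModality]:
    """Choose which feedback channels to activate for a dialog act.

    Speech, facial mimicry, and the screen are always used; the pointer is
    activated only for utterances that refer to shared visual space.
    """
    channels = {FeedbackModality.SPEECH,
                FeedbackModality.FACIAL_MIMICRY,
                FeedbackModality.SCREEN}
    if act.refers_to_location:
        channels.add(FeedbackModality.POINTER)
    return channels

if __name__ == "__main__":
    act = DialogAct("Is the main station in that direction?", refers_to_location=True)
    print(sorted(m.name for m in select_feedback(act)))
```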
