Can Using Pointing Gestures Encourage Children to Ask Questions?

Even though asking questions is fundamental to self-motivated learning, children often have difficulty verbalizing them. We therefore hypothesized that a robot's capability to perceive pointing gestures would encourage children to ask more questions. We tested this hypothesis experimentally using the Wizard-of-Oz technique with 92 elementary-school students, each of whom interacted with our robot as it played the role of a guide explaining a museum exhibit. The children asked the robot significantly more questions when it could perceive pointing gestures than when it lacked that capability. Based on the findings of our Wizard-of-Oz study, we also discuss the feasibility of implementing this behavior in autonomous robots.
