Uncertainty in Robot-Assisted Second Language Conversation Practice

Moments of uncertainty are common for learners practicing a second language. Managing these moments appropriately can help prevent frustration and improve the learner's experience, so detecting uncertainty is crucial in language practice conversations. In this study, experimental conversations between adult second language learners and a social robot are used to visually characterize the learners' uncertainty. The robot's output is manipulated at the prosodic and lexical levels to provoke uncertainty during the conversation, and video of the learners' reactions is processed to extract Facial Action Units (AUs) and gaze features. Preliminary results show distinctive behavioral patterns of uncertainty across participants. Based on these results, a new annotation scheme is proposed, which will expand the data used to train sequential models to detect uncertainty. In future work, the robotic conversational partner will use this information to adapt its dialogue generation and language complexity.
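To make the feature extraction step concrete, the sketch below shows one way the per-frame facial and gaze signals could be assembled into fixed-length sequences for a sequential uncertainty classifier. It assumes the recordings have already been processed with an OpenFace-style facial behavior toolkit whose CSV output follows the AUxx_r intensity and gaze_angle_x/y column conventions; the file paths, window length, and frame rate are illustrative assumptions rather than values from the study.

```python
# Minimal sketch: build per-frame AU + gaze feature sequences for a
# sequential uncertainty classifier. Column names follow OpenFace 2.0
# CSV conventions (AUxx_r intensities, gaze_angle_x/y); the paths,
# window length, and frame rate are assumptions, not study parameters.
import glob
import numpy as np
import pandas as pd

WINDOW = 90  # ~3 s at an assumed 30 fps

def load_features(csv_path: str) -> np.ndarray:
    df = pd.read_csv(csv_path)
    df.columns = [c.strip() for c in df.columns]   # OpenFace pads headers with spaces
    au_cols = [c for c in df.columns if c.startswith("AU") and c.endswith("_r")]
    gaze_cols = ["gaze_angle_x", "gaze_angle_y"]
    df = df[df["success"] == 1]                    # keep only frames where tracking succeeded
    return df[au_cols + gaze_cols].to_numpy(dtype=np.float32)

def to_windows(feats: np.ndarray, window: int = WINDOW) -> np.ndarray:
    # Slice the frame-level features into non-overlapping fixed-length segments.
    n = len(feats) // window
    return feats[: n * window].reshape(n, window, feats.shape[1])

if __name__ == "__main__":
    segments = [to_windows(load_features(p)) for p in glob.glob("openface_output/*.csv")]
    X = np.concatenate(segments) if segments else np.empty((0, WINDOW, 0))
    print(X.shape)  # (segments, frames, features): input tensor for an LSTM-style sequence model
```

Each windowed segment would then be paired with an uncertainty label from the proposed annotation scheme before training the sequence model.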
