Artificial Intelligence in Education

Recent years have seen a growing recognition of the central role of affect and motivation in learning. In particular, nonverbal behaviors such as posture and gesture provide key channels signaling affective and motivational states. Developing a clear understanding of these mechanisms will inform the development of personalized learning environments that promote successful affective and motivational outcomes. This paper investigates posture and gesture in computer-mediated tutorial dialogue using automated techniques to track posture and hand-to-face gestures. Annotated dialogue transcripts were analyzed to identify the relationships between student posture, student gesture, and tutor and student dialogue. The results indicate that posture and hand-to-face gestures are significantly associated with particular tutorial dialogue moves. Additionally, two-hands-to-face gestures occurred significantly more frequently among students with low self-efficacy. The results shed light on the cognitive-affective mechanisms that underlie these nonverbal behaviors. Collectively, the findings provide insight into the interdependencies among tutorial dialogue, posture, and gesture, revealing a new avenue for automated tracking of embodied affect during learning.
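To make the idea of automated posture tracking concrete, the sketch below shows one common approach (not necessarily the one used in this paper): classifying a student's lean from depth-sensor head-distance readings relative to a calibrated neutral baseline. The function names and threshold values are hypothetical, chosen only for illustration.

```python
# Illustrative sketch, assuming depth-camera distance readings (in mm)
# from the sensor to the student's head; all names and thresholds are
# hypothetical, not taken from the paper under discussion.

def classify_posture(distance_mm, baseline_mm, threshold_mm=50):
    """Label one reading as leaning forward, neutral, or leaning back."""
    delta = distance_mm - baseline_mm
    if delta < -threshold_mm:
        return "forward"   # noticeably closer to the sensor than baseline
    if delta > threshold_mm:
        return "back"      # noticeably farther from the sensor than baseline
    return "neutral"

def posture_sequence(readings_mm, baseline_mm):
    """Classify a time series of distance readings."""
    return [classify_posture(r, baseline_mm) for r in readings_mm]
```

For example, with a calibrated baseline of 700 mm, `posture_sequence([700, 640, 760], 700)` yields `["neutral", "forward", "back"]`. A real system would calibrate the baseline per student and smooth the readings before classifying, but the thresholded-deviation idea is the same.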
