A Robot for Reconstructing Presentation Behavior in Lecture

In university lectures, lecturers often present slides accompanied by non-verbal behavior involving paralanguage, gaze, and gesture, which is important for promoting learners' understanding. However, it is not easy for lecturers to control such non-verbal behavior appropriately according to the lecture contents. This paper proposes a lecture robot that substitutes for a lecturer by reconstructing the non-verbal behavior he or she conducted in the lecture. Towards such reconstruction, we have designed a model of presentation behavior in lecture. This paper also demonstrates a lecture robot system that detects insufficient or inappropriate behavior and reconstructs and reproduces it by following the presentation behavior model. In addition, this paper reports a case study with 31 participants whose purpose was to compare a video-recorded lecture conducted by a lecturer with the same lecture reconstructed by the robot system. The results indicate that the gaze, face direction, and pointing gestures reconstructed by the lecture robot for keeping and controlling learners' attention were more acceptable and understandable. The results also suggest that the robot could promote learners' concentration by means of eye contact, and that it has the potential to promote their understanding of the lecture contents.
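The abstract does not specify how the system decides which behavior is insufficient or inappropriate, but the detect-and-reconstruct step it describes can be illustrated with a minimal Python sketch. All names here (Segment, plan_behavior, reconstruct) are hypothetical, and the keyword-matching rule is an assumption standing in for the paper's presentation behavior model, not the authors' actual method.

```python
from dataclasses import dataclass, field
from enum import Enum, auto


class Behavior(Enum):
    """Non-verbal behaviors the robot can reproduce."""
    GAZE_AUDIENCE = auto()   # eye contact with learners
    GAZE_SCREEN = auto()     # face/gaze directed at the slide
    POINT_SCREEN = auto()    # pointing gesture toward a slide element


@dataclass
class Segment:
    """One utterance segment of the recorded lecture."""
    text: str                                    # transcribed speech
    slide_keywords: list                         # words visible on the current slide
    observed: set = field(default_factory=set)   # behaviors the lecturer actually showed


def plan_behavior(segment: Segment) -> set:
    """Return the behaviors the model expects for a segment.

    Assumed rule of thumb: if the speech refers to a slide element,
    the presenter should look at and point to the screen; otherwise,
    keep eye contact with the audience to hold learners' attention.
    """
    words = segment.text.lower().split()
    refers_to_slide = any(kw.lower() in words for kw in segment.slide_keywords)
    if refers_to_slide:
        return {Behavior.GAZE_SCREEN, Behavior.POINT_SCREEN}
    return {Behavior.GAZE_AUDIENCE}


def reconstruct(segments):
    """Detect missing behaviors and yield the full set for the robot to perform."""
    for seg in segments:
        expected = plan_behavior(seg)
        missing = expected - seg.observed   # insufficient behavior to fill in
        yield seg.text, seg.observed | missing


if __name__ == "__main__":
    lecture = [
        Segment("look at this graph of the results",
                slide_keywords=["graph"], observed={Behavior.GAZE_SCREEN}),
        Segment("now let's recap the main idea",
                slide_keywords=["graph"], observed=set()),
    ]
    for text, behaviors in reconstruct(lecture):
        print(text, "->", sorted(b.name for b in behaviors))
```

In this toy run, the first segment gains the pointing gesture the lecturer omitted, and the second gains audience-directed gaze; a real system would presumably drive these decisions from the paper's behavior model rather than keyword matching.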
