Early Prediction of Visitor Engagement in Science Museums with Multimodal Learning Analytics

Modeling visitor engagement is a key challenge in informal learning environments such as museums and science centers. Predictive models that accurately forecast salient features of visitor behavior, such as dwell time, hold significant potential for enabling adaptive learning environments and visitor analytics in these settings. In this paper, we introduce a multimodal early prediction approach to modeling visitor engagement with interactive science museum exhibits. We use multimodal sensor data, including eye gaze, facial expression, posture, and interaction logs, captured during visitor interactions with an interactive museum exhibit for environmental science education to induce predictive models of visitor dwell time. Using data from 85 museum visitors, we investigate five machine learning techniques (random forest, support vector machine, Lasso regression, gradient boosting trees, and multi-layer perceptron) for inducing multimodal predictive models of visitor engagement. Results from a series of ablation experiments suggest that incorporating additional modalities improves model accuracy. In addition, the models' predictive performance improves over time, demonstrating that increasingly accurate predictions of dwell time can be achieved as more evidence becomes available from a visitor's interaction with an exhibit. These findings highlight the efficacy of multimodal data for modeling visitor engagement with interactive science museum exhibits.
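
To make the modeling setup concrete, the sketch below illustrates one way the modality-ablation experiments could be implemented with scikit-learn. This is not the authors' code: the feature blocks, their dimensions, and the synthetic dwell-time values are hypothetical placeholders standing in for the paper's gaze, facial expression, posture, and interaction-log features.

```python
# Minimal sketch of a modality-ablation experiment for dwell-time prediction.
# All feature blocks and targets below are synthetic placeholders; the paper's
# actual features, preprocessing, and evaluation protocol are not specified here.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_visitors = 85  # matches the study's sample size

# Placeholder per-visitor feature blocks, one per modality (dimensions assumed).
gaze    = rng.normal(size=(n_visitors, 8))   # e.g., fixation statistics
face    = rng.normal(size=(n_visitors, 12))  # e.g., action-unit intensities
posture = rng.normal(size=(n_visitors, 6))   # e.g., skeletal joint summaries
logs    = rng.normal(size=(n_visitors, 10))  # e.g., interaction event counts
dwell   = rng.gamma(shape=2.0, scale=120.0, size=n_visitors)  # seconds (synthetic)

# Ablation: cumulatively add modalities and compare cross-validated accuracy.
# With random placeholder data the scores are meaningless; only the structure
# of the experiment is illustrated.
blocks = {"logs": logs, "+gaze": gaze, "+face": face, "+posture": posture}
features = []
for name, block in blocks.items():
    features.append(block)
    X = np.hstack(features)
    model = RandomForestRegressor(n_estimators=200, random_state=0)
    r2 = cross_val_score(model, X, dwell, cv=5, scoring="r2").mean()
    print(f"{name:>8}: mean cross-validated R^2 = {r2:.3f}")
```

The paper's early prediction analysis would correspond, under the same assumptions, to recomputing each feature block from only the first k seconds of an interaction and repeating the evaluation at increasing values of k, so that predictive accuracy can be tracked as more evidence accumulates.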
