Multimodal data capabilities for learning: What can multimodal data tell us about learning?
[1] Kilseop Ryu, et al. Evaluation of mental workload with a combined measure based on physiological indices during a dual task of tracking and mental arithmetic, 2005.
[2] Wolfgang Rosenstiel, et al. Using touchscreen interaction data to predict cognitive workload, 2016, ICMI.
[3] Arthur C. Graesser, et al. A Time for Emoting: When Affect-Sensitivity Is and Isn't Effective at Promoting Deep Learning, 2010, Intelligent Tutoring Systems.
[4] Omid Noroozi, et al. Multimodal data to design visual learning analytics for understanding regulation of learning, 2019, Comput. Hum. Behav.
[5] Andri Ioannou, et al. Moving Bodies to Moving Minds: A Study of the Use of Motion-Based Games in Special Education, 2018, TechTrends.
[6] Sergio Salmeron-Majadas, et al. A Methodological Approach to Eliciting Affective Educational Recommendations, 2014, 2014 IEEE 14th International Conference on Advanced Learning Technologies.
[7] Patrícia Augustin Jaques, et al. Affective states in computer-supported collaborative learning: Studying the past to drive the future, 2018, Comput. Educ.
[8] Charles E. Hughes, et al. Embodiment analytics of practicing teachers in a virtual immersive environment, 2018, J. Comput. Assist. Learn.
[9] Hendrik Drachsler, et al. JCAL Special Issue on Multimodal Learning Analytics, 2018, J. Comput. Assist. Learn.
[10] M. Doppelmayr, et al. Theta synchronization in the human EEG and episodic retrieval, 1998, Neuroscience Letters.
[11] Roberto Martínez Maldonado, et al. Analytics meet patient manikins: challenges in an authentic small-group healthcare simulation classroom, 2017, LAK.
[12] Marcelo Worsley, et al. Multimodal Learning Analytics and Education Data Mining: using computational technologies to measure complex learning tasks, 2016, J. Learn. Anal.
[13] Erik Duval, et al. Learning Analytics for Natural User Interfaces: A Framework, Case Studies and a Maturity Analysis, 2017.
[14] Linda Corrin, et al. A conceptual framework linking learning design with learning analytics, 2016, LAK.
[15] Hendrik Drachsler, et al. From signals to knowledge: A conceptual model for multimodal learning analytics, 2018, J. Comput. Assist. Learn.
[16] Marlene Scardamalia, et al. Exploring emotional and cognitive dynamics of Knowledge Building in grades 1 and 2, 2019, User Modeling and User-Adapted Interaction.
[17] Erik Duval, et al. Interactive surfaces and learning analytics: data, orchestration aspects, pedagogical uses and challenges, 2016, LAK.
[18] Arthur C. Graesser, et al. Better to be frustrated than bored: The incidence, persistence, and impact of learners' cognitive-affective states during interactions with three different computer-based learning environments, 2010, Int. J. Hum. Comput. Stud.
[19] Rafael A. Calvo, et al. Affect Detection: An Interdisciplinary Review of Models, Methods, and Their Applications, 2010, IEEE Transactions on Affective Computing.
[20] Ramón Fabregat, et al. A Software Suite for Efficient Use of the European Qualifications Framework in Online and Blended Courses, 2013, IEEE Transactions on Learning Technologies.
[21] Kshitij Sharma, et al. Multimodal teaching analytics: Automated extraction of orchestration graphs from wearable sensor data, 2018, J. Comput. Assist. Learn.
[22] Nigel Bosch, et al. Automated gaze-based mind wandering detection during computerized learning in classrooms, 2019, User Modeling and User-Adapted Interaction.
[23] Sidney K. D'Mello, et al. A Review and Meta-Analysis of Multimodal Affect Detection Systems, 2015, ACM Comput. Surv.
[24] Linden J. Ball, et al. Eye tracking in HCI and usability research, 2006.
[25] Rosemary Luckin, et al. The NISPI framework: Analysing collaborative problem-solving from students' physical interactions, 2018, Comput. Educ.
[26] Cristian Cechinel, et al. Exploring Collaborative Writing of User Stories With Multimodal Learning Analytics: A Case Study on a Software Engineering Course, 2018, IEEE Access.
[27] Joshua A. Danish, et al. Using Multimodal Learning Analytics to Model Student Behavior: A Systematic Analysis of Epistemological Framing, 2016, J. Learn. Anal.
[28] Hendrik Drachsler, et al. Read Between the Lines: An Annotation Tool for Multimodal Data for Learning, 2019, LAK.
[29] Vincent Aleven, et al. Temporal analysis of multimodal data to predict collaborative learning outcomes, 2020, Br. J. Educ. Technol.
[30] Daniel L. Schwartz, et al. A time for telling, 1998.
[31] Michail N. Giannakos, et al. The promise and challenges of multimodal learning analytics, 2020, Br. J. Educ. Technol.
[32] Pierre Dillenbourg, et al. Teaching analytics: towards automatic extraction of orchestration graphs using wearable sensors, 2016, LAK.
[33] Michail N. Giannakos, et al. Learning Analytics for Learning Design: A Systematic Literature Review of Analytics-Driven Design to Enhance Learning, 2019, IEEE Transactions on Learning Technologies.
[34] Ravikiran Vatrapu, et al. Enhancing the Professional Vision of Teachers: A Physiological Study of Teaching Analytics Dashboards of Students' Repertory Grid Exercises in Business Education, 2016, 2016 49th Hawaii International Conference on System Sciences (HICSS).
[35] Michail N. Giannakos, et al. Gaze-Driven Design Insights to Amplify Debugging Skills: A Learner-Centered Analysis Approach, 2018, J. Learn. Anal.
[36] Boban Vesin, et al. Cross-Platform Analytics: A step towards Personalization and Adaptation in Education, 2019, LAK.
[37] Joshua A. Danish, et al. A Measurement Model of Gestures in an Embodied Learning Environment: Accounting for Temporal Dependencies, 2017, J. Learn. Anal.
[38] Lucrezia Crescenzi-Lanna, et al. Multimodal Learning Analytics research with young children: A systematic review, 2020, Br. J. Educ. Technol.
[39] Marcelo Worsley, et al. Leveraging multimodal learning analytics to differentiate student learning strategies, 2015, LAK.
[40] Ian Oakley, et al. Designing Socially Acceptable Hand-to-Face Input, 2018, UIST.
[41] X. Ochoa, et al. Controlled evaluation of a multimodal system to improve oral presentation skills in a real learning setting, 2020, Br. J. Educ. Technol.
[42] Davinia Hernández-Leo, et al. Data-informed design parameters for adaptive collaborative scripting in across-spaces learning situations, 2019, User Modeling and User-Adapted Interaction.
[43] María Jesús Rodríguez-Triana, et al. A Review of Multimodal Learning Analytics Architectures, 2018, 2018 IEEE 18th International Conference on Advanced Learning Technologies (ICALT).
[44] Muhterem Dindar, et al. What does physiological synchrony reveal about metacognitive experiences and group performance?, 2020, Br. J. Educ. Technol.
[45] Mutlu Cukurova, et al. Supervised machine learning in multimodal learning analytics for estimating success in project-based learning, 2018, J. Comput. Assist. Learn.
[46] Tuomas Eerola, et al. Generalizability and Simplicity as Criteria in Feature Selection: Application to Mood Classification in Music, 2011, IEEE Transactions on Audio, Speech, and Language Processing.
[47] Arthur C. Graesser, et al. Multimodal semi-automated affect detection from conversational cues, gross body language, and facial features, 2010, User Modeling and User-Adapted Interaction.
[48] Konrad J. Schönborn, et al. Exploring relationships between students' interaction and learning with a haptic virtual biomolecular model, 2011, Comput. Educ.
[49] Rosemary Luckin, et al. Artificial intelligence and multimodal data in the service of human decision-making: A case study in debate tutoring, 2019, Br. J. Educ. Technol.
[50] Bertrand Schneider, et al. The Effect of Highly Scaffolded Versus General Instruction on Students' Exploratory Behavior and Arousal, 2017, Technol. Knowl. Learn.
[51] María Jesús Rodríguez-Triana, et al. Learning Analytics for Professional and Workplace Learning: A Literature Review, 2017, IEEE Transactions on Learning Technologies.
[52] Kristy Elizabeth Boyer, et al. Classifying student dialogue acts with multimodal learning analytics, 2015, LAK.
[53] Danielle S. McNamara, et al. Learning linkages: Integrating data streams of multiple modalities and timescales, 2018, J. Comput. Assist. Learn.
[54] Jan Cornelis, et al. Multimodal learning analytics to investigate cognitive load during online problem solving, 2020, Br. J. Educ. Technol.
[55] Keeley A. Crockett, et al. Near Real-Time Comprehension Classification with Artificial Neural Networks: Decoding e-Learner Non-Verbal Behavior, 2018, IEEE Transactions on Learning Technologies.
[56] Angela L. Duckworth, et al. Advanced, Analytic, Automated (AAA) Measurement of Engagement During Learning, 2017, Educational Psychologist.
[57] Kristy Elizabeth Boyer, et al. Unsupervised modeling for understanding MOOC discussion forums: a learning analytics approach, 2015, LAK.
[58] Sanna Järvelä, et al. Sympathetic arousal commonalities and arousal contagion during collaborative learning: How attuned are triad members?, 2019, Comput. Hum. Behav.
[59] Athanasios Vourvopoulos, et al. EEGlass: an EEG-eyeware prototype for ubiquitous brain-computer interaction, 2019, UbiComp/ISWC Adjunct.
[60] Nasrollah Moghaddam Charkari, et al. Multimodal information fusion application to human emotion recognition from face and speech, 2010, Multimedia Tools and Applications.
[61] Alejandro Andrade, et al. Understanding student learning trajectories using multimodal learning analytics within an embodied-interaction learning environment, 2017, LAK.
[62] Marcelo Worsley, et al. A Multimodal Analysis of Making, 2017, International Journal of Artificial Intelligence in Education.
[63] Stefaan Ternier, et al. Learning pulse: a machine learning approach for predicting performance in self-regulated learning using multimodal data, 2017, LAK.
[64] O. Jensen, et al. Frontal theta activity in humans increases with memory load in a working memory task, 2002, The European Journal of Neuroscience.
[65] Abelardo Pardo, et al. Combining University Student Self-Regulated Learning Indicators and Engagement with Online Learning Events to Predict Academic Performance, 2017, IEEE Transactions on Learning Technologies.
[66] Wiebke Bleidorn, et al. Using Machine Learning to Advance Personality Assessment and Theory, 2019, Personality and Social Psychology Review.
[67] Cristina Conati, et al. Empirically building and evaluating a probabilistic model of user affect, 2009, User Modeling and User-Adapted Interaction.
[68] Kshitij Sharma, et al. Building pipelines for educational data using AI and multimodal analytics: A "grey-box" approach, 2019, Br. J. Educ. Technol.
[69] B. Rienties, et al. Using Temporal Analytics to Detect Inconsistencies Between Learning Design and Students' Behaviours, 2018, J. Learn. Anal.
[70] Manolis Mavrikis, et al. Affective learning: improving engagement and enhancing learning with affect-aware feedback, 2017, User Modeling and User-Adapted Interaction.
[71] Davinia Hernández-Leo, et al. Enhancing consent forms to support participant decision making in multimodal learning data research, 2020, Br. J. Educ. Technol.
[72] Paulo Blikstein, et al. Multimodal learning analytics, 2013, LAK '13.
[73] Olga C. Santos, et al. Physical learning analytics: a multimodal perspective, 2018, LAK.
[74] Xavier Ochoa, et al. The RAP system: automatic feedback of oral presentation skills using multimodal analysis and low-cost sensors, 2018, LAK.
[75] Tore Dybå, et al. Empirical studies of agile software development: A systematic review, 2008, Inf. Softw. Technol.
[76] Katerina Avramides, et al. Exploring the interplay between human and machine annotated multimodal learning analytics in hands-on STEM activities, 2016, LAK.
[77] James C. Lester, et al. Multimodal learning analytics for game-based learning, 2020, Br. J. Educ. Technol.
[78] John C. Stamper, et al. A Novel Method for the In-Depth Multimodal Analysis of Student Learning Trajectories in Intelligent Tutoring Systems, 2018, J. Learn. Anal.
[79] Sudanthi N. R. Wijewickrema, et al. The Importance of Automated Real-Time Performance Feedback in Virtual Reality Temporal Bone Surgery Training, 2019, AIED.
[80] Nicholas D. Duran, et al. Generative Multimodal Models of Nonverbal Synchrony in Close Relationships, 2018, 2018 13th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2018).
[81] Michail N. Giannakos, et al. Multimodal data as a means to understand the learning experience, 2019, Int. J. Inf. Manag.
[82] Manolis Mavrikis, et al. Affect Matters: Exploring the Impact of Feedback During Mathematical Tasks in an Exploratory Environment, 2015, AIED.
[83] Michael J. Junokas, et al. Enhancing multimodal learning through personalized gesture recognition, 2018, J. Comput. Assist. Learn.
[84] Matthew S. Reynolds, et al. Finding Common Ground: A Survey of Capacitive Sensing in Human-Computer Interaction, 2017, CHI.
[85] Edmundas Kazimieras Zavadskas, et al. Affective Tutoring System for Built Environment Management, 2015, Comput. Educ.
[86] Jens Rasmussen, et al. Skills, rules, and knowledge; signals, signs, and symbols, and other distinctions in human performance models, 1983, IEEE Transactions on Systems, Man, and Cybernetics.
[87] Mykel J. Kochenderfer, et al. Generalizable intention prediction of human drivers at intersections, 2017, 2017 IEEE Intelligent Vehicles Symposium (IV).
[88] A. Graesser, et al. Dynamics of affective states during complex learning, 2012.
[89] Kevin Casey. Using Keystroke Analytics to Improve Pass-Fail Classifiers, 2017.
[90] Sidney K. D'Mello, et al. Multimodal Modeling of Coordination and Coregulation Patterns in Speech Rate during Triadic Collaborative Problem Solving, 2018, ICMI.
[91] María Jesús Rodríguez-Triana, et al. The teacher in the loop: customizing multimodal learning analytics for blended learning, 2018, LAK.