Affective and Behavioral Assessment for Adaptive Intelligent Tutoring Systems

Adaptive Intelligent Tutoring Systems (ITS) aim to guide students through the resolution of a given problem in a principled way, according to the desired outcomes, the intrinsic capabilities of the student, and the particular context in which the exercise takes place. Such systems should be able to react to mistakes, boredom, distraction, and similar events. Several works propose different models to represent the problem being solved, the student solving it, and the tutor's guidance toward the desired solution. This complexity demands non-trivial models whose parameters must be estimated from different kinds of data, usually requiring heavy and difficult sensing and recognition tasks. In this work, we present some of the work in progress in the BIG-AFF project. Among other issues, we address the use of low-cost, low-intrusion devices to gather contextual data that loosely drives the actions of an ITS, without constructing a fully structured student model with corresponding affective and behavioral states. The aim is to improve students' learning outcomes and satisfaction by progressively learning how to adapt the ITS to the sensed data.
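
To make the idea of loosely driving an ITS from sensed data more concrete, the following is a minimal sketch of one possible adaptation loop: coarse context labels derived from low-cost interaction signals feed an epsilon-greedy policy that gradually learns which tutoring action tends to pay off in each context. The feature names, actions, and reward signal are illustrative assumptions for this sketch, not the BIG-AFF implementation.

```python
import random
from collections import defaultdict

# Hypothetical sensed features (keystroke rate, idle time); names are
# illustrative, not an actual BIG-AFF data schema.
def discretize(features):
    """Map raw interaction readings to a coarse context label."""
    idle = "idle_high" if features.get("idle_seconds", 0) > 30 else "idle_low"
    typing = "typing_slow" if features.get("keys_per_min", 0) < 20 else "typing_normal"
    return (idle, typing)

# Example tutoring adaptations the ITS could choose among.
ACTIONS = ["give_hint", "simplify_step", "encourage", "do_nothing"]

class EpsilonGreedyAdapter:
    """Learns, per sensed context, which tutoring action tends to pay off."""
    def __init__(self, epsilon=0.1):
        self.epsilon = epsilon
        self.counts = defaultdict(lambda: defaultdict(int))
        self.values = defaultdict(lambda: defaultdict(float))

    def choose(self, context):
        # Explore occasionally; otherwise pick the best-known action.
        if random.random() < self.epsilon:
            return random.choice(ACTIONS)
        vals = self.values[context]
        return max(ACTIONS, key=lambda a: vals[a])

    def update(self, context, action, reward):
        # Incremental mean of the observed reward (e.g., a subsequent correct
        # step, or reduced idle time) for this context/action pair.
        self.counts[context][action] += 1
        n = self.counts[context][action]
        q = self.values[context][action]
        self.values[context][action] = q + (reward - q) / n

# One interaction cycle: sense, adapt, observe outcome, learn.
adapter = EpsilonGreedyAdapter()
sensed = {"idle_seconds": 45, "keys_per_min": 12}
ctx = discretize(sensed)
action = adapter.choose(ctx)
# ... apply the chosen action in the tutor, observe an outcome ...
adapter.update(ctx, action, reward=1.0)
```

The point of the sketch is that no structured affective or behavioral student model is maintained: the mapping from sensed context to tutoring action is learned progressively from observed outcomes.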
