The Potential of Affective Computing in E-Learning: The MYSELF Project Experience

The importance of emotions in the learning process is increasingly acknowledged. This paper presents preliminary work carried out within the EU-funded project MYSELF. The project aims at developing an e-learning platform endowed with affective computing capabilities for the training of relational skills through interactive simulations. Three main issues are currently under investigation: the implementation of a virtual tutor provided with emotional expressive synthesis abilities; a multimodal emotion detection module able to gather information on the user's state along the learning path; and the development of 3D interactive simulations and targeted exercises to improve emotional management, with specific focus on the expression and recognition of emotions.

Affective computing in Human-Computer Interaction can be defined as "computing that relates to, arises from, or deliberately influences emotion" [1]. A growing number of studies support the claim that affect plays a critical role in decision-making and learning performance, as it influences cognitive processes [2,3]. For example, as suggested by Goleman [4], "the extent to which emotional upsets can interfere with mental life is no news to teachers. Students who are anxious, angry or depressed don't learn; people who are caught in these states do not take information efficiently or deal with it very well". Although the relationship between learning and emotions is far from simple and linear, it is now recognized that positive and negative affective states trigger different kinds of thinking, and this may hold important implications from an educational and training perspective. Within the research community, there is growing awareness that a consistent theory of learning, one that effectively integrates cognitive and emotional factors, is strongly needed [5]. The basic observation is that a range of emotions occurs naturally in a real learning process: positive ones (joy, satisfaction, elation, for example after a successful achievement), negative ones (frustration, sadness, confusion, for example as a consequence of failure or lack of understanding), and emotions more related to interest, curiosity and surprise when a new topic is encountered. Kort and Reilly [6], for example, proposed a cyclic model that attempts to interweave the emotional dimension with the cognitive dynamics of the learning process.

Acknowledging the user's affective state might play an important role in improving the effectiveness of e-learning. Emotional unawareness has been considered one of the main limitations of traditional e-learning tools, especially those where learning takes place mostly individually. While skilled teachers can modify the learning path and their teaching style according to the feedback signals provided by learners (which include cognitive, emotional and motivational aspects), e-learning platforms generally cannot take this feedback into account and often turn out to be too rigid and less effective. The Learning Companion project at MIT [7] was one of the first attempts to address this issue in distance learning and computer-assisted training for children. A number of projects are currently being conducted to design e-learning platforms endowed with affective computing capabilities, although very few or no commercial results are currently available.
One of these projects, funded by the European Commission and started in September 2004, is MYSELF, "Multimodal elearning System based on Simulations, Role-Playing, Automatic Coaching and Voice Recognition interaction for Affective Profiling" (www.myself-proj.it). It involves 14 partners (universities, research labs, educational and IT companies, and SMEs) from 6 different EU countries [8]. The project aims at expanding the potential of e-learning through learning by doing (experiential and active learning), affective profiling and multimodal human-machine interaction. To reach these goals, one of the tools to be designed and developed is a web-based platform with affective computing capabilities for individual and collaborative learning simulations. This will be a flexible and reusable tool on which different target simulations can be implemented. The focus of the MYSELF platform and simulations will be on training social and relational skills in different professional settings (banking, commerce, health care, etc.). As far as affective computing features are specifically concerned, three main issues are currently being investigated.

The first issue is the design and implementation of a 3D virtual tutor provided with emotional expressive synthesis abilities. Research on human-like agents and Embodied Conversational Agents [9] showed that anthropomorphism is not a benefit in itself unless it is coupled with adequate expressive, conversational and interactive abilities. We designed the tutor LINDA (Learning INtelligent Dynamic Agent) (see Fig. 1), a 3D model developed with Poser 5 and animated according to the results of a preliminary study focused on the micro-analysis of a real tutor's facial expressions, coded using FACS (Facial Action Coding System) [10]. Special attention was devoted to the multimodality and temporal synchrony of emotional expression. We are currently testing the effectiveness of LINDA's emotional expressiveness and its implications for impression formation in the user throughout the learning experience. This work will be followed by the design of conversational strategies and the modeling of the interaction with the user.

Fig. 1. A virtual tutor with emotional expressiveness: the LINDA agent.

The second issue under investigation is a multimodal emotion recognition system able to provide the platform with information about the emotional and motivational state of the user. Much work has been carried out in the affective computing domain on detecting and inferring emotional states from physiological correlates [11,12], facial expressions [13,14], vocal non-verbal features (such as F0 and intensity) [15,16], verbal speech content, questionnaires or self-report measures, and behavioural events (e.g. mouse clicking) [17]. We will focus on those emotional states that are important for the training process (relevant emotions such as interest, curiosity, frustration, satisfaction, enjoyment and tiredness). At the moment the work is focused on building a multimodal database [18] as a background for training and testing algorithms and decision systems. We are also evaluating the feasibility of implementing the different channels/modalities in prospective e-learning applications, according to several key criteria such as technical feasibility, cost, reliability and intrusiveness.
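To give a concrete flavour of this kind of multi-channel detection, the Python sketch below extracts two of the vocal cues mentioned above (F0 and intensity) from an audio frame and fuses them with a simple behavioural cue (mouse-click rate). It is only an illustration under stated assumptions, not the project's algorithms: the feature extractors are deliberately naive, and the thresholds, labels and the `fuse` decision rule are invented for the example.

```python
import numpy as np


def frame_intensity(frame: np.ndarray) -> float:
    """Root-mean-square energy of one audio frame (a rough 'intensity' cue)."""
    return float(np.sqrt(np.mean(frame ** 2)))


def frame_f0(frame: np.ndarray, sr: int, fmin: float = 75.0, fmax: float = 400.0) -> float:
    """Very rough F0 estimate via autocorrelation; returns 0.0 if no clear peak."""
    frame = frame - frame.mean()
    ac = np.correlate(frame, frame, mode="full")[len(frame) - 1:]  # lags 0..N-1
    lo, hi = int(sr / fmax), int(sr / fmin)
    if hi >= len(ac):
        return 0.0
    lag = lo + int(np.argmax(ac[lo:hi]))
    return sr / lag if ac[lag] > 0.3 * ac[0] else 0.0


def fuse(f0: float, intensity: float, clicks_per_min: float) -> str:
    """Toy decision rule fusing vocal and behavioural cues into a coarse label.

    The thresholds and labels are arbitrary placeholders, not values from MYSELF.
    """
    aroused = (f0 > 220.0) or (intensity > 0.1)
    restless = clicks_per_min > 40.0
    if aroused and restless:
        return "frustration?"
    if aroused:
        return "interest/engagement?"
    if restless:
        return "boredom/tiredness?"
    return "neutral?"


if __name__ == "__main__":
    sr = 16000
    t = np.arange(4096) / sr
    speech_like = 0.2 * np.sin(2 * np.pi * 180.0 * t)  # stand-in for a voiced frame
    print(fuse(frame_f0(speech_like, sr), frame_intensity(speech_like), clicks_per_min=12))
```

In a real system each channel would of course be classified with models trained on the multimodal database, and the fusion step would weigh channels by the reliability criteria listed above rather than by fixed thresholds.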
The recognition system will be coupled with a cognitive architecture modelling affect, allowing the learning path to be consistently personalized according to the user's affective profile and providing coherent feedback to changes in the user's motivational and affective states during the training experience. Finally, the project aims at the development of 3D interactive simulations and targeted exercises to improve emotional management in interpersonal relationships, with specific focus on the expression and recognition of emotions. Emotional competence is mainly learnt through experience throughout life and plays a central role in our personal and professional lives; the use of interactive simulations will therefore hopefully provide a controlled experiential setting in which to train it. Emotional and communication skills, typically considered as belonging to face-to-face settings, will thus also become an explicit content of training delivered through an e-learning platform.
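Returning to the affect-driven personalization mentioned at the start of this paragraph, the minimal Python sketch below shows one way a detected affective state could be mapped to an adaptation of the learning path and a tutor reaction. The state fields, action names and rules are hypothetical illustrations, not the architecture developed in the project.

```python
from dataclasses import dataclass


@dataclass
class LearnerState:
    """Hypothetical snapshot of the estimates the recognition module might
    pass to the tutoring logic (field names are assumptions for this sketch)."""
    emotion: str            # e.g. "frustration", "interest", "tiredness"
    confidence: float       # how sure the recogniser is (0..1)
    failures_in_a_row: int  # simple performance signal from the simulation


def adapt_learning_path(state: LearnerState) -> dict:
    """Toy policy mapping an estimated affective state to a tutoring action.

    The rules and action names are placeholders, not the MYSELF design.
    """
    if state.confidence < 0.5:
        return {"action": "continue", "tutor_feedback": None}
    if state.emotion == "frustration" and state.failures_in_a_row >= 2:
        return {"action": "lower_difficulty", "tutor_feedback": "encourage"}
    if state.emotion == "tiredness":
        return {"action": "suggest_break", "tutor_feedback": "empathise"}
    if state.emotion == "interest":
        return {"action": "offer_optional_challenge", "tutor_feedback": "praise"}
    return {"action": "continue", "tutor_feedback": None}


print(adapt_learning_path(LearnerState("frustration", 0.8, 3)))
# -> {'action': 'lower_difficulty', 'tutor_feedback': 'encourage'}
```

The chosen tutor feedback would in turn drive LINDA's expressive synthesis, closing the loop between emotion detection, path adaptation and the agent's emotional display.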

[1] P. Ekman et al., Facial Action Coding System: Manual, 1978.

[2] D. Goleman, Emotional Intelligence: Why It Can Matter More Than IQ, 1995.

[3] R. W. Picard, Affective Computing, 1997.

[4] T. Miyasato et al., Multimodal human emotion/expression recognition, Proceedings of the Third IEEE International Conference on Automatic Face and Gesture Recognition, 1998.

[5] S. Paradiso, The Emotional Brain: The Mysterious Underpinnings of Emotional Life, 1998.

[6] J. Cassell et al., Embodied Conversational Agents, 2000.

[7] R. W. Picard et al., An affective model of interplay between emotions and learning: reengineering educational pedagogy - building a learning companion, Proceedings of the IEEE International Conference on Advanced Learning Technologies, 2001.

[8] E. M. Kinard et al., Perceived and actual academic competence in maltreated children, Child Abuse & Neglect, 2001.

[9] J. Klein et al., Frustrating the user on purpose: a step toward building an affective computer, Interact. Comput., 2002.

[10] R. Reilly et al., Analytical Models of Emotions, Learning and Relationships: Towards an Affect-sensitive Cognitive Machine, 2001.

[11] L. Rothkrantz et al., Toward an affect-sensitive multimodal human-computer interaction, Proc. IEEE, 2003.

[12] P.-Y. Oudeyer, The production and recognition of emotions in speech: features and algorithms, Int. J. Hum. Comput. Stud., 2003.

[13] E. Nöth et al., How to find trouble in communication, Speech Commun., 2003.

[14] C. Breazeal et al., Affective Learning - A Manifesto, 2004.

[15] Y. Ivanov et al., Probabilistic combination of multiple modalities to detect interest, Proceedings of the 17th International Conference on Pattern Recognition (ICPR), 2004.

[16] A. Sacchi et al., A Multimodal Database as a Background for Emotional Synthesis, Recognition and Training in E-Learning Systems, ACII, 2005.