A Physical Learning Companion for Mental-Imagery BCI User Training

Mental-Imagery based Brain-Computer Interfaces (MI-BCIs) offer new ways to interact with digital technologies, such as wheelchairs or neuroprostheses, solely by performing mental-imagery tasks (e.g., imagining an object rotating or imagining hand movements). MI-BCIs can also be used for applications such as communication or post-stroke rehabilitation. However, their lack of reliability remains a barrier to larger-scale deployment of the technology: when discriminating between two tasks, the correct one is recognized on average only 75% of the time. It has been shown that users are more likely to struggle with MI-BCIs if they are non-autonomous or tense. This might, at least in part, result from a lack of social presence and emotional support, which have so far been little investigated in MI-BCI despite recommendations from the educational literature. One way to provide such social and emotional context is through a learning companion. We therefore designed, implemented and evaluated the first learning companion dedicated to improving MI-BCI user training. We called this companion PEANUT, for Personalized Emotional Agent for Neurotechnology User Training. PEANUT provided social presence and emotional support, adapted to the performance and progress of the user, through interventions combining spoken sentences and facial expressions. It was designed based on the literature, data analyses and user studies. In particular, we conducted several online user surveys to identify the desired characteristics of our learning companion in terms of appearance and supportive speech content. From the results of these surveys we deduced which characteristics (personal/non-personal, exclamatory/declarative) the supportive sentences should have depending on the performance and progression of the learner. We also found that eyebrows can increase the expressiveness of cartoon-like faces. Once the companion was implemented, we evaluated it during real online MI-BCI use. We found that non-autonomous people, i.e., those who prefer working in a group and who are usually at a disadvantage when using MI-BCIs, outperformed autonomous people when PEANUT was present, with a 3.9% increase in peak performance. Furthermore, in terms of user experience, PEANUT seems to have improved by 7.4% how people felt about their ability to learn and memorize how to use an MI-BCI, one of the user-experience dimensions we assessed.
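
The abstract states that PEANUT's interventions combine spoken sentences and facial expressions, and that the style of the sentence (personal/non-personal, exclamatory/declarative) is chosen according to the user's performance and progression. The following minimal Python sketch illustrates one way such a selection rule could be structured; the thresholds, sentence pools and mapping are hypothetical placeholders, not the authors' actual implementation.

```python
# Hypothetical sketch of a PEANUT-style intervention selector:
# the sentence style depends on current performance and progress.
from dataclasses import dataclass
import random

@dataclass
class UserState:
    performance: float  # e.g., current classification accuracy in [0, 1]
    progress: float     # e.g., change in accuracy since the previous run

# Hypothetical sentence pools keyed by (personal, exclamatory).
SENTENCES = {
    (True,  True):  ["You are doing great, keep it up!"],
    (True,  False): ["You seem to be finding your own strategy."],
    (False, True):  ["This kind of task takes practice, stay with it!"],
    (False, False): ["Performance often varies from one run to the next."],
}

def select_intervention(state: UserState,
                        perf_threshold: float = 0.7,
                        progress_threshold: float = 0.0) -> str:
    """Pick a supportive sentence whose style (personal/non-personal,
    exclamatory/declarative) depends on performance and progress."""
    personal = state.performance >= perf_threshold     # address the user directly when doing well
    exclamatory = state.progress > progress_threshold  # be more enthusiastic when improving
    return random.choice(SENTENCES[(personal, exclamatory)])

if __name__ == "__main__":
    print(select_intervention(UserState(performance=0.78, progress=0.05)))
```

In an actual system, the selected sentence would be paired with a matching facial expression and delivered by the companion at the end of a run; the rule above only illustrates the performance/progress-dependent choice of sentence style described in the abstract.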
