BrainPredict: a Tool for Predicting and Visualising Local Brain Activity

In this paper, we present a tool for dynamically predicting and visualising an individual's local brain activity during a conversation. The prediction module is based on classifiers trained on a corpus of human-human and human-robot conversations that includes fMRI recordings. More precisely, the module takes as input behavioural features computed from raw data, mainly the participant's and the interlocutor's speech, but also the participant's visual input and eye movements. The visualisation module shows, in real time, the dynamics of the active brain areas, synchronised with the raw behavioural data. In addition, it shows which of the integrated behavioural features are used to predict the activity in each individual brain area.

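To make the pipeline concrete, the following is a minimal sketch of how such a prediction module could be assembled, assuming behavioural features are aggregated into a fixed-length vector per time window and that one binary classifier is trained per brain area. The feature names, the synthetic data, and the choice of logistic regression are illustrative assumptions, not the tool's actual implementation.

```python
# Minimal sketch (not the authors' actual pipeline): predict whether a given
# brain area is active in each time window from behavioural features,
# using one binary classifier per area. Features and labels are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical behavioural features per time window:
# [participant speech activity, interlocutor speech activity,
#  visual input intensity, eye-movement amplitude]
X = rng.random((200, 4))

# Hypothetical binary labels: is brain area "A" active in this window?
y = (X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.standard_normal(200) > 0.8).astype(int)

# One classifier per brain area; shown here for a single area.
clf = LogisticRegression()
scores = cross_val_score(clf, X, y, cv=5)
print(f"Cross-validated accuracy for area 'A': {scores.mean():.2f}")
```

In the tool itself, the per-area predictions produced by such classifiers are what the visualisation module renders in real time, synchronised with the raw behavioural data.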