Modeling Socio-Emotional and Cognitive Processes from Multimodal Data in the Wild

Detecting, modeling, and making sense of multimodal data from human users in the wild still poses numerous challenges. From data quality and the reliability of measurement instruments onward, the multidisciplinary endeavor of developing intelligent adaptive systems for human-computer and human-robot interaction (HCI, HRI) requires a broad range of expertise and more integrative efforts to make such systems reliable, engaging, and user-friendly. At the same time, the spectrum of applications for machine learning and modeling of multimodal data in the wild keeps expanding. From the classroom to the robot-assisted operating theatre, our workshop aims to support a vibrant exchange about current trends and methods in the field of modeling multimodal data in the wild.
