When to engage in interaction — And how? EEG-based enhancement of robot's ability to sense social signals in HRI

Humanoid robots are still limited in reliably interpreting the social cues that humans convey, which restricts the fluency and naturalness of social human-robot interaction (HRI). We propose a method to read out two important aspects of social engagement directly from the brain of a human interaction partner: (1) the intention to initiate eye contact and (2) whether the observer is the initiator or the responder of an established gaze contact between human and robot. These measures would give humanoids an important means of deciding when (timing) and how (social role) to engage in interaction with a human. We propose an experimental setup using the iCub robot to evoke and capture the corresponding electrophysiological patterns via electroencephalography (EEG). Data analysis revealed biologically plausible brain activity patterns for both processes of social engagement. Using Support Vector Machine (SVM) classifiers with an RBF kernel, we showed that these patterns can be modeled with high within-participant accuracies, averaging 80.4% for (1) and 77.0% for (2).
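The classification step described above can be illustrated with a minimal sketch, assuming a scikit-learn pipeline; this is not the authors' actual pipeline, and the synthetic random features stand in for per-trial EEG features (e.g. band power per channel) that the real setup would extract:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-in for one participant's per-trial EEG features:
# 120 trials x 64 features, two classes separated by a small offset.
n_trials, n_features = 120, 64
y = rng.integers(0, 2, size=n_trials)
X = rng.normal(size=(n_trials, n_features)) + 0.5 * y[:, None]

# Standardize features, then fit an SVM with an RBF kernel,
# as named in the abstract.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))

# Within-participant accuracy estimated by 5-fold cross-validation.
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.3f}")
```

Within-participant evaluation (training and testing on the same person's data) matters here because EEG patterns vary strongly across individuals; cross-validation within one participant's trials approximates the reported per-participant accuracies.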