A Novel Oddball Paradigm for Affective BCIs Using Emotional Faces as Stimuli

Studies of P300-based brain-computer interfaces (BCIs) have demonstrated that visual attention to an oddball event can enhance the event-related potential (ERP) time-locked to that event. However, it remained unclear whether the more sophisticated face-evoked potentials can also be modulated by related mental tasks. This study examined ERP responses to objects, faces, and emotional faces while subjects performed visual attention, face recognition, and categorization of emotional facial expressions, respectively, in an oddball paradigm. The results revealed significant differences between target and non-target ERPs for each paradigm. Furthermore, significant differences among the three mental tasks were observed for the vertex-positive potential (VPP) (p<0.01) and the late positive potential (LPP) / P3b (p<0.05) at the centro-parietal regions, and for the N250 (p<0.003) at the occipito-temporal regions. The high classification performance for single-trial emotional face-related ERPs demonstrated that facial emotion processing can serve as a novel oddball paradigm for affective BCIs.
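
The abstract reports single-trial classification of target versus non-target emotional face ERPs but does not spell out the pipeline here. The following is a minimal sketch of one common approach for this kind of ERP-based BCI analysis, shrinkage-regularized LDA on binned spatio-temporal features with cross-validation; all array shapes, the time window, the feature binning, and the synthetic data are illustrative assumptions, not the authors' actual method.

# Hedged sketch: single-trial target vs. non-target ERP classification with
# shrinkage LDA. All names, shapes, and parameters below are assumptions.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(0)

# Assumed epoched data: n_trials x n_channels x n_samples, e.g. epochs
# covering roughly 0-800 ms post-stimulus so that VPP, N250 and LPP/P3b
# components fall inside the window.
n_trials, n_channels, n_samples = 200, 32, 80
X_epochs = rng.standard_normal((n_trials, n_channels, n_samples))
y = rng.integers(0, 2, size=n_trials)  # 1 = target (emotional face), 0 = non-target

# Simple spatio-temporal features: mean amplitude in consecutive time bins
# per channel, flattened into one feature vector per trial.
n_bins = 10
X_binned = X_epochs.reshape(n_trials, n_channels, n_bins, -1).mean(axis=3)
X = X_binned.reshape(n_trials, -1)

# Shrinkage-regularised LDA, evaluated with stratified cross-validation.
clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
auc = cross_val_score(clf, X, y, cv=cv, scoring="roc_auc")
print(f"Mean single-trial AUC: {auc.mean():.2f}")

On real recordings the epochs would come from band-pass filtered, baseline-corrected EEG rather than random noise, and the regularized LDA is only one reasonable classifier choice for this setting.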
