Gender and emotion recognition with implicit user signals

We examine the utility of implicit user behavioral signals, captured with low-cost, off-the-shelf devices, for anonymous gender and emotion recognition. A user study designed to examine male and female sensitivity to facial emotions confirms that females recognize (especially negative) emotions more quickly and accurately than males, mirroring prior findings. Implicit viewer responses in the form of EEG brain signals and eye movements are then examined for the existence of (a) emotion- and gender-specific patterns in event-related potentials (ERPs) and fixation distributions, and (b) emotion and gender discriminability. Experiments reveal that (i) gender- and emotion-specific differences are observable in ERPs, (ii) multiple similarities exist between explicit responses gathered from users and their implicit behavioral signals, and (iii) significantly above-chance (≈70%) gender recognition is achievable by comparing emotion-specific EEG responses, with gender differences encoded best for anger and disgust. Also, fairly modest valence (positive vs. negative emotion) recognition is achieved with EEG and eye-based features.
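To make the reported classification setup concrete, below is a minimal sketch of leave-one-subject-out gender classification from EEG epochs. Every particular here is an assumption for illustration, not the study's actual pipeline: the 14-channel consumer headset, the flattened-epoch features, the linear SVM, and the random placeholder data are all hypothetical.

```python
# Hypothetical sketch: predicting viewer gender from emotion-specific EEG
# epochs with a linear SVM, evaluated leave-one-subject-out so that no
# subject's trials appear in both train and test folds.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)
n_trials, n_channels, n_samples = 240, 14, 128   # assumed consumer EEG headset
X = rng.standard_normal((n_trials, n_channels * n_samples))  # flattened ERP epochs (placeholder data)
y = rng.integers(0, 2, n_trials)                 # 0 = male, 1 = female (placeholder labels)
subjects = rng.integers(0, 24, n_trials)         # subject id per trial

# Standardize features, then fit a linear SVM per cross-validation fold.
clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
scores = cross_val_score(clf, X, y, groups=subjects, cv=LeaveOneGroupOut())
print(f"Mean leave-one-subject-out accuracy: {scores.mean():.2f}")
```

On real emotion-specific epochs, restricting X to trials from a single emotion (e.g., anger or disgust) would mirror the per-emotion comparison the abstract describes; on the random data above, accuracy stays near chance (0.5) by construction.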
