Orientation information in encoding facial expressions

ABSTRACT Previous research showed that we use different regions of a face to categorize different facial expressions, e.g., the mouth region for identifying happy faces, and the eyebrows, eyes, and upper part of the nose for identifying angry faces. These findings imply that spatial information along or close to the horizontal orientation might be more useful for facial expression recognition than information along other orientations. In this study, we examined how performance for recognizing facial expressions depends on the spatial information along different orientations, and whether pixel-level differences in the face images could account for subjects' performance. Four facial expressions (angry, fearful, happy, and sad) were tested. An orientation filter (bandwidth = 23°) was applied to restrict the information within the face images, with the center of the filter ranging from 0° (horizontal) to 150° in steps of 30°. Accuracy for recognizing facial expression was measured for an unfiltered condition and the six filtered conditions. For all four facial expressions, recognition performance (normalized d′) was virtually identical for filter orientations of −30°, horizontal, and 30°, and declined systematically as the filter orientation approached vertical. Based on the confusion patterns, the information contained in the mouth and eye regions is a significant predictor of subjects' responses. We conclude that young adults with normal vision categorize facial expressions most effectively based on spatial information around the horizontal orientation, which captures the primary changes in facial features across expressions. Across all spatial orientations, the information contained in the mouth and eye regions contributes significantly to facial expression categorization.
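For concreteness, the orientation filtering described above can be sketched as a band-pass operation in the Fourier domain. The sketch below is a minimal illustration only: it assumes a Gaussian gain profile over orientation with "bandwidth" interpreted as the standard deviation of that profile, and a convention in which a 0° filter center passes horizontally oriented image structure. The abstract specifies only the 23° bandwidth and the set of filter centers, so the profile shape, bandwidth definition, and angle convention here are assumptions, not the authors' exact procedure.

```python
import numpy as np

def orientation_filter(image, center_deg, bandwidth_deg=23.0):
    """Restrict a grayscale image to structure near one orientation.

    A Gaussian gain over orientation (wrapped at 180 deg) is applied to the
    image's 2-D Fourier transform. The Gaussian profile and the use of
    `bandwidth_deg` as its standard deviation are assumptions; the source
    only states bandwidth = 23 deg.
    """
    h, w = image.shape
    fy = np.fft.fftfreq(h)[:, None]   # vertical spatial frequencies
    fx = np.fft.fftfreq(w)[None, :]   # horizontal spatial frequencies

    # Orientation of each frequency component, in degrees. With this
    # atan2(fx, fy) convention, 0 deg corresponds to horizontally oriented
    # image structure (energy along the vertical frequency axis); the
    # convention itself is an assumption for illustration.
    theta = np.degrees(np.arctan2(fx, fy))

    # Angular distance to the filter center, wrapped to the 180-deg
    # orientation domain (orientations repeat every 180 deg).
    d = np.abs(((theta - center_deg) + 90.0) % 180.0 - 90.0)

    gain = np.exp(-0.5 * (d / bandwidth_deg) ** 2)
    gain[0, 0] = 1.0                  # preserve the DC term (mean luminance)

    return np.real(np.fft.ifft2(np.fft.fft2(image) * gain))

# Example usage (hypothetical variable `face`, a 2-D grayscale array):
# face_h = orientation_filter(face, center_deg=0)    # near-horizontal information
# face_v = orientation_filter(face, center_deg=90)   # near-vertical information
```

Sweeping `center_deg` over 0° to 150° in steps of 30°, as in the study, would produce the six filtered conditions compared against the unfiltered baseline.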
