The not face: A grammaticalization of facial expressions of emotion

Facial expressions of emotion are thought to have evolved from facial muscles originally used in sensory regulation and later adapted to express moral judgment. Negative moral judgment is conveyed by the expressions of anger, disgust, and contempt. Here, we study the hypothesis that these facial expressions of negative moral judgment have further evolved into a facial expression of negation that is regularly used as a grammatical marker in human language. Specifically, we show that people from different cultures use the same facial muscles to express negation as they do to express negative moral judgment. We then show that this nonverbal signal serves as a co-articulator in speech and that, in American Sign Language, it has been grammaticalized as a non-manual marker. Furthermore, this facial expression of negation exhibits the theta-band oscillation (3-8 Hz) universally seen in syllable and mouthing production in speech and signing. These results provide evidence for the hypothesis that some components of human language evolved from facial expressions of emotion, and they suggest an evolutionary route for the emergence of grammatical markers.
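
As a minimal sketch of the theta-band claim above (not the authors' analysis pipeline), the snippet below shows one way to check whether a facial-movement signal oscillates in the 3-8 Hz range. The trace `mouth_opening`, the frame rate `fps`, and the helper names are hypothetical; it assumes a 1-D displacement signal (e.g., jaw or lip-corner motion) extracted from video.

```python
import numpy as np
from scipy.signal import welch


def dominant_frequency(signal: np.ndarray, fps: float) -> float:
    """Return the frequency (Hz) carrying the most power in the signal."""
    # Remove the mean so the spectrum reflects oscillation, not static posture.
    detrended = signal - np.mean(signal)
    freqs, power = welch(detrended, fs=fps, nperseg=min(len(detrended), 256))
    return float(freqs[np.argmax(power)])


def in_theta_band(signal: np.ndarray, fps: float,
                  low: float = 3.0, high: float = 8.0) -> bool:
    """True if the dominant frequency falls inside the 3-8 Hz theta band."""
    f = dominant_frequency(signal, fps)
    return low <= f <= high


if __name__ == "__main__":
    # Synthetic example: a 5 Hz facial oscillation sampled at 30 frames/s.
    fps = 30.0
    t = np.arange(0, 10, 1.0 / fps)
    mouth_opening = np.sin(2 * np.pi * 5.0 * t) + 0.1 * np.random.randn(len(t))
    print(dominant_frequency(mouth_opening, fps))  # ~5.0 Hz
    print(in_theta_band(mouth_opening, fps))       # True
```

With real data, the same check would be run per production of the "not face" and compared against the syllable-rate rhythms reported for speech and signing.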
