The cerebral bases of the bouba-kiki effect

The crossmodal correspondence between certain speech sounds and certain geometrical shapes, known as the bouba-kiki (BK) effect, constitutes a remarkable exception to the generally arbitrary link between word sounds and word meaning. We analyzed the association of shapes and sounds to determine whether it occurs at a perceptual or at a decisional level, and whether it takes place in sensory cortices or in supramodal regions. First, using an Implicit Association Test (IAT), we showed that the BK effect can occur without participants making any explicit decision about sound-shape associations. Second, looking for the brain correlates of implicit BK matching, we found that intermodal matching influences activations in both auditory and visual sensory cortices. Moreover, we found stronger prefrontal activation to mismatching than to matching stimuli, presumably reflecting a modulation of executive processes by crossmodal correspondence. Thus, through its roots in the physiology of object categorization and crossmodal matching, the BK effect provides a unique insight into some non-linguistic components of word formation.
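The IAT mentioned above quantifies implicit associations through response latencies: participants classify shapes and sounds faster when the response mapping is congruent with the bouba-kiki correspondence than when it is not. As a purely illustrative aside, the simplified Python sketch below shows how implicit association strength is conventionally summarized as a D-score, i.e. the latency difference between incongruent and congruent blocks scaled by their pooled standard deviation; the function, variable names, latencies, and cutoff are hypothetical and are not taken from the present study's analysis.

```python
# Simplified sketch of the standard IAT D-score (after Greenwald et al.),
# shown only to illustrate how implicit association strength is quantified.
# The function, variable names, and latencies are hypothetical and do not
# reproduce the paper's own analysis pipeline.
import statistics

def iat_d_score(rt_congruent, rt_incongruent, max_rt_ms=10_000):
    """Return a D-score from response latencies in milliseconds.

    rt_congruent   : latencies from blocks pairing shapes and sounds in the
                     BK-congruent way (e.g. rounded shape with 'bouba')
    rt_incongruent : latencies from blocks with the reversed pairing
    """
    # Discard implausibly long latencies, as in the conventional scoring.
    cong = [rt for rt in rt_congruent if rt < max_rt_ms]
    incong = [rt for rt in rt_incongruent if rt < max_rt_ms]

    # Pooled standard deviation over all retained trials from both block types.
    pooled_sd = statistics.stdev(cong + incong)

    # A positive D means faster responses under BK-congruent pairings,
    # i.e. an implicit bouba-kiki effect.
    return (statistics.mean(incong) - statistics.mean(cong)) / pooled_sd

# Hypothetical single-participant latencies (ms).
print(round(iat_d_score([620, 580, 640, 605], [720, 690, 760, 700]), 2))
```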
