It does belong together: cross-modal correspondences influence cross-modal integration during perceptual learning

Experiencing a stimulus in one sensory modality is often associated with an experience in another sensory modality. For instance, seeing a lemon might produce a sensation of sourness, suggesting a cross-modal correspondence between vision and gustation. The aim of the current study was to explore whether such cross-modal correspondences influence cross-modal integration during perceptual learning. To that end, we conducted two experiments. Using a speeded classification task, Experiment 1 established a cross-modal correspondence between visual lightness and the frequency of an auditory tone. Using a short-term priming procedure, Experiment 2 showed that manipulating this correspondence led to the creation of a cross-modal unit regardless of the nature of the correspondence (congruent in Experiment 2a, incongruent in Experiment 2b). However, a comparison of priming effect sizes suggested that cross-modal correspondences modulate cross-modal integration during learning, leading to newly learned units that differ in their stability over time. We discuss the implications of our results for the relation between cross-modal correspondences and perceptual learning in the context of a Bayesian account of cross-modal correspondences.