Bayes-Like Integration of a New Sensory Skill with Vision

Humans deal effectively with noisy, probabilistic information in familiar settings. One hallmark of this is Bayesian cue combination: combining multiple noisy estimates, weighted by their reliabilities, to achieve greater precision than the best single estimate allows. Here we show that adults can also combine a novel audio cue to distance, akin to human echolocation, with a visual cue. After two hours of training, subjects were more precise with both cues together than with the best single cue. This advantage persisted when we changed the novel cue's auditory frequency. Changes in cue reliability also led to a re-weighting of the cues without feedback, showing that subjects had learned something more flexible than a rote decision rule for specific stimuli. The main findings replicated with a vibrotactile cue. These results show that the mature sensory apparatus can learn to integrate new sensory skills flexibly. The findings are unexpected given previous empirical results and current models of multisensory learning.
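For reference, the prediction tested here follows the standard model of reliability-weighted combination of independent Gaussian estimates. The sketch below is a minimal illustration of that textbook model, not the paper's own analysis; the cue names and noise levels (sigma_vision, sigma_audio) are illustrative assumptions, not the measured reliabilities.

# Minimal sketch of Bayes-optimal (inverse-variance weighted) cue combination
# under independent Gaussian noise. Noise values are assumed for illustration.
import numpy as np

rng = np.random.default_rng(0)

true_distance = 2.0     # metres (arbitrary illustrative value)
sigma_vision = 0.10     # assumed visual-cue noise (SD)
sigma_audio = 0.20      # assumed novel audio-cue noise (SD)

# Optimal weights are proportional to each cue's reliability (1 / variance).
w_vision = (1 / sigma_vision**2) / (1 / sigma_vision**2 + 1 / sigma_audio**2)
w_audio = 1 - w_vision

# Predicted variance of the combined estimate is lower than either cue alone.
var_combined = (sigma_vision**2 * sigma_audio**2) / (sigma_vision**2 + sigma_audio**2)

# Simulate many trials to confirm the precision gain empirically.
n_trials = 100_000
vision_est = rng.normal(true_distance, sigma_vision, n_trials)
audio_est = rng.normal(true_distance, sigma_audio, n_trials)
combined_est = w_vision * vision_est + w_audio * audio_est

print(f"best single-cue SD   : {min(sigma_vision, sigma_audio):.3f}")
print(f"predicted combined SD: {np.sqrt(var_combined):.3f}")
print(f"simulated combined SD: {combined_est.std():.3f}")

In this model, the re-weighting tested in the study corresponds to recomputing w_vision and w_audio when one cue's noise level changes.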
