Integration of vision and haptics during tool use.

When integrating signals from vision and haptics, the brain must solve a "correspondence problem" so that it combines only information referring to the same object. When grasping with the hand, an invariant spatial rule could be used: the two signals should be integrated only when the estimates of hand position and object position coincide. Tools complicate this relationship, however, because visual information about the object and the felt location of the hand are spatially separated. We show that when a simple tool is used to estimate size, the brain integrates visual and haptic information in a near-optimal fashion, even with a large spatial offset between the signals. Moreover, we show that an offset between the tool-tip and the object reduces cross-modal integration to a similar degree as an offset between the felt and seen positions of an object during normal grasping. This suggests that during tool use the haptic signal is treated as coming from the tool-tip, not the hand. The brain therefore appears to combine visual and haptic information not on the basis of the spatial proximity of the sensory stimuli, but on the basis of the proximity of the distal causes of those stimuli, taking into account the dynamics and geometry of the tool.
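
A minimal sketch (not the authors' analysis) of what "near-optimal" integration means here: in the standard reliability-weighted cue-combination model, each cue is weighted by its inverse variance, and the combined estimate is more reliable than either cue alone. The function name and the numerical values below are illustrative assumptions, not values from the study.

    # Sketch of reliability-weighted (maximum-likelihood) cue combination.
    # Assumes independent Gaussian noise on the visual and haptic size estimates.
    def integrate(s_vis, sigma_vis, s_hap, sigma_hap):
        """Combine visual and haptic size estimates by inverse-variance weighting."""
        w_vis = (1 / sigma_vis**2) / (1 / sigma_vis**2 + 1 / sigma_hap**2)
        w_hap = 1 - w_vis
        s_combined = w_vis * s_vis + w_hap * s_hap
        # Predicted variance of the combined estimate is lower than that of either cue.
        var_combined = (sigma_vis**2 * sigma_hap**2) / (sigma_vis**2 + sigma_hap**2)
        return s_combined, var_combined

    # Illustrative example: a ~50 mm object, with vision slightly noisier than haptics.
    print(integrate(s_vis=51.0, sigma_vis=3.0, s_hap=49.0, sigma_hap=2.0))

The reduction in variance of the combined estimate relative to the single-cue estimates is the usual empirical signature used to test whether integration is statistically optimal.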
