Grasping at ‘thin air’: multimodal contact cues for reaching and grasping

Two experiments investigated the effects of haptic, auditory and graphic contact cues on reaching to grasp augmented objects (physical and graphic) and virtual objects (graphic only) of various sizes. In Experiment 1, auditory contact cues were presented either to enhance or to replace natural haptic contact cues in grasping. In Experiment 2, graphic contact cues were presented alone or in combination with auditory cues, and were provided either to enhance or to replace haptic contact information. Visual information about the hand was not available. Experiment 1 showed that enhancing haptic contact information with redundant auditory cues (augmented object) led to faster movement times than haptic cues alone. When haptic information was not available (virtual object), it could be replaced to some extent by auditory contact cues. In Experiment 2, movement times were fastest when both auditory and graphic cues were provided, and slowest when no contact cues were provided. Further, movement times were scaled to target width when reaching to grasp augmented objects, thus following Fitts' law. In contrast, movement times decreased less markedly with increasing object size for virtual objects. However, even in the absence of haptic information, movement times scaled more strongly with object size when auditory contact cues were provided. These results emphasize the importance of contact information, especially haptic and auditory information, for the planning and control of reaching and grasping.
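For reference, Fitts' law relates movement time (MT) to movement amplitude (A) and target width (W) through an index of difficulty. The classic formulation (Fitts, 1954) is

MT = a + b log_2(2A / W)

where a and b are empirically fitted constants; the abstract does not state which variant of the law was used to model the present data. Scaling of movement time to target width, as observed for the augmented objects, is consistent with this relationship.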
