The effects of visual information about self-movement on grasp forces when receiving objects in an augmented environment

This work explored how the presence of visual information about self-movement affected grasp forces when receiving an object from a partner. Twelve subjects either reached to grasp, or grasped without reaching, objects that were passed by a partner or rested on a table surface. Visual feedback about self-movement was available for half the trials and removed for the other half. Results indicated that a graphic representation of self-movement significantly decreased transfer time when objects were passed between subjects. Results also indicated decreased time to peak grip force and decreased peak grip force rate by the receiver with this visual feedback. These results suggest that grip force production on objects acquired from another person benefits from even a crude graphical representation of the finger pads. Furthermore, these results suggest that sources of sensory feedback cannot be studied in isolation; instead, we must consider how feedback modalities are integrated for successful interaction. Implications for the design of virtual environments and integrated feedback devices are discussed.
