Principles of haptic perception in virtual environments

During haptic interaction with everyday environments, haptic perception relies on sensory signals arising from mechanical signals such as contact forces and torques, the movement of objects and limbs, the mass or weight of objects, the stiffness of materials, and the geometry of objects (Fig. 1a). In contrast, haptic perception in Virtual Environments (VEs) relies on sensory signals arising from computer-controlled mechanical signals produced by haptic interfaces (see Fig. 1b, the online animation [1] under Selected Readings and Websites, and [1, 2]). Haptic interfaces are programmable systems that can reproduce the mechanical signals normally experienced when haptically exploring real, everyday environments. Perhaps more importantly, they can also create combinations of mechanical signals that have no counterpart in real environments. This makes it possible to create haptic VEs in which entirely new haptic sensory experiences are possible.
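The programmability of haptic interfaces can be illustrated with a simple force-rendering loop: the device position is read at a high, fixed rate and a force is commanded from a virtual model, for example a spring-damper "virtual wall" whose stiffness is just a parameter. The sketch below is a minimal illustration only; the device API (`read_position`, `command_force`) is hypothetical and stands in for whatever SDK a particular haptic interface provides.

```python
import time

# Virtual wall at x = 0 (penetration when x < 0), rendered with a
# spring-damper model: F = -k * x - b * v while in contact, else 0.
STIFFNESS = 800.0   # N/m   (virtual material stiffness; a free parameter)
DAMPING = 2.0       # N*s/m (stabilizing damping term)
RATE_HZ = 1000      # haptic servo loops typically run near 1 kHz

def wall_force(x, v):
    """Force to exert for end-effector position x (m) and velocity v (m/s)."""
    if x < 0.0:                       # end effector penetrates the virtual wall
        return -STIFFNESS * x - DAMPING * v
    return 0.0                        # free space: no force

def servo_loop(device):
    """Hypothetical haptic servo loop; `device` stands in for a real SDK handle."""
    dt = 1.0 / RATE_HZ
    x_prev = device.read_position()   # hypothetical API call
    while True:
        x = device.read_position()
        v = (x - x_prev) / dt         # finite-difference velocity estimate
        device.command_force(wall_force(x, v))  # hypothetical API call
        x_prev = x
        time.sleep(dt)                # a real loop would use a hard real-time timer
```

Because stiffness, damping, and even the sign of the commanded force are only parameters of such a model, the same loop can render mechanical signal combinations that have no physical counterpart, which is the point made above.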

[1] S. J. Lederman et al., "Neurobiology: Feeling bumps and holes," Nature, 2001.

[2] M. Wexler et al., "Depth perception by the active observer," Trends in Cognitive Sciences, 2005.

[3] J. F. Soechting et al., "Approaches to the study of haptic sensing," Journal of Neurophysiology, 2005.