A haptic interface for the Explorable Virtual Human

Haptic technology enhances a graphical user interface by allowing the user to interact in three dimensions with on-screen models, with force feedback providing the perception of touch. Recently, several groups have developed Internet-based haptic systems that allow users in different parts of the world to interact with the same models simultaneously [Hikichi et al. 2002; Peterson 2002]. Although these systems operate in real time, the delay jitter introduced by the time required to transfer data prevents realistic physical properties from being assigned to the individual models: the time lag is compensated for by increasing the apparent weight of a model, so users with greater lag perceive the models as heavier than users with smaller lags do [Hikichi et al. 2002]. While the required refresh rate of 1 kHz suggests that at least a portion of the interaction must be handled locally even in these systems, transform updates must still be handled at the server level.

The haptic interface for the Explorable Virtual Human (EVH) we present allows the user to interact with various models taken from the Visible Human data set and the 100 micron knee. Our approach employs the PHANToM haptic device and can either use single-point collision as determined by the General Haptic Open Software Toolkit (GHOST®) SDK or calculate collision and force-feedback information from the shape and size of the haptic cursor and various properties of the models being viewed. In addition, collision detection allows deformation of the models representing soft tissues. Because the interaction occurs in real time, both the graphics and the haptics run on the local computer, while the browser enables the large data set to be managed by the server.
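To make the force-feedback idea concrete, the following is a minimal sketch of penalty-based force rendering for a single-point haptic cursor, of the kind a 1 kHz local servo loop would evaluate each cycle. It is an illustrative assumption, not the EVH's actual GHOST SDK implementation; the function name, the `Vec3` type, and the stiffness constant are all hypothetical.

```cpp
// Hypothetical penalty-based ("spring") force model for a point cursor.
// A ~1 kHz servo loop would call this once per cycle with the current
// penetration depth of the cursor into the nearest model surface.

struct Vec3 { double x, y, z; };

// F = k * depth * n : when the cursor penetrates a surface, push it back
// along the unit surface normal n, proportional to penetration depth.
// The stiffness k would be one of the per-model material properties
// mentioned above (stiffer for bone, softer for deformable soft tissue).
Vec3 penaltyForce(double penetrationDepth, Vec3 surfaceNormal, double stiffness) {
    if (penetrationDepth <= 0.0) {
        return {0.0, 0.0, 0.0};  // cursor not in contact: no force
    }
    double mag = stiffness * penetrationDepth;
    return { mag * surfaceNormal.x,
             mag * surfaceNormal.y,
             mag * surfaceNormal.z };
}
```

Because this computation is cheap and purely local, it can run at the full haptic rate on the client, consistent with keeping graphics and haptics on the local computer while the server only manages the large data set.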