Developing articulated human models from laser scan data for use as avatars in real time networked virtual environments

Abstract: With continuing gains in computing power, bandwidth, and Internet popularity, there is growing interest in Internet communities. To participate in these communities, people need virtual representations of their bodies, called avatars. Creating and rendering realistic, personalized avatars for use as virtual body representations is often too complex for real-time applications such as networked virtual environments (VEs). VE designers have therefore had to settle for unconvincing, simplistic avatars and to constrain avatar motion to a few discrete positions. The approach taken in this thesis is to use a full-body laser-scanning process to capture human body surface anatomical information accurate to within millimeters. From this 3D data, virtual representations of the original human subject can be constructed, simplified, and placed in a networked virtual environment. The result of this work is photorealistic avatars that are efficiently rendered in real-time networked virtual environments. Each avatar is built in the Virtual Reality Modeling Language (VRML). Avatar motion can be controlled either through scripted behaviors using the H-Anim specification or via wireless body-tracking sensors developed at the Naval Postgraduate School. Live 3D visualizations of the animated humanoids can be viewed in freely available web browsers.
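
To illustrate how such scripted behaviors are expressed in VRML, the sketch below shows a minimal, hypothetical fragment in which a keyframed rotation drives a simple two-segment arm. Plain Transform nodes stand in here for the H-Anim Joint/Segment hierarchy used in the thesis, and the node names (Shoulder, Elbow, Clock, Swing) and dimensions are illustrative assumptions, not taken from the actual avatar files.

#VRML V2.0 utf8
# Illustrative sketch only: a two-joint "arm" animated by a keyframed
# rotation. Plain Transform nodes approximate an H-Anim-style joint
# hierarchy; a full H-Anim humanoid would use the Humanoid/Joint/Segment
# prototypes defined by the specification.
DEF Shoulder Transform {
  children [
    Transform {               # upper-arm geometry, hung below the joint center
      translation 0 -0.15 0
      children Shape {
        appearance Appearance { material Material { diffuseColor 0.8 0.6 0.5 } }
        geometry Box { size 0.1 0.3 0.1 }
      }
    }
    DEF Elbow Transform {     # child joint, 0.3 m below the shoulder
      translation 0 -0.3 0
      children Transform {    # forearm geometry, hung below the elbow
        translation 0 -0.125 0
        children Shape {
          appearance Appearance { material Material { diffuseColor 0.8 0.6 0.5 } }
          geometry Box { size 0.08 0.25 0.08 }
        }
      }
    }
  ]
}
# Scripted behavior: swing the shoulder joint back and forth every 2 seconds.
DEF Clock TimeSensor { cycleInterval 2.0 loop TRUE }
DEF Swing OrientationInterpolator {
  key      [ 0.0, 0.5, 1.0 ]
  keyValue [ 1 0 0 0.0,  1 0 0 1.2,  1 0 0 0.0 ]
}
ROUTE Clock.fraction_changed TO Swing.set_fraction
ROUTE Swing.value_changed TO Shoulder.set_rotation

Loaded in any VRML97-capable web browser plug-in, this fragment animates continuously; in the thesis architecture the same routing mechanism could instead be driven by values arriving from the wireless body-tracking sensors.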