A wearable sensor network for the control of virtual characters

A wearable sensor network is developed to track human joint motion with high accuracy. Each sensor unit consists of an accelerometer and a miniature linear sensor. We design a motion-based interface on top of the wearable sensor network that lets users control virtual characters in both a video game and an online virtual environment (Second Life) by moving their wrists and knees. An analysis of situation awareness and task failures during gameplay showed that, in terms of task performance when controlling the virtual characters, the motion-based interface provided better control than a keyboard interface.
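
The abstract does not detail how joint motion is translated into character commands. The following is a minimal sketch of one plausible mapping, assuming each sensor unit streams accelerometer samples and that a threshold on knee acceleration triggers walking while wrist tilt steers the avatar. All names, thresholds, and the command set are illustrative assumptions, not part of the described system.

```python
import math
from dataclasses import dataclass

# Hypothetical sketch: map accelerometer readings from wrist/knee sensor
# units to simple avatar commands (walk forward, turn left/right).

@dataclass
class AccelSample:
    x: float  # m/s^2
    y: float
    z: float

KNEE_SWING_THRESHOLD = 3.0   # assumed threshold on knee acceleration (m/s^2)
WRIST_TILT_THRESHOLD = 0.35  # assumed threshold on wrist roll angle (rad)

def knee_is_swinging(sample: AccelSample) -> bool:
    """Detect a leg swing from the dynamic part of the knee acceleration."""
    magnitude = math.sqrt(sample.x**2 + sample.y**2 + sample.z**2)
    return abs(magnitude - 9.81) > KNEE_SWING_THRESHOLD  # remove gravity offset

def wrist_turn_command(sample: AccelSample) -> str:
    """Estimate roll from the gravity direction and map it to a turn command."""
    roll = math.atan2(sample.y, sample.z)
    if roll > WRIST_TILT_THRESHOLD:
        return "turn_right"
    if roll < -WRIST_TILT_THRESHOLD:
        return "turn_left"
    return "no_turn"

def avatar_commands(knee: AccelSample, wrist: AccelSample) -> list[str]:
    """Combine per-joint decisions into the command list sent to the game."""
    commands = []
    if knee_is_swinging(knee):
        commands.append("walk_forward")
    turn = wrist_turn_command(wrist)
    if turn != "no_turn":
        commands.append(turn)
    return commands

if __name__ == "__main__":
    # Example: knee accelerating during a step, wrist tilted to the right.
    print(avatar_commands(AccelSample(2.0, 1.0, 13.5), AccelSample(0.0, 4.0, 9.0)))
```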
