Full Body Locomotion with Video Game Motion Controllers

Sensing technologies of increasing fidelity are dropping in cost to the point that full body sensing hardware is commonplace in people's homes. This presents an opportunity for users to interact with and move through environments using their whole bodies: not just walking or running, but jumping, dodging, looking, dancing, and exploring. Three current-generation video game devices, the Nintendo Wii Remote, PlayStation Move, and Microsoft Kinect, are discussed in terms of their sensors and data in order to explore two questions. First, how do you deal with the data from these devices, including its error, uncertainty, and volume? Second, how do you use the devices to create an interface that lets users interact as they wish? While these devices will change over time, understanding their sensing methods and the approach to interface design will serve as a basis for further improvements to full body locomotion.
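
To make the first question concrete, a common way to cope with the error and drift in inertial data from controllers like the Wii Remote is to fuse complementary sensors. The sketch below is a minimal Python illustration, not an implementation from the paper: the sample rate, blend factor, and function names are assumptions chosen for the example. It blends a gyroscope's low-noise but drifting rate signal with an accelerometer's noisy but drift-free gravity reference, a simpler stand-in for the Kalman-style filters often used in practice.

    # Illustrative sketch (assumed names and constants): estimate a
    # single tilt angle by fusing gyroscope and accelerometer readings
    # with a complementary filter.
    import math

    ALPHA = 0.98        # trust the gyro short-term, the accelerometer long-term
    DT = 1.0 / 100.0    # assumed 100 Hz sample rate

    def fuse_tilt(angle_prev, gyro_rate, accel_x, accel_z):
        """Return an updated tilt estimate in radians.

        gyro_rate: angular velocity (rad/s), low noise but drifts over time.
        accel_x, accel_z: gravity components (g), noisy but drift-free.
        """
        gyro_angle = angle_prev + gyro_rate * DT    # integrate the gyro (drifts)
        accel_angle = math.atan2(accel_x, accel_z)  # gravity reference (noisy)
        # Blend: high-pass the gyro estimate, low-pass the accelerometer one.
        return ALPHA * gyro_angle + (1.0 - ALPHA) * accel_angle

Called once per sample, fuse_tilt keeps the gyro's responsiveness while the small accelerometer term continuously pulls the estimate back toward the true gravity direction, bounding the drift.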
