Eye tracking for locomotion prediction in redirected walking

Model predictive control has been shown to be a powerful tool for redirected walking when used to plan and select future redirection techniques. To use it effectively, however, a good prediction of the user's future actions is crucial. Traditionally, this prediction is based on the user's position or current direction of movement. In the cognitive sciences, however, it has been shown that a person's gaze can also be highly indicative of their intention in both selection and navigation tasks. In this paper, this effect is used for the first time to predict a user's locomotion target during goal-directed locomotion in an immersive virtual environment. After discussing the general implications and challenges of using eye tracking for prediction in a locomotion context, we propose a prediction method for a user's intended locomotion target. This approach is then compared with position-based approaches in terms of prediction time and accuracy, based on data gathered in an experiment. The results show that, in certain situations, eye tracking allows an earlier prediction than the approaches currently used for redirected walking. However, other recently published prediction methods based on the user's position perform almost as well as the eye-tracking-based approaches presented in this paper.
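To illustrate the contrast the abstract draws, the sketch below shows one simple way a gaze-based predictor could differ from a position-based baseline. This is a minimal illustration, not the method evaluated in the paper: the scoring rule (counting recent gaze samples that fall within an angular threshold of each candidate target) and all function names are assumptions introduced here for clarity.

```python
import math

def angle_between(v1, v2):
    """Angle in radians between two 2D vectors."""
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(v1[0], v1[1])
    n2 = math.hypot(v2[0], v2[1])
    return math.acos(max(-1.0, min(1.0, dot / (n1 * n2))))

def predict_target_by_gaze(user_pos, gaze_dirs, targets, threshold=0.2):
    """Illustrative gaze-based predictor: score each candidate target by how
    many recent gaze-direction samples point at it (within `threshold` rad)
    and return the best-scoring target."""
    scores = []
    for t in targets:
        to_target = (t[0] - user_pos[0], t[1] - user_pos[1])
        hits = sum(1 for g in gaze_dirs
                   if angle_between(g, to_target) < threshold)
        scores.append(hits)
    return targets[scores.index(max(scores))]

def predict_target_by_heading(user_pos, heading, targets):
    """Position-based baseline: pick the target best aligned with the
    current direction of movement."""
    return min(targets,
               key=lambda t: angle_between(
                   heading, (t[0] - user_pos[0], t[1] - user_pos[1])))
```

Because gaze typically shifts toward an intended target before the walking direction does, a predictor of this kind can in principle commit to a target earlier than the heading baseline, which matches the qualitative finding reported in the abstract.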
