Multimodal integration of natural gaze behavior for intention recognition during object manipulation

Naturally, gaze serves the visual perception of our environment, and gaze movements are controlled largely subconsciously. Forcing users to consciously deviate from this natural gaze behavior for interaction purposes causes high cognitive workload and destroys the information contained in natural gaze movements. Instead of proposing a new gaze-based interaction technique, we analyze natural gaze behavior during an object manipulation task and show how it can be used for intention recognition, which provides a universal basis for integrating gaze into multimodal interfaces across different applications. We propose a model for the multimodal integration of natural gaze behavior and evaluate it for two use cases: improving the robustness of other, potentially noisy input cues, and designing proactive interaction techniques.
