Head movement estimation for wearable eye tracker

In the study of eye movements in natural tasks, where subjects can move freely in their environment, it is desirable to capture video of the subject's surroundings that is not limited to the small field of view of an eye tracker's scene camera. Moreover, recovering the head movements could provide additional information about the type of eye movement carried out, the overall gaze change in world coordinates, and insight into higher-order perceptual strategies. Algorithms for the classification of eye movements in such natural tasks could also benefit from the additional head movement data.

We propose to use an omnidirectional vision sensor consisting of a small CCD video camera and a hyperbolic mirror. The camera is mounted on an ASL eye tracker and records an image sequence at 60 Hz. Several algorithms for extracting rotational motion from this image sequence were implemented, and their performance was compared against measurements from a Fasttrack magnetic tracking system. Using data from the eye tracker together with data from the omnidirectional sensor, a new algorithm for classifying different types of eye movements, based on a hidden Markov model, was developed.
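As a rough illustration of the HMM-based classification idea (not the authors' implementation), a two-state hidden Markov model over discretized eye-velocity observations can label each sample as a fixation or a saccade via the Viterbi algorithm. The state names, velocity threshold, and all probabilities below are illustrative assumptions; a real system would also incorporate the head-movement signal as an additional observation channel.

```python
import math

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely state sequence for a discrete-observation HMM (log domain)."""
    # Initialize with the first observation.
    V = [{s: math.log(start_p[s]) + math.log(emit_p[s][obs[0]]) for s in states}]
    path = {s: [s] for s in states}
    for o in obs[1:]:
        V.append({})
        new_path = {}
        for s in states:
            # Best predecessor state for s given observation o.
            prob, prev = max(
                (V[-2][p] + math.log(trans_p[p][s]) + math.log(emit_p[s][o]), p)
                for p in states
            )
            V[-1][s] = prob
            new_path[s] = path[prev] + [s]
        path = new_path
    best = max(states, key=lambda s: V[-1][s])
    return path[best]

def discretize(velocities, threshold=30.0):
    """Map eye velocities (deg/s) to coarse observations; threshold is assumed."""
    return ["fast" if v > threshold else "slow" for v in velocities]

states = ("fixation", "saccade")
start_p = {"fixation": 0.9, "saccade": 0.1}
trans_p = {
    "fixation": {"fixation": 0.95, "saccade": 0.05},
    "saccade":  {"fixation": 0.30, "saccade": 0.70},
}
emit_p = {
    "fixation": {"slow": 0.9, "fast": 0.1},
    "saccade":  {"slow": 0.2, "fast": 0.8},
}

# Synthetic velocity trace: fixation, a saccade burst, then fixation again.
velocities = [2.0, 3.5, 1.0, 150.0, 220.0, 90.0, 4.0, 2.5]
labels = viterbi(discretize(velocities), states, start_p, trans_p, emit_p)
print(labels)
# → ['fixation', 'fixation', 'fixation', 'saccade', 'saccade', 'saccade', 'fixation', 'fixation']
```

Because the transition probabilities favor staying in the current state, the HMM smooths over isolated noisy samples in a way that a per-sample velocity threshold alone would not.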
