3D attention: measurement of visual saliency using eye tracking glasses

Understanding and estimating human attention in different interactive scenarios is an important part of human-computer interaction. With the advent of wearable eye-tracking glasses and devices such as Google Glass, monitoring of human visual attention will soon become ubiquitous. The presented work describes the precise estimation of human gaze fixations with respect to the environment, without the need for artificial landmarks in the field of view, and with the capability of mapping attention onto 3D information. It enables full 3D recovery of the human view frustum and the gaze pointer in a previously acquired 3D model of the environment in real time. The key contribution is that our methodology enables mapping of fixations directly into an automatically computed 3D model. This methodology opens new opportunities for human attention studies during interaction with the environment and brings new potential to automated processing for human factors technologies.
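To make the core idea concrete, the following is a minimal sketch (not the authors' implementation) of how a 2D gaze point from the eye tracker's scene camera can be mapped into a previously acquired 3D model: assuming the scene camera's pose in the model frame is known (e.g. from visual localization), the gaze point is back-projected to a ray and intersected with the model's triangles to obtain the 3D point of regard. The names (K, R, t, triangles) and the brute-force ray/triangle loop are illustrative assumptions.

```python
# Sketch: 3D point of regard from a 2D gaze point and a known camera pose.
# Assumes a triangle-mesh 3D model and a calibrated scene camera.
import numpy as np

def gaze_ray(K, R, t, gaze_px):
    """Ray (origin, direction) in model coordinates for a 2D gaze point.

    K: 3x3 camera intrinsics; R, t: model-to-camera rotation/translation;
    gaze_px: (u, v) gaze position on the scene-camera image.
    """
    uv1 = np.array([gaze_px[0], gaze_px[1], 1.0])
    d_cam = np.linalg.inv(K) @ uv1           # viewing direction in camera frame
    origin = -R.T @ t                        # camera centre in model frame
    direction = R.T @ d_cam                  # rotate direction into model frame
    return origin, direction / np.linalg.norm(direction)

def intersect_triangle(origin, direction, v0, v1, v2, eps=1e-9):
    """Moeller-Trumbore ray/triangle test; returns hit distance or None."""
    e1, e2 = v1 - v0, v2 - v0
    p = np.cross(direction, e2)
    det = e1 @ p
    if abs(det) < eps:                       # ray parallel to triangle plane
        return None
    inv_det = 1.0 / det
    s = origin - v0
    u = (s @ p) * inv_det
    if u < 0.0 or u > 1.0:
        return None
    q = np.cross(s, e1)
    v = (direction @ q) * inv_det
    if v < 0.0 or u + v > 1.0:
        return None
    dist = (e2 @ q) * inv_det
    return dist if dist > eps else None      # hit in front of the camera only

def point_of_regard(K, R, t, gaze_px, triangles):
    """Closest intersection of the gaze ray with the model (list of vertex triples)."""
    origin, direction = gaze_ray(K, R, t, gaze_px)
    hits = [d for tri in triangles
            if (d := intersect_triangle(origin, direction, *tri)) is not None]
    return origin + min(hits) * direction if hits else None
```

The brute-force loop over all triangles is kept for clarity; a real-time system would query a spatial acceleration structure (e.g. a bounding-volume hierarchy) instead, and would obtain the camera pose per frame from the localization pipeline rather than as a fixed input.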
