Evaluating 3D gaze tracking in virtual space: A computer graphics approach

Abstract The increasing use of stereoscopic 3D technology in virtual reality, video games, entertainment, and visualization has raised interest in the development of gaze-based interaction. To develop intuitive and accurate gaze-based interaction, the estimation of 3D gaze in virtual space should be validated experimentally. 3D gaze tracking is generally evaluated in real space with a rigid object as the validation target. Thus, researchers in virtual reality are constrained in choosing an appropriate evaluation method when 3D gaze tracking must be performed in virtual space. To fill this research gap, we present the design and development of a new evaluation method for 3D gaze tracking in virtual space. We use computer graphics techniques to render a virtual plane containing a virtual 3D object as the validation target. Experimental results show that the proposed evaluation method supports real experiments by verifying the accuracy of our 3D gaze tracking system, with an average Euclidean error of less than 1 cm (mean = 0.95 cm; SD = 0.55 cm) over a 74 cm workspace depth. Compared with evaluation methods for 3D gaze tracking in real space, the proposed method can also be applied when the experiment room is limited in size, because the distance of the virtual plane can be adjusted programmatically.
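As an illustration of the evaluation idea only (not the authors' implementation), the following minimal Python sketch places validation targets on a virtual plane at a programmable depth and computes the per-target 3D Euclidean error against estimated gaze points. All function names, the target grid, and the noise model for the "estimated" gaze points are assumptions introduced here for illustration; the only figures taken from the abstract are the 74 cm depth and the centimeter error metric.

```python
import numpy as np

def target_positions_on_plane(depth_cm, grid=(3, 3), width_cm=30.0, height_cm=20.0):
    """Place validation targets on a virtual plane at a given depth (z, in cm).

    The depth is a free parameter, mirroring the idea of adjusting the virtual
    plane programmatically instead of physically moving a real target.
    """
    xs = np.linspace(-width_cm / 2, width_cm / 2, grid[0])
    ys = np.linspace(-height_cm / 2, height_cm / 2, grid[1])
    return np.array([[x, y, depth_cm] for y in ys for x in xs])

def euclidean_errors(estimated_gaze_cm, true_targets_cm):
    """3D Euclidean error between each estimated gaze point and its target."""
    return np.linalg.norm(estimated_gaze_cm - true_targets_cm, axis=1)

# Example: targets on a plane 74 cm from the viewer, with simulated noisy estimates
# standing in for the output of a 3D gaze tracker (hypothetical placeholder data).
targets = target_positions_on_plane(depth_cm=74.0)
estimates = targets + np.random.normal(scale=0.5, size=targets.shape)
errors = euclidean_errors(estimates, targets)
print(f"mean error = {errors.mean():.2f} cm, SD = {errors.std():.2f} cm")
```

In a real evaluation, the estimates would come from the gaze tracking system in the same world coordinate frame as the rendered targets; the sketch only shows how the reported mean and SD of the Euclidean error could be computed once both sets of 3D points are available.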
