Using point cloud data to improve three-dimensional gaze estimation

This paper addresses the problem of estimating gaze location in the 3D environment using a remote eye tracker. Instead of relying only on data provided by the eye tracker, we investigate how to integrate gaze direction with the point-cloud-based representation of the scene provided by a Kinect sensor. The algorithm first combines the gaze vectors for the two eyes provided by the eye tracker into a single gaze vector emanating from a point between the two eyes. The gaze target in the three-dimensional environment is then identified by finding the point in the 3D point cloud that is closest to the gaze vector. Our experimental results demonstrate that the estimate of the gaze target location provided by this method is significantly better than that provided when considering gaze information alone. It is also better than two other methods for integrating point cloud information: (1) finding the 3D point closest to the gaze location as estimated by triangulating the gaze vectors from the two eyes, and (2) finding the 3D point with the smallest average distance to the two gaze vectors considered individually. The proposed method has an average error of 1.7 cm in a workspace of 25 × 23 × 24 cm located at a distance of 60 cm from the user.
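The core geometric step described above, selecting the point-cloud point closest to a single gaze ray cast from the midpoint between the eyes, can be sketched as follows. This is a minimal illustration under assumed conventions (a NumPy array of 3D points, a ray origin at the eye midpoint, and a combined gaze direction), not the authors' implementation; the function name and signature are hypothetical.

```python
import numpy as np

def closest_point_to_gaze(cloud, origin, direction):
    """Find the point in `cloud` (N x 3 array) nearest to a gaze ray.

    `origin` is the assumed midpoint between the eyes; `direction` is the
    combined gaze vector (need not be unit length). Returns the nearest
    point and its perpendicular distance to the ray.
    """
    d = direction / np.linalg.norm(direction)   # unit gaze direction
    v = cloud - origin                          # vectors from eye midpoint to each point
    t = np.clip(v @ d, 0.0, None)               # projection onto the ray; clamp so points
                                                # behind the user project to the origin
    offsets = v - np.outer(t, d)                # perpendicular offset of each point from the ray
    dists = np.linalg.norm(offsets, axis=1)
    i = int(np.argmin(dists))
    return cloud[i], dists[i]
```

For example, with a gaze ray along the z-axis, a point lying exactly on that axis is selected over points lying off it, since its perpendicular distance to the ray is zero.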
