Viewing direction estimation based on 3D eyeball construction for HRI

Natural human-robot interaction requires leveraging viewing-direction information in order to recognize, respond to, and even emulate human behavior. Knowledge of a subject's eye gaze and point of regard reveals what the subject is interested in and whom the subject is addressing. In this paper, we present a novel eye gaze estimation approach for point-of-regard (PoR) tracking. To allow for greater head-pose freedom, we introduce a new calibration approach that recovers the 3D eyeball location, eyeball radius, and fovea position. To estimate gaze direction, we map both the iris center and the iris contour points onto the eyeball sphere (creating a 3D iris disk), which gives us the optical axis. We then rotate the fovea accordingly and compute the final gaze direction along the visual axis. We intend to integrate this eye gaze approach with a dual-camera system we have developed, in which a fixed wide-angle camera detects the face and eyes and directs an active pan-tilt-zoom camera to zoom in on the eye region. The final system will permit natural, non-intrusive, pose-invariant PoR estimation at a distance and allow the user translational freedom, without resorting to infrared equipment or complex hardware setups.
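For concreteness, the sketch below illustrates the core geometric step described above: back-projecting the detected 2D iris center onto the calibrated 3D eyeball sphere to obtain the optical axis, then applying a per-subject fovea offset to approximate the visual axis. It is a minimal illustration under assumed simplifications (a calibrated pinhole camera, a known eyeball center and radius expressed in camera coordinates, and an already-detected iris center); all names such as eye_center and fovea_offset are illustrative, not taken from the paper.

```python
import numpy as np

def backproject_to_sphere(pixel, K, sphere_center, radius):
    """Intersect the camera ray through `pixel` with the eyeball sphere and
    return the nearer intersection (the visible iris point), or None."""
    # Ray direction in camera coordinates (camera at the origin).
    d = np.linalg.inv(K) @ np.array([pixel[0], pixel[1], 1.0])
    d /= np.linalg.norm(d)
    # Solve |t*d - c|^2 = r^2, a quadratic in t with unit-norm d.
    b = -2.0 * d @ sphere_center
    c = sphere_center @ sphere_center - radius ** 2
    disc = b ** 2 - 4.0 * c
    if disc < 0:
        return None  # ray misses the eyeball sphere
    t = (-b - np.sqrt(disc)) / 2.0  # nearer of the two intersections
    return t * d

def gaze_direction(iris_center_px, K, eye_center, eye_radius, fovea_offset):
    """Optical axis from the eyeball center through the 3D iris center; a
    small calibrated offset then approximates the visual axis."""
    iris_3d = backproject_to_sphere(iris_center_px, K, eye_center, eye_radius)
    optical_axis = iris_3d - eye_center
    optical_axis /= np.linalg.norm(optical_axis)
    # Placeholder for the paper's fovea-rotation step: perturb the optical
    # axis by a fixed per-subject offset vector and renormalize.
    visual_axis = optical_axis + fovea_offset
    return visual_axis / np.linalg.norm(visual_axis)
```

A usage note: in practice the eyeball center, radius, and fovea offset would come from the calibration stage described in the abstract, and the iris contour points (not just the center) can be mapped onto the sphere in the same way to fit the full 3D iris disk before extracting the optical axis.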
