Development of Real-time Face and Gaze Measurement System and Its Application to Intelligent Interfaces

The information obtained by observing a person's face (e.g., what the person is looking at, for how long, and with what expression) plays an important role as non-verbal information in human-human communication. If a computer system can measure such non-verbal information, it can use it to recognize its user's intentions and emotions. This reduces the burden on the user and ultimately leads to attentive interfaces. In this paper, novel intelligent computer interfaces are described, based on our real-time measurement system for facial motion and gaze direction.
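The abstract does not give implementation details, but the common idea behind gaze-driven interfaces of this kind is to intersect the measured 3D gaze ray with the display plane and use the resulting 2D point for interaction. The sketch below is a minimal illustration of that geometry, not the authors' method; the coordinate frames, the helper function, and the numeric example are assumptions made purely for illustration.

```python
import numpy as np

def gaze_to_screen_point(eye_pos, gaze_dir, screen_origin, screen_x_axis, screen_y_axis):
    """Illustrative sketch: intersect a gaze ray with a planar screen.

    eye_pos       -- 3D eye position (metres, world frame)  [assumed inputs]
    gaze_dir      -- unit 3D gaze direction
    screen_origin -- 3D position of the screen's top-left corner
    screen_x_axis -- unit vector along the screen's horizontal edge
    screen_y_axis -- unit vector along the screen's vertical edge
    Returns (u, v) in metres from the screen corner, or None if there is no hit.
    """
    normal = np.cross(screen_x_axis, screen_y_axis)
    denom = np.dot(gaze_dir, normal)
    if abs(denom) < 1e-9:
        return None  # gaze is parallel to the screen plane
    t = np.dot(screen_origin - eye_pos, normal) / denom
    if t < 0:
        return None  # screen lies behind the viewer
    hit = eye_pos + t * gaze_dir
    # Express the intersection point in screen-plane coordinates.
    u = np.dot(hit - screen_origin, screen_x_axis)
    v = np.dot(hit - screen_origin, screen_y_axis)
    return u, v

# Hypothetical example: a viewer 60 cm in front of the display, looking straight ahead.
eye = np.array([0.25, 0.15, 0.6])
direction = np.array([0.0, 0.0, -1.0])
origin = np.array([0.0, 0.0, 0.0])       # screen top-left corner
x_axis = np.array([1.0, 0.0, 0.0])
y_axis = np.array([0.0, 1.0, 0.0])
print(gaze_to_screen_point(eye, direction, origin, x_axis, y_axis))  # -> (0.25, 0.15)
```

In an attentive interface, the returned screen point (converted to pixels using the display size) would then be mapped to the UI element the user is looking at.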
