A human-computer interface design using automatic gaze tracking

This paper describes the design and implementation of a human-computer interface based on an in-house automatic gaze-tracking system. The focus is on an inexpensive, non-invasive gaze-tracking interface through which the user can control computer input hands-free. In the underlying project, infrared (IR) light-emitting diodes (LEDs) are placed around a computer monitor to produce reference corneal glints from the user's eye and to illuminate the user's pupil. An IR-sensitive video camera captures images of these glints. A graphical user interface gathers calibration glint data while the user gazes at six strategically placed calibration points. A linear model derived from these data maps the vertical and horizontal displacements of the glints, measured relative to physical landmarks of the user's pupil, onto the corresponding point of gaze on the monitor. The design achieves real-time performance and was evaluated with a number of volunteers with a good success rate.
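The calibration step described above (fitting a linear model from six calibration points to a screen-coordinate mapping) can be sketched as a least-squares affine fit. The displacement values and screen coordinates below are purely illustrative placeholders, not data from the paper, and the function names are hypothetical:

```python
import numpy as np

# Hypothetical calibration data: glint-to-pupil displacement (dx, dy)
# recorded while the user gazes at six known screen points (sx, sy).
# Values are illustrative only, not taken from the paper.
displacements = np.array([
    [-0.8, -0.5], [0.0, -0.5], [0.8, -0.5],
    [-0.8,  0.5], [0.0,  0.5], [0.8,  0.5],
])
screen_points = np.array([
    [160, 120], [320, 120], [480, 120],
    [160, 360], [320, 360], [480, 360],
])

def fit_linear_gaze_model(disp, screen):
    """Least-squares fit of screen = [dx, dy, 1] @ coeffs (affine model)."""
    X = np.hstack([disp, np.ones((len(disp), 1))])  # append bias column
    coeffs, *_ = np.linalg.lstsq(X, screen, rcond=None)
    return coeffs  # shape (3, 2): maps displacement to (sx, sy)

def predict_gaze(coeffs, dx, dy):
    """Map one measured glint displacement to a screen coordinate."""
    return np.array([dx, dy, 1.0]) @ coeffs

coeffs = fit_linear_gaze_model(displacements, screen_points)
print(predict_gaze(coeffs, 0.0, -0.5))  # near the top-centre calibration point
```

With six calibration points the affine system is overdetermined (six equations, three unknowns per axis), so least squares averages out measurement noise rather than interpolating it exactly.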