Eye detection and tracking
We can judge the importance of automatic detection and tracking of facial features, such as eyes, irises, and eyelids, by the list of potential applications. Some of these applications are in human-computer interaction; for instance, the computer may want to know what the user is looking at on the screen. Other applications can be found in compression techniques like MPEG-4, where eye and iris information is part of the communication stream. Still another application can be found in driver behavior analysis, where attentiveness on the driver's part is directly related to safety. For example, the car may want to know when the driver is tired and keeps closing his/her eyes; if the car finds that the driver is not attentive, it may send visual or auditory signals to regain the driver's attention. Of course, identifying and locating the face and the eyes are integral parts of most face recognition and classification algorithms, where eye information is primarily used to normalize the face. The contributions of this thesis are in two areas. The first is detection and verification of the face by locating the eyes in a color image of the face. The second is location of features of the eyes (the irises and eyelids) and identification of their behavior by tracking them over a sequence of images.
The face is detected as a large flesh-colored region, and anthropometric data are then used to estimate the size and separation of the eyes. Our first method of eye detection uses a linear filtering approach applied to the gray-level image of the face; our second method uses nonlinear filters, applied to the color face image, to detect the corners of the eyes. Both methods were tested on two datasets. The first method had a good detection rate, but also gave many false alarms; the second method had a 90% detection rate with no false alarms.
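The flesh-color segmentation and anthropometric sizing step can be sketched roughly as follows. This is a minimal illustration, not the thesis's actual implementation: the YCrCb thresholds, the anthropometric ratios, and the function name are assumed values chosen only to make the idea concrete.

```python
# Sketch of face localization via flesh-color segmentation, followed by an
# anthropometric estimate of eye size and separation. All thresholds and
# ratios below are illustrative assumptions, not the thesis's parameters.
import cv2
import numpy as np

def locate_face_and_eye_band(bgr_image):
    """Return the face bounding box and rough eye-band geometry."""
    # Flesh-color segmentation in YCrCb space (threshold values are assumed).
    ycrcb = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2YCrCb)
    lower = np.array((0, 133, 77), dtype=np.uint8)
    upper = np.array((255, 173, 127), dtype=np.uint8)
    mask = cv2.inRange(ycrcb, lower, upper)

    # Keep the largest connected flesh-colored region as the face candidate
    # (OpenCV 4.x findContours signature).
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    face = max(contours, key=cv2.contourArea)
    x, y, w, h = cv2.boundingRect(face)

    # Anthropometric estimates (assumed ratios): eyes lie at roughly 40% of
    # the face height, eye separation is roughly half the face width, and
    # each eye spans roughly a quarter of the face width.
    eye_row = y + int(0.40 * h)
    eye_separation = int(0.50 * w)
    eye_width = int(0.25 * w)
    return (x, y, w, h), eye_row, eye_separation, eye_width
```

The eye-corner filters themselves (linear on the gray-level image, nonlinear on the color image) would then be applied only within the estimated eye band, which keeps the search space small and reduces false alarms.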
After locating the eye corners, we detect the eyelids and irises in every frame of an image sequence. We perform a frame-to-frame analysis of the movements of the irises and eyelids to determine changes in gaze direction and blinking, respectively. This analysis is improved by using motion information in the form of normal flow. We model the head and eyes and determine their respective motions; using these models, we recover the head-independent motions of the irises and eyelids by stabilizing for the head motion. The head-independent motions of the irises are used to identify behaviors such as saccades and smooth pursuit. By tracking the upper eyelid and using the distance between its apex and the center of the iris, we detect instances of eye closure during blinking. In our experiments, we successfully located the irises in every frame in which the eye was fully or partially open. The eyelids were successfully located 80% of the time.
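The per-frame behavior analysis can be illustrated with the following sketch. It assumes the iris centers and upper-eyelid apices have already been extracted and stabilized for head motion; the speed threshold, closure ratio, and input format are assumptions for illustration only, not the thesis's actual values.

```python
# Illustrative sketch: classify head-stabilized iris motion as saccade vs.
# smooth pursuit, and flag eye closures from the eyelid-apex-to-iris distance.
import numpy as np

SACCADE_SPEED = 40.0   # assumed speed threshold (pixels/frame) for saccades
CLOSURE_RATIO = 0.4    # assumed fraction of the baseline gap indicating closure

def analyze_eye_motion(iris_centers, eyelid_apices):
    """iris_centers, eyelid_apices: (N, 2) arrays of per-frame positions,
    already stabilized for head motion."""
    iris = np.asarray(iris_centers, dtype=float)
    apex = np.asarray(eyelid_apices, dtype=float)

    # Iris speed between consecutive frames: fast jumps suggest saccades,
    # slow sustained motion suggests smooth pursuit.
    speed = np.linalg.norm(np.diff(iris, axis=0), axis=1)
    labels = np.where(speed > SACCADE_SPEED, "saccade", "smooth_pursuit")

    # Blink detection: the apex-to-iris distance collapses as the eye closes.
    gap = np.linalg.norm(apex - iris, axis=1)
    baseline = np.median(gap)
    closed = gap < CLOSURE_RATIO * baseline
    return labels, closed
```

In this sketch the saccade/pursuit distinction reduces to a velocity threshold on the stabilized iris trajectory, and a blink is declared whenever the eyelid-apex-to-iris distance drops well below its typical open-eye value.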