Human-computer interaction using eye-gaze input

A description is given of Erica, a computer workstation with a unique user interface. The workstation is equipped with imaging hardware and software that automatically record a digital portrait of the user's eye. From the features of the current portrait, the interface calculates the approximate location of the user's eye-gaze on the computer screen. The computer then executes commands associated with the menu option currently displayed at this screen location. In this way, the user can interact with the computer, run application software, and manage peripheral devices, all simply by looking at an appropriate sequence of menu options displayed on the screen. The eye-gaze interface technology, its implementation in Erica, and its application as a prosthetic device are described.
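To illustrate the interaction loop the abstract describes (an estimated gaze point is mapped to the menu option displayed at that screen location, whose command is then executed), the following is a minimal sketch. It is not the published Erica implementation; all names here (MenuOption, select_option, the example menu and gaze coordinates) are hypothetical, and the gaze point is assumed to arrive already estimated from the eye image.

```python
# Illustrative sketch: map an estimated on-screen gaze point to the menu
# option displayed at that location and execute its associated command.
# Names and values are hypothetical, not taken from the Erica system.

from dataclasses import dataclass
from typing import Callable, List, Optional, Tuple


@dataclass
class MenuOption:
    label: str
    rect: Tuple[int, int, int, int]      # (x, y, width, height) in screen pixels
    command: Callable[[], None]          # action bound to this option

    def contains(self, point: Tuple[int, int]) -> bool:
        x, y, w, h = self.rect
        px, py = point
        return x <= px < x + w and y <= py < y + h


def select_option(gaze_point: Tuple[int, int],
                  menu: List[MenuOption]) -> Optional[MenuOption]:
    """Return the menu option whose screen region contains the gaze point."""
    for option in menu:
        if option.contains(gaze_point):
            return option
    return None


# One pass of the interaction loop with a hypothetical two-item menu.
menu = [
    MenuOption("Open editor", (0, 0, 200, 100), lambda: print("editor launched")),
    MenuOption("Print file", (0, 100, 200, 100), lambda: print("file sent to printer")),
]

gaze_point = (50, 130)                   # assumed estimate from the eye image
chosen = select_option(gaze_point, menu)
if chosen is not None:
    chosen.command()                     # execute the associated command
```

In practice a gaze interface would also smooth successive gaze estimates and require some confirmation (for example, sustained fixation) before triggering a command; the sketch above shows only the screen-location-to-command mapping step named in the abstract.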