A Gaze-Controlled Interface to Virtual Reality Applications for Motor- and Speech-Impaired Users

This project aims to overcome the access barriers to virtual worlds for motor- and speech-impaired users by building a gaze-controlled interface for Second Life that enables them to interact with the virtual world using only their eye movements. We conducted a study to assess (1) how well a word prediction technique speeds up gaze-controlled, chat-style text input in virtual worlds, (2) the influence of screen layout on text-input efficiency, (3) the effect of the maximum number of suggested words on typing efficiency, and (4) the performance of non-disabled versus motor-impaired users. Non-disabled subjects and Amyotrophic Lateral Sclerosis (ALS) patients participated in our experiment. The results show that, on average, the patients took less time and made fewer corrections per letter than the non-disabled subjects did. This finding suggests that our interface design is suitable for motor-impaired users.
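The core of the word prediction technique mentioned above can be illustrated with a minimal sketch: given the prefix a user has gazed in so far, rank matching vocabulary words by frequency and offer the top few as selectable suggestions. The corpus, ranking rule, and suggestion limit below are illustrative assumptions, not the paper's actual prediction engine.

```python
from collections import Counter

def build_vocab(corpus_words):
    """Count word frequencies in a sample corpus (assumed frequency model)."""
    return Counter(w.lower() for w in corpus_words)

def suggest(prefix, vocab, max_suggestions=5):
    """Return up to max_suggestions words starting with prefix,
    most frequent first (ties broken alphabetically)."""
    prefix = prefix.lower()
    matches = [(w, c) for w, c in vocab.items() if w.startswith(prefix)]
    matches.sort(key=lambda wc: (-wc[1], wc[0]))
    return [w for w, _ in matches[:max_suggestions]]

# Toy corpus standing in for a real chat-log language model
vocab = build_vocab("hello help hello world helm hello help".split())
print(suggest("hel", vocab, 3))  # → ['hello', 'help', 'helm']
```

Capping `max_suggestions` corresponds to study question (3): a longer suggestion list raises the chance the intended word appears, but each extra on-screen candidate costs visual search time and screen space in a gaze-driven layout.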