We have developed Noggin and Glass Gab, two Google Glass-based systems that allow people with disabilities to answer Yes or No and to spell out messages using only head movements. Our goal is to enable people who cannot speak and who lack reliable control of their hands to communicate with head movements alone. Because the system runs entirely on Google Glass, the user, who may be in a wheelchair, does not need a notebook or tablet computer. The Glass screen displays a pointer together with Yes and No buttons (Noggin) or an onscreen keyboard (Glass Gab). The user steers the pointer with head movements, which are detected by the three-axis gyroscope built into Glass, and makes a selection by letting the pointer dwell over a button or letter for one second. Glass can then speak the composed message aloud.