eyeSay: Eye Electrooculography Decoding with Deep Learning

Consumer electronics that can decode eye movement-induced bio-potentials will enable many applications, ranging from voice-free communication and attention tracking to virtual and augmented reality. We propose a deep learning-enabled approach for decoding eye electrooculography (EOG) signals, towards voice-free communication for patients with amyotrophic lateral sclerosis. We design a multi-stage convolutional neural network to decode eye dynamics. Our approach and promising results directly contribute to voice-free communication for these patients and advance the area of ubiquitous EOG-based smart health.
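
The abstract names a multi-stage convolutional neural network but does not specify its architecture. The following is a minimal illustrative sketch of what such a decoder could look like, assuming two EOG channels (horizontal and vertical), fixed-length signal windows, and a small set of eye-movement classes; every layer size, stage count, and class label is an assumption for illustration, not the authors' design.

# Illustrative sketch only: the abstract names a multi-stage CNN but gives no
# architecture details, so all dimensions and labels here are assumptions.
import torch
import torch.nn as nn

class EOGStage(nn.Module):
    """One convolutional stage: temporal conv -> batch norm -> ReLU -> pooling."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv1d(in_ch, out_ch, kernel_size=7, padding=3),
            nn.BatchNorm1d(out_ch),
            nn.ReLU(),
            nn.MaxPool1d(kernel_size=2),
        )

    def forward(self, x):
        return self.block(x)

class EyeSayCNN(nn.Module):
    """Hypothetical multi-stage 1-D CNN over windowed EOG signals.

    Assumes 2 EOG channels and a small vocabulary of eye-movement classes
    (e.g. left, right, up, down, blink); none of these come from the abstract.
    """
    def __init__(self, n_channels=2, n_classes=5):
        super().__init__()
        self.stages = nn.Sequential(
            EOGStage(n_channels, 16),
            EOGStage(16, 32),
            EOGStage(32, 64),
        )
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool1d(1),   # collapse the time axis
            nn.Flatten(),
            nn.Linear(64, n_classes),  # eye-movement class logits
        )

    def forward(self, x):  # x: (batch, channels, samples)
        return self.head(self.stages(x))

if __name__ == "__main__":
    # One-second windows at an assumed 250 Hz sampling rate.
    dummy = torch.randn(8, 2, 250)
    logits = EyeSayCNN()(dummy)
    print(logits.shape)  # torch.Size([8, 5])

In this sketch, stacked temporal convolutions play the role of the "multiple stages", with global average pooling and a linear layer producing per-window eye-movement predictions; the actual eyeSay network may differ in depth, inputs, and output vocabulary.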