Although new sensor devices and data streams are increasingly used for musical expression, and although eye-tracking devices have become more affordable and more prevalent both in research and as a means of communication for people with severe motor impairments, eye-controlled musical expression remains elusive and little explored. This paper (a) identifies fundamental human eye-movement capabilities and constraints that determine, in part, what can and cannot be musically expressed with eye movements, (b) reviews prior work on eye-controlled musical expression, and (c) provides a taxonomy of what has been done and an analysis of what will need to be addressed in future eye-controlled musical instruments. The fundamental constraints and processes that govern human eye movements create a challenge for eye-controlled music: the instrument must be designed to motivate, or at least permit, specific visual goals, each of which, when accomplished, must then be mapped through the eye tracker and a sound generator to a distinct musical outcome. Control of such an instrument is thus less direct than control of an instrument played with the hands, whose muscles can be commanded far more directly.
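To make the gaze-to-sound mapping concrete, the following is a minimal illustrative sketch, not taken from the paper: it assumes a common dwell-time selection scheme in which a fixation held inside a screen region for a threshold duration triggers a note. The region layout, the 400 ms dwell threshold, and the MIDI pitch numbers are all hypothetical choices for the example.

```python
# Hypothetical sketch (not from the paper): mapping dwell-based gaze
# selections to note events. Gaze samples are (x, y, t) tuples in pixels
# and milliseconds; a region is "selected" once gaze stays inside it for
# the dwell threshold, and selection triggers a note-on event.

DWELL_MS = 400                      # assumed dwell-time threshold
REGIONS = {                         # assumed screen regions -> MIDI pitches
    "C4": (0, 0, 200, 200, 60),
    "E4": (200, 0, 400, 200, 64),
    "G4": (400, 0, 600, 200, 67),
}

def region_at(x, y):
    """Return the name of the note region containing (x, y), if any."""
    for name, (x0, y0, x1, y1, _pitch) in REGIONS.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return None

def dwell_to_notes(samples):
    """Emit (time, pitch) note-on events from a stream of gaze samples."""
    events = []
    current, enter_t, fired = None, None, False
    for x, y, t in samples:
        name = region_at(x, y)
        if name != current:                        # gaze entered a new region
            current, enter_t, fired = name, t, False
        elif name and not fired and t - enter_t >= DWELL_MS:
            events.append((t, REGIONS[name][4]))   # dwell completed: trigger note
            fired = True
    return events

if __name__ == "__main__":
    # Synthetic gaze data: fixate "C4" long enough to trigger a note,
    # then glance briefly at "G4" (too short to trigger anything).
    gaze = [(100, 100, t) for t in range(0, 600, 50)] + \
           [(500, 100, t) for t in range(600, 800, 50)]
    print(dwell_to_notes(gaze))     # -> [(400, 60)]
```

In a real eye-controlled instrument the note event would be sent to a sound generator (for example, over MIDI) rather than printed, and the dwell threshold would trade off speed of play against accidental selections.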