The EyeHarp: A Gaze-Controlled Digital Musical Instrument

We present and evaluate the EyeHarp, a new gaze-controlled Digital Musical Instrument that aims to enable people with severe motor disabilities to learn, perform, and compose music using only their gaze as a control mechanism. It consists of (1) a step-sequencer layer, used to construct chords and arpeggios, and (2) a melody layer, used to play melodies and change the chords/arpeggios. We conducted a pilot evaluation of the EyeHarp involving 39 participants with no disabilities, from both a performer and an audience perspective. In the first case, eight people with normal vision and no motor disability took part in a music-playing session in which both quantitative and qualitative data were collected. In the second case, 31 people qualitatively evaluated the EyeHarp in a concert setting consisting of two parts: a solo performance and an ensemble (EyeHarp, two guitars, and flute) performance. The results indicate that, like traditional musical instruments, the proposed digital musical instrument has a steep learning curve and allows expressive performances to be produced, as judged from both the performer and audience perspectives.
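The two-layer design described above can be illustrated with a short sketch. The Python code below is a hypothetical, minimal illustration of that architecture and is not the EyeHarp implementation: a step-sequencer layer loops over the arpeggio of the currently selected chord, while a melody layer turns dwell-based gaze selections into melody notes or chord changes. All class and function names, the dwell-time threshold, and the note numbers are assumptions made for the sake of the example.

```python
# Minimal sketch (not the authors' code) of a two-layer gaze-controlled instrument:
# a step sequencer holds an arpeggio per chord; a melody layer maps gaze fixations
# (selected by dwell time) to melody notes or chord changes.
from dataclasses import dataclass

@dataclass
class StepSequencer:
    """Loops over the arpeggio pattern of the currently selected chord."""
    patterns: dict          # chord name -> list of MIDI note numbers (assumed mapping)
    chord: str = "C"
    step: int = 0

    def tick(self) -> int:
        """Return the next arpeggio note and advance one step."""
        notes = self.patterns[self.chord]
        note = notes[self.step % len(notes)]
        self.step += 1
        return note

@dataclass
class MelodyLayer:
    """Maps dwell-based gaze selections to melody notes or chord changes."""
    sequencer: StepSequencer
    dwell_ms: int = 400     # assumed dwell-time threshold for a selection

    def on_fixation(self, target: str, duration_ms: int):
        """target is a screen element id, e.g. 'note:64' or 'chord:Am' (hypothetical format)."""
        if duration_ms < self.dwell_ms:
            return None                          # a glance, not a selection
        kind, value = target.split(":", 1)
        if kind == "chord":
            self.sequencer.chord = value         # redirect the arpeggio
            return ("chord", value)
        return ("note", int(value))              # play a melody note

if __name__ == "__main__":
    seq = StepSequencer(patterns={"C": [60, 64, 67], "Am": [57, 60, 64]})
    melody = MelodyLayer(seq)
    print(melody.on_fixation("note:76", 500))    # ('note', 76)
    print(melody.on_fixation("chord:Am", 450))   # ('chord', 'Am')
    print([seq.tick() for _ in range(3)])        # arpeggio now follows Am: [57, 60, 64]
```

In an actual instrument the selected notes would be routed to a synthesizer; here selections are simply returned so that the sketch stays self-contained.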
