Evaluating LED-based interface for Lumanote composition creation tool

Composing music typically requires years of music theory training, including knowledge of chord progressions, melodic composition, and whole-step/half-step passing tones. For this reason, some songwriters, such as singers, may need to hire experienced pianists to help compose their music. To make the process easier for beginning and aspiring musicians, we developed Lumanote, a music composition tool that aids songwriters by presenting real-time suggestions for appropriate melody notes and chord progressions. While a preliminary evaluation yielded favorable results for beginners, many participants commented on the difficulty of mapping the note suggestions displayed in the on-screen interface to the physical keyboard they were playing. This paper presents the resulting solution: an LED-based feedback system designed to attach directly to any standard MIDI keyboard, mapping note suggestions directly onto its physical keys. A study of 22 individuals compared the effectiveness of the new LED-based system with the existing computer interface and found that the vast majority of users preferred the LED system. Three experienced musicians also judged and ranked the resulting compositions, noting significant improvement in song quality when either system was used and citing comparable quality between compositions created with the two interfaces.
