Assistive music browsing using self-organizing maps

Music listening is an important activity for many people. Advances in technology have made it possible to carry music collections with thousands of songs on portable music players. Navigating these large collections is challenging, especially for users with vision and/or motion disabilities. In this paper we describe our current efforts to build effective music browsing interfaces for people with disabilities. The foundation of our approach is the automatic extraction of features describing musical content and the use of self-organizing maps to create two-dimensional representations of music collections. The ultimate goal is effective browsing without using any meta-data. We also describe different control interfaces to the system: a regular desktop application, an iPhone implementation, an eye tracker, and a smart room interface based on Wii-mote tracking.
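The abstract's core idea, mapping audio feature vectors onto a two-dimensional self-organizing map so that similar songs land in nearby grid cells, can be sketched as follows. This is a minimal illustrative SOM, not the authors' implementation: the feature dimensionality, grid size, learning-rate schedule, and neighborhood radius are all assumptions chosen for clarity.

```python
import numpy as np

def train_som(features, grid=(8, 8), epochs=20, lr0=0.5, radius0=3.0, seed=0):
    """Train a small SOM on per-song feature vectors (hypothetical parameters)."""
    rng = np.random.default_rng(seed)
    n_feat = features.shape[1]
    # One weight vector per grid cell, initialized randomly.
    weights = rng.random((grid[0], grid[1], n_feat))
    # Grid coordinates, used to compute neighborhood distances on the map.
    coords = np.stack(
        np.meshgrid(np.arange(grid[0]), np.arange(grid[1]), indexing="ij"),
        axis=-1,
    )
    for epoch in range(epochs):
        # Linearly decay learning rate and neighborhood radius over training.
        lr = lr0 * (1 - epoch / epochs)
        radius = radius0 * (1 - epoch / epochs) + 1e-9
        for x in features:
            # Best matching unit: the cell whose weights are closest to the song.
            d = np.linalg.norm(weights - x, axis=2)
            bmu = np.unravel_index(np.argmin(d), d.shape)
            # Gaussian neighborhood pulls the BMU and nearby cells toward x.
            g = np.exp(-np.sum((coords - np.array(bmu)) ** 2, axis=2)
                       / (2 * radius ** 2))
            weights += lr * g[..., None] * (x - weights)
    return weights

def map_song(weights, x):
    """Return the 2-D grid cell a song's feature vector maps to."""
    d = np.linalg.norm(weights - x, axis=2)
    return np.unravel_index(np.argmin(d), d.shape)
```

After training, each song is placed at its best-matching cell; because the neighborhood update preserves topology, perceptually similar songs cluster in adjacent cells, which is what makes the resulting 2-D surface usable for browsing without meta-data.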
