Human–Computer Interaction and Music

In this chapter, the use of advanced human–computer interfaces to create innovative interaction paradigms for music applications (music creation, music manipulation, music games, etc.) is explored. Advances in the design and implementation of sensing technologies have provided the means to interact with computers in ways more natural than the conventional mouse-and-keyboard setup. Thanks to these technologies, more involving and immersive experiences can be offered to the user. However, there is no silver bullet: each kind of sensing technology excels in some areas and falls short in others, so each application demands its own selection of sensors and the development of an adequate interaction metaphor. In this chapter, some of the most commonly used motion-sensing technologies are presented, with a special focus on the possibilities of a 3D camera sensor (i.e. the Kinect) for the design of human–computer interfaces for music interaction. We present the findings of the studies we have conducted using these devices to develop augmented instruments, including a drum kit simulator and a virtual theremin. Additionally, the use of this type of interface for other music applications is discussed, together with a description of the technical issues that need to be addressed to successfully implement these interaction paradigms.
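To make the kind of interaction metaphor discussed here concrete, the sketch below illustrates one possible mapping for a Kinect-style virtual theremin: the horizontal position of one hand controls pitch and the vertical position of the other controls volume, and both are used to synthesize a sine tone. The hand coordinates, their value ranges, and the helper get_hand_positions() are hypothetical placeholders for whatever skeleton-tracking middleware (e.g. OpenNI/NiTE or the Kinect SDK) an actual implementation would rely on; this is a minimal illustration of the mapping idea under those assumptions, not the chapter's implementation.

import numpy as np

SAMPLE_RATE = 44100
BLOCK_SIZE = 1024              # samples synthesized per tracking frame
F_MIN, F_MAX = 110.0, 880.0    # assumed pitch range in Hz

def get_hand_positions(frame):
    # Placeholder for a skeleton tracker (e.g. OpenNI/NiTE or the Kinect SDK):
    # returns normalized [0, 1] coordinates; here the hands are simply simulated.
    t = frame / 100.0
    right_x = 0.5 + 0.5 * np.sin(t)   # horizontal position of the right hand
    left_y = 0.5 + 0.5 * np.cos(t)    # vertical position of the left hand
    return right_x, left_y

def theremin_block(right_x, left_y, start_sample):
    # Map hand positions to synthesis parameters.
    freq = F_MIN * (F_MAX / F_MIN) ** right_x   # exponential pitch mapping
    amp = left_y ** 2                           # squared for a smoother volume response
    # Synthesize one block of a sine tone at those parameters.
    t = (start_sample + np.arange(BLOCK_SIZE)) / SAMPLE_RATE
    return amp * np.sin(2.0 * np.pi * freq * t)

if __name__ == "__main__":
    for frame in range(10):                     # ten tracking frames
        rx, ly = get_hand_positions(frame)
        block = theremin_block(rx, ly, frame * BLOCK_SIZE)
        # In a real application the block would be written to the sound card
        # (e.g. via a library such as sounddevice or pyaudio).
        print(f"frame {frame}: freq = {F_MIN * (F_MAX / F_MIN) ** rx:6.1f} Hz, "
              f"amp = {ly ** 2:.2f}, first sample = {block[0]:+.3f}")

The drum kit simulator follows the same pattern, with hand and foot positions tested against spatial trigger zones instead of being mapped to continuous synthesis parameters.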
