Large-Scale Mobile Audio Environments for Collaborative Musical Interaction

New application spaces and artistic forms can emerge when users are freed from physical constraints. Human-computer interfaces typically confine users to a fixed location, severely limiting mobility. To overcome this constraint in the context of musical interaction, we present a system for managing large-scale collaborative mobile audio environments driven by user movement. Multiple participants navigate through physical space while sharing overlaid virtual elements. Each user carries some combination of a mobile computing device, GPS receiver, orientation sensor, microphone, and headphones. We investigate methods of location tracking, wireless audio streaming, and state management between mobile devices and centralized servers. The result is a system that allows mobile users to share virtual scenes, each rendered with subjective 3-D audio from that user's own perspective. The audio elements of these scenes can be organized into large-scale spatial audio interfaces, enabling immersive mobile performance, locative audio installations, and many new forms of collaborative sonic activity.
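The subjective rendering described above can be illustrated with a minimal sketch: for each virtual sound source, the renderer derives a gain and an azimuth from the listener's tracked position and heading. This is an assumption for illustration only, using a simple 1/(1 + d) distance-rolloff model; the function name and parameters are hypothetical and do not reflect the system's actual implementation.

```python
import math

def render_params(listener_pos, listener_yaw, source_pos, rolloff=1.0):
    """Compute per-source rendering parameters for one listener.

    listener_pos, source_pos: (x, y) coordinates in metres.
    listener_yaw: listener heading in radians (0 = +x axis).
    Returns (gain, azimuth): gain follows a 1/(1 + rolloff * distance)
    attenuation curve; azimuth is the source direction relative to the
    listener's heading, wrapped to (-pi, pi], suitable for panning.
    """
    dx = source_pos[0] - listener_pos[0]
    dy = source_pos[1] - listener_pos[1]
    distance = math.hypot(dx, dy)
    gain = 1.0 / (1.0 + rolloff * distance)
    azimuth = math.atan2(dy, dx) - listener_yaw
    # Wrap the relative angle into (-pi, pi].
    azimuth = (azimuth + math.pi) % (2 * math.pi) - math.pi
    return gain, azimuth
```

In a shared scene, a server would evaluate such a function once per listener per source as position updates arrive, so that every participant hears the same virtual elements from their own vantage point.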
