Piivert: Percussion-based interaction for immersive virtual environments

3D graphical interaction offers a wide range of possibilities for musical applications. However, it also has several limitations that prevent it from being used as an efficient musical instrument. For example, input devices for 3D interaction and recent gaming devices are usually based on 3- or 6-degree-of-freedom tracking combined with push-buttons or joysticks. While buttons and joysticks do not provide sufficient resolution for musical gestures, graphical interaction based on tracking may bring enough expressivity but suffers from accuracy and haptic-feedback problems. Moreover, interaction based solely on tracking limits the possibilities offered by graphical interfaces in terms of musical gestures. We propose a new approach that separates the input modalities according to traditional musical gestures. This allows us to combine the possibilities of graphical interaction for selection and modulation gestures with the accuracy and expressivity of musical interaction for excitation gestures. We implement this approach with a new input device, Piivert, which combines 6-DOF tracking and pressure detection. We describe the associated interaction techniques and show how this new device can be valuable for immersive musical applications.
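The modality split described above can be pictured as a small event router: low-latency pressure events from the finger sensors drive excitation (note triggering with velocity), while the 6-DOF pose stream drives selection (e.g. ray-casting at virtual objects) and modulation (continuous parameters). The sketch below is a minimal illustration of this idea, not the actual Piivert implementation; all class and callback names (PiivertRouter, HitEvent, PoseEvent, etc.) are hypothetical.

```python
from dataclasses import dataclass
from typing import Callable, Tuple

# Hypothetical event types: a pressure "hit" from a finger sensor (excitation)
# and a 6-DOF pose sample from the tracker (selection / modulation).

@dataclass
class HitEvent:
    finger: int      # which finger sensor was struck
    pressure: float  # normalized peak pressure in [0, 1]

@dataclass
class PoseEvent:
    position: Tuple[float, float, float]            # tracker position (metres)
    orientation: Tuple[float, float, float, float]  # quaternion (x, y, z, w)

class PiivertRouter:
    """Minimal sketch of the modality separation: percussive pressure events
    drive excitation, the tracked pose drives selection and modulation.
    Names and callbacks are illustrative only."""

    def __init__(self,
                 trigger_note: Callable[[int, float], None],
                 select_target: Callable[[Tuple[float, float, float],
                                          Tuple[float, float, float, float]], None],
                 modulate: Callable[[float], None]):
        self.trigger_note = trigger_note    # excitation gesture -> sound onset
        self.select_target = select_target  # selection gesture -> ray-cast pick
        self.modulate = modulate            # modulation gesture -> continuous control

    def on_hit(self, event: HitEvent) -> None:
        # Excitation: map peak pressure to note velocity and fire immediately,
        # without going through the slower, less accurate tracking pipeline.
        velocity = max(0.0, min(1.0, event.pressure))
        self.trigger_note(event.finger, velocity)

    def on_pose(self, event: PoseEvent) -> None:
        # Selection: point a ray from the tracked pose at virtual objects.
        self.select_target(event.position, event.orientation)
        # Modulation: e.g. use height above the floor as a continuous parameter.
        self.modulate(event.position[1])


if __name__ == "__main__":
    router = PiivertRouter(
        trigger_note=lambda finger, vel: print(f"note on: finger {finger}, velocity {vel:.2f}"),
        select_target=lambda pos, ori: print(f"ray-cast from {pos}"),
        modulate=lambda value: print(f"modulation value {value:.2f}"),
    )
    router.on_hit(HitEvent(finger=2, pressure=0.8))
    router.on_pose(PoseEvent(position=(0.1, 1.2, -0.5), orientation=(0, 0, 0, 1)))
```

The point of the split is that the two event streams have different requirements: excitation needs low latency and fine temporal/dynamic resolution, while selection and modulation tolerate the coarser resolution of tracking but benefit from rich graphical feedback.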
