Force-Feedback Hand Controllers for Musical Interaction

This thesis investigates the possibility of exploiting haptic force-feedback technology for interaction with virtual musical instruments. A survey of current software solutions for creating haptic virtual environments is provided, together with a discussion of the need to integrate such a platform with established tools for audio research. A system was developed that combines a haptic programming library with a physical dynamics engine and exposes its functionality through the Open Sound Control (OSC) protocol, an increasingly accepted standard for communication among audio software and hardware. Using OSC messages, simple 3D objects can be instantiated and constraints on their movement specified, allowing physically dynamic mechanisms to be described. Collision events and object properties can be transmitted continuously to the audio system and used to modulate synthesis parameters. Examples of simple virtual musical instruments created with the aid of this system are presented.

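The abstract describes the OSC-based workflow only at a high level; the short sketch below illustrates what driving such a system over OSC might look like from a client script. It is written in Python with the python-osc library, and every OSC address pattern, argument list, and port number in it (e.g. /world/sphere/create, /world/collide, ports 7770/7771) is an assumed placeholder chosen for illustration, not the actual message vocabulary of the system described in the thesis.

```python
# Minimal sketch of an OSC client for a haptic/dynamics simulation.
# All address patterns and ports below are hypothetical placeholders.

from pythonosc.udp_client import SimpleUDPClient
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

SIM_HOST, SIM_PORT = "127.0.0.1", 7770      # simulation listens here (assumed)
RECV_HOST, RECV_PORT = "127.0.0.1", 7771    # where we receive replies (assumed)

# Ask the simulation to instantiate a sphere, constrain its movement with a
# hinge, and apply a force to set the mechanism in motion.
sim = SimpleUDPClient(SIM_HOST, SIM_PORT)
sim.send_message("/world/sphere/create", ["ball", 0.0, 0.1, 0.0])        # name, x, y, z
sim.send_message("/world/hinge/create", ["pivot", "ball", "world",       # constraint between
                                         0.0, 0.0, 0.0, 0.0, 0.0, 1.0])  # two bodies, axis z
sim.send_message("/world/ball/force", [0.0, 0.0, 0.5])                   # push the object

# Receive collision events so they can modulate synthesis parameters.
def on_collision(address, *args):
    obj_a, obj_b, velocity = args[0], args[1], float(args[2])
    # e.g. map collision velocity to the amplitude of a percussive voice
    print(f"collision {obj_a}/{obj_b}: velocity {velocity:.3f}")

dispatcher = Dispatcher()
dispatcher.map("/world/collide", on_collision)

server = BlockingOSCUDPServer((RECV_HOST, RECV_PORT), dispatcher)
server.handle_request()   # process one incoming collision message
```

In practice the collision handler would forward the reported velocity on to a synthesis engine, for instance as a further OSC message scaling the amplitude or brightness of a percussive voice, rather than printing it.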