Studies on customisation-driven digital music instruments
[1] Diemo Schwarz,et al. Rich Contacts: Corpus-Based Convolution of Contact Interaction Sound for Enhanced Musical Expression , 2014, NIME.
[2] Julius O. Smith,et al. A Tangible Virtual Vibrating String: A Physically Motivated Virtual Musical Instrument Interface , 2008, NIME.
[3] Kevin Karplus,et al. Digital Synthesis of Plucked-String and Drum Timbres , 1983 .
[4] Michael Gurevich,et al. Playing with Constraints: Stylistic Variation with a Simple Electronic Instrument , 2012, Computer Music Journal.
[5] Norbert Schnell,et al. Modular musical objects towards embodied control of digital music , 2011, Tangible and Embedded Interaction.
[6] Alexander Refsum Jensenius,et al. Musical Gestures: concepts and methods in research , 2010 .
[7] Wendy E. Mackay,et al. Triggers and barriers to customizing software , 1991, CHI.
[8] Naga K. Govindaraju,et al. Sound synthesis for impact sounds in video games , 2011, SI3D.
[9] C. Dennis Allen,et al. User customization of a word processor , 1996, CHI.
[10] Robert Xiao,et al. WorldKit: rapid and easy creation of ad-hoc interactive applications on everyday surfaces , 2013, CHI.
[11] Augusto Sarti,et al. Acoustic localization of tactile interactions for the development of novel tangible interfaces , 2005 .
[12] Andrew D. Wilson. TouchLight: an imaging touch screen and display for gesture-based interaction , 2004, ICMI '04.
[13] Alexander Refsum Jensenius,et al. To gesture or Not? An Analysis of Terminology in NIME Proceedings 2001-2013 , 2014, NIME.
[14] Orit Shaer,et al. Reality-based interaction: a framework for post-WIMP interfaces , 2008, CHI.
[15] M. Leman. Embodied Music Cognition and Mediation Technology , 2007 .
[16] Roberto Mario Aimi. Hybrid percussion : extending physical instruments using sampled acoustics , 2007 .
[17] Yang Li,et al. Gesture coder: a tool for programming multi-touch gestures by demonstration , 2012, CHI.
[18] Toby Gifford,et al. Should Music Interaction Be Easy? , 2013, Music and Human-Computer Interaction.
[19] Lynn Nielsen-Bohlman,et al. The toolkit , 2003, Occasional paper.
[20] Sergi Jordà,et al. Digital Instruments and Players: Part I - Efficiency and Apprenticeship , 2004, NIME.
[21] Lawrence R. Rabiner,et al. A tutorial on hidden Markov models and selected applications in speech recognition , 1989, Proc. IEEE.
[22] Stefania Serafin,et al. The sound of friction: Real-time models, playability and musical applications , 2004 .
[23] Thomas B. Moeslund,et al. A procedure for developing intuitive and ergonomic gesture interfaces for man-machine interaction , 2003 .
[24] Sergi Jordà,et al. Digital Instruments and Players: Part II-Diversity, Freedom and Control , 2004, ICMC.
[25] Xavier Rodet,et al. Spectral Envelope Estimation and Representation for Sound Analysis-Synthesis , 1999, ICMC.
[26] Joseph A. Paradiso,et al. Passive acoustic knock tracking for interactive windows , 2002, CHI Extended Abstracts.
[27] Jeremy R. Cooperstock,et al. Enabling gestural interaction by means of tracking dynamical systems models and assistive feedback , 2007, 2007 IEEE International Conference on Systems, Man and Cybernetics.
[28] Sam Savage. Interactive Simulation , 2006, Proceedings of the 2006 Winter Simulation Conference.
[29] Mihaly Csikszentmihalyi. Flow: The Psychology of Optimal Experience , 1991 .
[30] Norbert Schnell,et al. Playing the "MO" - Gestural Control and Re-Embodiment of Recorded Sound and Music , 2011, NIME.
[31] Perry R. Cook,et al. Human model evaluation in interactive supervised learning , 2011, CHI.
[32] Desney S. Tan,et al. Humantenna: using the body as an antenna for real-time whole-body interaction , 2012, CHI.
[33] Yang Li,et al. User-defined motion gestures for mobile interaction , 2011, CHI.
[34] Hiroshi Ishii,et al. Audiopad: A Tag-based Interface for Musical Performance , 2002, NIME.
[35] Serge Lemouton,et al. The Augmented String Quartet: Experiments and Gesture Following , 2012 .
[36] Norbert Schnell,et al. Mapping Through Listening , 2014, Computer Music Journal.
[37] Robert Xiao,et al. Acoustic barcodes: passive, durable and inexpensive notched identification tags , 2012, UIST.
[38] Andrew P. McPherson,et al. Dimensionality and Appropriation in Digital Musical Instrument Design , 2014, NIME.
[39] Kjetil Falkenberg Hansen. The turntable: The instrument of hip-hop , 2015 .
[40] Jonathan Savage,et al. DubDubDub: improvisation using the sounds of the World Wide Web , 2007 .
[41] D. Schwarz,et al. Corpus-Based Concatenative Synthesis , 2007, IEEE Signal Processing Magazine.
[42] Rama Chellappa,et al. Machine Recognition of Human Activities: A Survey , 2008, IEEE Transactions on Circuits and Systems for Video Technology.
[43] Andrew P. McPherson,et al. The space between the notes: adding expressive pitch control to the piano keyboard , 2013, CHI.
[44] Joseph A. Paradiso,et al. Tracking and characterizing knocks atop large interactive displays , 2005 .
[45] Jennie Carroll,et al. Completing design in use: closing the appropriation cycle , 2004, ECIS.
[46] Meredith Ringel Morris,et al. User-defined gestures for surface computing , 2009, CHI.
[47] Pietro Polotti,et al. Tangible Acoustic Interfaces and their Applications for the Design of New Musical Instruments , 2005, NIME.
[48] Nisha Checka,et al. A system for tracking and characterizing acoustic impacts on large interactive surfaces , 2001 .
[49] Chris Harrison,et al. Scratch input: creating large, inexpensive, unpowered and mobile finger input surfaces , 2008, UIST '08.
[50] Desney S. Tan,et al. Skinput: appropriating the body as an input surface , 2010, CHI.
[51] Davide Rocchesso,et al. Gamelunch: forging a dining experience through sound , 2008, CHI Extended Abstracts.
[52] Judith C. Brown. Calculation of a constant Q spectral transform , 1991 .
[53] Sile O'Modhrain,et al. An enactive approach to the design of new tangible musical instruments , 2006, Organised Sound.
[54] A. Clark,et al. The Extended Mind , 1998, Analysis.
[55] M. Sile O'Modhrain,et al. PebbleBox and CrumbleBag: Tactile Interfaces for Granular Synthesis , 2004, NIME.
[56] Rebecca Fiebrink. Real-time interaction with supervised learning , 2010, CHI EA '10.
[57] Cédric Bornand,et al. Transforming Daily Life Objects into Tactile Interfaces , 2008, EuroSSC.
[58] Matthew Wright,et al. Problems and Prospects for Intimate Musical Control of Computers , 2002, Computer Music Journal.
[59] Gianpaolo Evangelista,et al. Physically Inspired Models for the Synthesis of Stiff Strings with Dispersive Waveguides , 2004, EURASIP Journal on Applied Signal Processing.
[60] Mark T. Marshall,et al. The augmentalist: enabling musicians to develop augmented musical instruments , 2011, Tangible and Embedded Interaction.
[61] Aaron F. Bobick,et al. Parametric Hidden Markov Models for Gesture Recognition , 1999, IEEE Trans. Pattern Anal. Mach. Intell..
[62] Sagnik Sinha,et al. Pitch tracking of acoustic signals based on average squared mean difference function , 2007, Signal Image Video Process..
[63] Joseph A. Paradiso,et al. Passive acoustic sensing for tracking knocks atop large interactive displays , 2002, Proceedings of IEEE Sensors.
[64] Olivier Chapuis,et al. Using rhythmic patterns as an input method , 2012, CHI.
[65] Mathias Fink,et al. Acoustic time-reversal mirrors , 2001 .
[66] Miller Puckette. Infuriating Nonlinear Reverberator , 2011, ICMC.
[67] Davide Rocchesso,et al. The Sounding Object , 2002 .
[68] Ayah Bdeir,et al. Electronics as material: littleBits , 2010, TEI.
[69] David G. Long,et al. Array signal processing , 1985, IEEE Trans. Acoust. Speech Signal Process..
[70] Jay Lee,et al. Bottles as a minimal interface to access digital information , 2001, CHI Extended Abstracts.
[71] Chris Harrison,et al. OmniTouch: wearable multitouch interaction everywhere , 2011, UIST.
[72] David Shaw,et al. Makey Makey: improvising tangible and nature-based user interfaces , 2012, TEI.
[73] Hayes Raffle,et al. The sound of touch: physical manipulation of digital sound , 2008, CHI.
[74] Carl Gutwin,et al. Learning from Games: HCI Design Innovations in Entertainment Software , 2003, Graphics Interface.
[75] Miha Ciglar. An Ultrasound Based Instrument Generating Audible and Tactile Sound , 2010, NIME.
[76] Norbert Schnell,et al. MnM: a Max/MSP mapping toolbox , 2005, NIME.
[77] Daniel Schlessinger,et al. The Kalichord: A Physically Modeled Electro-Acoustic Plucked String Instrument , 2009, NIME.
[78] Norbert Schnell,et al. Wireless sensor interface and gesture-follower for music pedagogy , 2007, NIME '07.
[79] Andrew P. McPherson,et al. Optical Measurement of Acoustic Drum Strike Locations , 2014, NIME.
[80] Desney S. Tan,et al. Your noise is my command: sensing gestures using the body as an antenna , 2011, CHI.
[81] Rafael Ballagas,et al. Unravelling seams: improving mobile gesture recognition with visual feedback techniques , 2009, CHI.
[82] Hari Balakrishnan,et al. The Cricket Location-Support System , 2000, 6th ACM/IEEE International Conference on Mobile Computing and Networking (MobiCom '00).
[83] John Williamson,et al. Continuous uncertain interaction , 2006 .
[84] Julius O. Smith,et al. Extensions of the Karplus-Strong Plucked-String Algorithm , 1983 .
[85] Panu Korpipää,et al. ActionCube: a tangible mobile gesture interaction tutorial , 2008, Tangible and Embedded Interaction.
[86] Sergi Jordà,et al. The reacTable: exploring the synergy between live music performance and tabletop tangible interfaces , 2007, TEI.
[87] Youngmoo E. Kim,et al. The Problem of the Second Performer: Building a Community Around an Augmented Piano , 2012, Computer Music Journal.
[88] Hideki Kawahara,et al. YIN, a fundamental frequency estimator for speech and music , 2002, The Journal of the Acoustical Society of America.
[89] Nicola Orio,et al. Evaluation of Input Devices for Musical Expression: Borrowing Tools from HCI , 2001, Computer Music Journal.
[90] Roderick Murray-Smith,et al. Stane: synthesized surfaces for tactile input , 2008, CHI.
[91] Massimo Bergamasco,et al. Real-Time Gesture Recognition, Evaluation and Feed-Forward Correction of a Multimodal Tai-Chi Platform , 2008, HAID.
[92] S. Shyam Sundar,et al. What drives customization?: control or identity? , 2011, CHI.
[93] Joseph A. Paradiso,et al. The Chameleon Guitar—Guitar with a Replaceable Resonator , 2011 .
[94] Ming C. Lin,et al. Example-guided physically based modal sound synthesis , 2013, ACM Trans. Graph..
[95] Donald A. Norman,et al. The invisible computer , 1998 .
[96] Marcelo M. Wanderley,et al. Unsounding Objects: Audio Feature Extraction for the Control of Sound Synthesis , 2014, NIME.
[97] Hiroshi Ishii,et al. Emerging frameworks for tangible user interfaces , 2000, IBM Syst. J..
[98] Davide Rocchesso,et al. Interactive Simulation of rigid body interaction with friction-induced sound generation , 2005, IEEE Transactions on Speech and Audio Processing.
[99] Pedro Lopes,et al. Augmenting touch interaction through acoustic sensing , 2011, ITS '11.
[100] Paul Dourish,et al. Where the action is , 2001 .
[101] Ming Yang,et al. A Novel Human-Computer Interface Based on Passive Acoustic Localisation , 2007, HCI.
[102] Chris Harrison,et al. TapSense: enhancing finger interaction on touch surfaces , 2011, UIST.
[103] Joseph A. Paradiso,et al. Swept-frequency, magnetically-coupled resonant tags for realtime, continuous, multiparameter control , 1999, CHI EA '99.
[104] F. Richard Moore,et al. The Dysfunctions of MIDI , 1988, ICMC.
[105] Susan Gasson,et al. HUMAN-CENTERED VS. USER-CENTERED APPROACHES TO INFORMATION SYSTEM DESIGN , 2003 .
[106] Serge Lemouton,et al. The augmented violin project: research, composition and performance report , 2006, NIME.
[107] J. B. Brooke,et al. SUS: A 'Quick and Dirty' Usability Scale , 1996 .
[108] John Richards,et al. Beyond DIY in Electronic Music , 2013, Organised Sound.
[109] Steve Mann. Natural interfaces for musical expression: physiphones and a physics-based organology , 2007, NIME '07.
[110] Gil Weinberg,et al. The Squeezables: Toward an Expressive and Interdependent Multi-player Musical Instrument , 2001, Computer Music Journal.
[111] Scott R. Klemmer,et al. Authoring sensor-based interactions by demonstration with direct manipulation and pattern recognition , 2007, CHI.
[112] Joseph A. Paradiso,et al. PingPongPlus: design of an athletic-tangible interface for computer-supported cooperative play , 1999, CHI '99.
[113] Julius O. Smith,et al. Physical Modeling Using Digital Waveguides , 1992 .
[114] Davide Rocchesso,et al. A toolkit for explorations in sonic interaction design , 2010, Audio Mostly Conference.
[115] Jerry Alan Fails,et al. A design tool for camera-based interaction , 2003, CHI '03.
[116] Perry R. Cook,et al. Principles for Designing Computer Music Controllers , 2001, NIME.
[117] Atau Tanaka,et al. Adaptive Gesture Recognition with Variation Estimation for Interactive Systems , 2014, ACM Trans. Interact. Intell. Syst..
[118] Axel Röbel,et al. MuBu and Friends - Assembling Tools for Content Based Real-Time Interactive Audio Processing in Max/MSP , 2009, ICMC.
[119] Norbert Schnell,et al. Continuous Realtime Gesture Following and Recognition , 2009, Gesture Workshop.
[120] Perry R. Cook,et al. Toward physically-informed parametric synthesis of sound effects , 1999, Proceedings of the 1999 IEEE Workshop on Applications of Signal Processing to Audio and Acoustics. WASPAA'99 (Cat. No.99TH8452).
[121] Thor Magnusson,et al. Epistemic tools : the phenomenology of digital musical instruments , 2009 .
[122] Jerry Alan Fails,et al. Interactive machine learning , 2003, IUI '03.
[123] Graham Pullin,et al. Tactophonics: your favourite thing wants to sing , 2007, NIME '07.
[124] Alan J. Dix,et al. Designing for appropriation , 2007, BCS HCI.