How to design and build new musical interfaces
[1] Roel Vertegaal,et al. Comparison of Input Devices in an ISEE Direct Timbre Manipulation Task , 1996, Interact. Comput..
[2] Dinesh K. Pai,et al. JASS: A Java Audio Synthesis System for Programmers , 2001 .
[3] Perry R. Cook,et al. A Meta-Wind-Instrument Physical Model, and a Meta-Controller for Real-Time Performance Control , 1992, ICMC.
[4] Robert Huott,et al. An Interface for Precise Musical Control , 2002, NIME.
[5] Sergi Jordà,et al. Sonigraphical Instruments: From FMOL to the reacTable* , 2003, NIME.
[6] James A. Moorer,et al. The Use of the Phase Vocoder in Computer Music Applications , 1976 .
[7] Michael J. Lyons,et al. Facing the music: a facial action controlled musical interface , 2001, CHI Extended Abstracts.
[8] Sidney S. Fels,et al. MetaMuse: a novel control metaphor for granular synthesis , 2002, CHI Extended Abstracts.
[9] Jean Laroche,et al. Multichannel excitation/filter modeling of percussive sounds with application to the piano , 1994, IEEE Trans. Speech Audio Process..
[10] Marc Le Brun,et al. Digital Waveshaping Synthesis , 1979 .
[11] Christoph Bartneck,et al. HCI and the Face: Towards an Art of the Soluble , 2007, HCI.
[12] George Tzanetakis,et al. Blending the physical and the virtual in music technology: from interface design to multi-modal signal processing , 2013, ACM Multimedia.
[13] Matthew Wright,et al. Open SoundControl: A New Protocol for Communicating with Sound Synthesizers , 1997, ICMC.
[15] Sergi Jordà,et al. Digital Instruments and Players: Part I - Efficiency and Apprenticeship , 2004, NIME.
[16] Eric Singer,et al. Sonic Banana: A Novel Bend-Sensor-Based MIDI Controller , 2003, NIME.
[17] Tom Igoe,et al. Making Things Talk , 2007 .
[18] Michael J. Lyons. Facial gesture interfaces for expression and communication , 2004, 2004 IEEE International Conference on Systems, Man and Cybernetics (IEEE Cat. No.04CH37583).
[19] Shuji Hashimoto,et al. EyesWeb: Toward Gesture and Affect Recognition in Interactive Dance and Music Systems , 2000, Computer Music Journal.
[20] M. Sile O'Modhrain,et al. PebbleBox and CrumbleBag: Tactile Interfaces for Granular Synthesis , 2004, NIME.
[21] Florian Vogt,et al. Tooka: Exploration of Two Person Instruments , 2002, NIME.
[22] Michael J. Lyons,et al. Designing, Playing, and Performing with a Vision-based Mouth Interface , 2003, NIME.
[23] Loïc Kessous,et al. Strategies of mapping between gesture data and synthesis model parameters using perceptual spaces , 2002, Organised Sound.
[24] Sidney S. Fels,et al. Walk the Walk, Talk the Talk , 2008, 2008 12th IEEE International Symposium on Wearable Computers.
[25] Marcelo M. Wanderley,et al. Mapping performer parameters to synthesis engines , 2002, Organised Sound.
[26] Norbert Schnell,et al. Proceedings of the 2006 conference on New interfaces for musical expression , 2006 .
[27] Sidney S. Fels,et al. GRASSP: Gesturally-Realized Audio, Speech and Song Performance , 2006, NIME.
[28] Perry R. Cook,et al. Principles for Designing Computer Music Controllers , 2001, NIME.
[29] Florian Vogt,et al. Tongue 'n' Groove: An Ultrasound based Music Controller , 2002, NIME.
[30] Mark T. Marshall,et al. Physical Interface Design for Digital Musical Instruments , 2009 .
[31] I. Xenakis,et al. Formalized Music: Thought and Mathematics in Composition , 1971 .
[32] Sidney S. Fels,et al. Iamascope: a graphical musical instrument , 1999, Comput. Graph..
[33] Perry R. Cook,et al. TBone: An Interactive WaveGuide Brass Instrument Synthesis Workbench for the NeXT Machine , 1991, ICMC.
[34] Cléo Palacio-Quintin,et al. The Hyper-Flute , 2003, NIME.
[35] Ge Wang,et al. Designing Smule's Ocarina: The iPhone's Magic Flute , 2009, NIME.
[36] Dan Overholt,et al. The MATRIX: A Novel Controller for Musical Expression , 2001, NIME.
[37] Camille Goudeseune,et al. Interpolated mappings for musical instruments , 2002, Organised Sound.
[38] Roel Vertegaal,et al. Towards a musician's cockpit: Transducers, feedback and musical function , Quarterly Progress and Status Report, 2007 .
[39] Marcelo M. Wanderley,et al. The Importance of Parameter Mapping in Electronic Instrument Design , 2002, NIME.
[40] Axel G. E. Mulder. Towards a choice of gestural constraints for instrumental performers , 2000 .
[41] Curtis Roads,et al. The Computer Music Tutorial , 1996 .
[42] Phil Clendeninn. The Vocoder , 1940, Nature.
[43] Kazuhiro Kuwabara,et al. Sonification of Facial Actions for Musical Expression , 2005, NIME.
[44] Stefania Serafin,et al. Proceedings of the International Computer Music Conference , 2007 .
[45] Michael J. Lyons,et al. Interaction and Music Technology , 2011, INTERACT.
[46] John M. Chowning,et al. The Synthesis of Complex Audio Spectra by Means of Frequency Modulation , 1973 .
[47] Kenneth Steiglitz,et al. A digital signal processing primer - with applications to digital audio and computer music , 1996 .
[48] Michael J. Lyons, Michael Haehnel, Nobuji Tetsutani. The Mouthesizer: A Facial Gesture Musical Interface , SIGGRAPH Sketches & Applications, 2001 .
[49] Michael J. Lyons,et al. Creating new interfaces for musical expression: introduction to NIME , 2009, SIGGRAPH '09.
[51] Marcelo M. Wanderley,et al. Trends in Gestural Control of Music , 2000 .
[52] Michael J. Lyons,et al. Introduction to designing and building musical interfaces , 2014, CHI Extended Abstracts.
[53] Sidney S. Fels,et al. MetaMuse: Metaphors for Expressive Instruments , 2002, NIME.
[54] Marcelo M. Wanderley,et al. New Digital Musical Instruments: Control And Interaction Beyond the Keyboard (Computer Music and Digital Audio Series) , 2006 .
[55] Perry R. Cook,et al. "On-the-fly Programming: Using Code as an Expressive Musical Instrument" , 2004, NIME.
[56] Atau Tanaka,et al. Multimodal Interaction in Music Using the Electromyogram and Relative Position Sensing , 2002, NIME.
[57] Darren M. Simon,et al. SPASM: A Real-time Vocal Tract Physical Model , 2017 .
[58] Annett Baier,et al. Current Directions In Computer Music Research , 2016 .
[59] D. Gabor. Acoustical Quanta and the Theory of Hearing , 1947, Nature.
[60] Joseph A. Paradiso,et al. Musical Applications of Electric Field Sensing , 1997 .
[61] Craig Stuart Sapp,et al. A Course on Controllers , 2001, NIME.
[62] Michael J. Lyons,et al. Ambient Display using Musical Effects , 2006, IUI '06.
[63] Tina Blaine,et al. The Jam-O-Drum interactive music system: a study in interaction design , 2000, DIS '00.
[64] Golan Levin,et al. Sounds from Shapes: Audiovisual Performance with Hand Silhouette Contours in The Manual Input Sessions , 2005, NIME.
[65] Hans-Christoph Steiner,et al. Towards a catalog and software library of mapping methods , 2006, NIME.
[66] Perry R. Cook,et al. LECTOR: An Ecclesiastical Latin Control Language for the SPASM/singer Instrument , 1991, ICMC.
[67] Björn Hartmann,et al. OROBORO: a collaborative controller with interpersonal haptic feedback , 2005 .
[68] Michel Waisvisz,et al. The Hands: A Set of Remote MIDI-Controllers , 1985, ICMC.
[69] David Wessel,et al. Timbre Space as a Musical Control Structure , 1979 .
[70] Michael J. Lyons,et al. Design and Implementation of a Mobile Exergaming Platform , 2009, INTETAIN.
[71] Diana Young,et al. The Hyperbow Controller: Real-Time Dynamics Measurement of Violin Performance , 2002, NIME.
[72] Julius O. Smith,et al. PARSHL: An Analysis/Synthesis Program for Non-Harmonic Sounds Based on a Sinusoidal Representation , 1987, ICMC.
[73] Axel G. E. Mulder. Sound Sculpting : Performing with Virtual Musical Instruments , 1998 .
[74] Perry R. Cook,et al. SPASM, a Real-Time Vocal Tract Physical Model Controller; and Singer, the Companion Software Synthesis System , 1993 .
[75] James A. Moorer,et al. The Use of Linear Prediction of Speech in Computer Music Applications , 1979 .
[76] Unto K. Laine,et al. Transmission-Line Modeling and Real-Time Synthesis of String and Wind Instruments , 1991, ICMC.
[77] Thomas F. Quatieri,et al. Speech analysis/Synthesis based on a sinusoidal representation , 1986, IEEE Trans. Acoust. Speech Signal Process..
[78] Julius O. Smith,et al. Music applications of digital waveguides , 1987 .
[79] Don Buchla,et al. A History of Buchla's Musical Instruments , 2005, NIME.
[81] Leonello Tarabella,et al. Music, communication, technology , 2005 .
[82] Perry R. Cook,et al. Remutualizing the Musical Instrument: Co-Design of Synthesis Algorithms and Controllers , 2004 .
[83] David Merrill,et al. Head-Tracking for Gestural and Continuous Control of Parameterized Audio Effects , 2003, NIME.
[84] F. Richard Moore,et al. The Dysfunctions of MIDI , 1988, ICMC.
[86] Mark Dolson,et al. The Phase Vocoder: A Tutorial , 1986 .
[87] Sidney Fels,et al. Mapping transparency through metaphor: towards more expressive musical instruments , 2002, Organised Sound.
[88] J. Makhoul,et al. Linear prediction: A tutorial review , 1975, Proceedings of the IEEE.
[89] R. Benjamin Knapp,et al. A Bioelectric Controller for Computer Music Applications , 1990 .
[90] Michael J. Lyons,et al. Supporting Empathy in Online Learning with Artificial Expressions , 2005, J. Educ. Technol. Soc..
[91] Curtis Roads,et al. Asynchronous granular synthesis , 1991 .
[92] Matthew Wright,et al. Problems and Prospects for Intimate Musical Control of Computers , 2002, Computer Music Journal.
[93] Takuro Mizuta Lippit,et al. Realtime Sample System for the Turntablist version 2: 16padjoystickcontroller , 2004, NIME.
[94] Gideon D'Arcangelo,et al. Recycling Music, Answering Back: Toward an Oral Tradition of Electronic Music , 2004, NIME.
[96] Yoichi Nagashima,et al. Bio-Sensing Systems and Bio-Feedback Systems for Interactive Media Arts , 2003, NIME.
[97] Joel Chadabe,et al. The Limitations of Mapping and a Structural Descriptive in Electronic Instruments , 2002, NIME.
[98] Michael J. Lyons,et al. A Novel Face-tracking Mouth Controller and its Application to Interacting with Bioacoustic Models , 2004, NIME.
[99] R. T. Schumacher,et al. On the Oscillations of Musical Instruments , 1983 .
[100] Michael J. Lyons,et al. Creating new interfaces for musical expression , 2013, SA '13.
[101] Peter F. Driessen,et al. Audio-Based Gesture Extraction on the ESitar Controller , 2004 .
[102] Joseph A. Paradiso,et al. The Brain Opera Technology: New Instruments and Gestural Sensors for Musical Interaction and Performance , 1999 .
[103] Ivan Poupyrev,et al. New interfaces for musical expression , 2001, CHI Extended Abstracts.
[104] John Wawrzynek,et al. VLSI models for sound synthesis , 1989 .
[105] Alexander Refsum Jensenius,et al. SIG NIME: music, technology, and human-computer interaction , 2013, CHI Extended Abstracts.
[106] Marcelo M. Wanderley,et al. Instrumental Gestural Mapping Strategies as Expressivity Determinants in Computer Music Performance , 1997 .
[107] Geoffrey E. Hinton,et al. Glove-TalkII-a neural-network interface which maps gestures to parallel formant speech synthesizer controls , 1997, IEEE Trans. Neural Networks.
[108] Sidney S. Fels,et al. Evolving Tooka: from Experiment to Instrument , 2004, NIME.
[110] Max V. Mathews,et al. Scanned Synthesis , 2000, ICMC.
[111] Victor Lazzarini. Erratum: New Digital Musical Instruments: Control and Interaction Beyond the Keyboard , 2008, Computer Music Journal.
[113] Sidney Fels,et al. Collaborative Musical Experiences for Novices , 2003 .
[114] Barry Truax,et al. Real-Time Granular Synthesis with a Digital Signal Processor , 1988 .
[115] Robert A. Boie,et al. The Radio Drum as a Synthesizer Controller , 1989, ICMC.