Mapping Through Listening
Norbert Schnell | Jules Françoise | Baptiste Caramiaux | Frédéric Bevilacqua
[1] Tuomas Eerola, et al. Formulating a Revised Taxonomy for Modes of Listening, 2012.
[2] Stefan Kopp, et al. Gesture in Embodied Communication and Human-Computer Interaction: 8th International Gesture Workshop, GW 2009, Bielefeld, Germany, February 25-27, 2009, Revised Selected Papers, 2010.
[3] William W. Gaver. How Do We Hear in the World?: Explorations in Ecological Acoustics, 1993.
[4] Norbert Schnell, et al. Analysing Gesture and Sound Similarities with a HMM-based Divergence Measure, 2010.
[5] Norbert Schnell, et al. Online Gesture Analysis and Control of Audio Processing, 2011.
[6] Baptiste Caramiaux, et al. Realtime Segmentation and Recognition of Gestures Using Hierarchical Markov Models, 2022.
[7] Marcelo M. Wanderley, et al. Trends in Gestural Control of Music, 2000.
[8] Perry R. Cook, et al. Real-time human interaction with supervised learning algorithms for music composition and performance, 2011.
[9] Marcelo M. Wanderley, et al. On the Choice of Mappings Based On Geometric Properties, 2004, NIME.
[10] Michel Chion. Guide des objets sonores: Pierre Schaeffer et la recherche musicale, 1983.
[11] Diemo Schwarz, et al. Scalability in Content-Based Navigation of Sound Databases, 2009, ICMC.
[12] E. Saltzman, et al. The Power of Listening: Auditory-Motor Interactions in Musical Training, 2005, Annals of the New York Academy of Sciences.
[13] Jens Haueisen, et al. Involuntary Motor Activity in Pianists Evoked by Music Perception, 2001, Journal of Cognitive Neuroscience.
[14] Nicola Orio, et al. Evaluation of Input Devices for Musical Expression: Borrowing Tools from HCI, 2001, Computer Music Journal.
[15] Edward W. Large, et al. Perceiving temporal regularity in music, 2002, Cogn. Sci.
[16] Mireille Besson, et al. Temporal Semiotic Units as minimal meaningful units in music? An electrophysiological approach, 2009.
[17] Ali Momeni, et al. Dynamic Independent Mapping Layers for Concurrent Control of Audio and Video Synthesis, 2006, Computer Music Journal.
[18] Pieter-Jan Maes, et al. An empirical study of embodied music listening, and its applications in mediation technology, 2012.
[19] Jim Tørresen, et al. Analyzing sound tracings: a multimodal approach to music information retrieval, 2011, MIRUM '11.
[20] Atau Tanaka, et al. Machine Learning of Musical Gestures, 2013, NIME.
[21] Marcelo M. Wanderley, et al. Gestural control of sound synthesis, 2004, Proceedings of the IEEE.
[22] Norbert Schnell, et al. Towards a Gesture-Sound Cross-Modal Analysis, 2009, Gesture Workshop.
[23] Loïc Kessous, et al. Strategies of mapping between gesture data and synthesis model parameters using perceptual spaces, 2002, Organised Sound.
[24] Marina Basu. The Embodied Mind: Cognitive Science and Human Experience, 2004.
[25] M. Leman. Embodied Music Cognition and Mediation Technology, 2007.
[26] Claude Cadoz, et al. Haptics in computer music: a paradigm shift, 2004, ArXiv.
[27] Kia Ng, et al. Musical Robots and Interactive Multimodal Systems, 2011.
[28] Norbert Schnell, et al. A multimodal probabilistic model for gesture-based control of sound synthesis, 2013, MM '13.
[29] Alva Noë, et al. Action in Perception, 2006, Representation and Mind.
[30] A. Liberman, et al. The motor theory of speech perception revised, 1985, Cognition.
[31] Claude Cadoz, et al. Instrumental Gestures and Musical Composition, 1988, ICMC.
[32] Marcelo M. Wanderley, et al. Instrumental Gestural Mapping Strategies as Expressivity Determinants in Computer Music Performance, 1997.
[33] Norbert Schnell, et al. Modular musical objects towards embodied control of digital music, 2011, Tangible and Embedded Interaction.
[34] Marc Leman, et al. Sharing Musical Expression Through Embodied Listening: A Case Study Based on Chinese Guqin Music, 2009.
[35] William W. Gaver. What in the World Do We Hear? An Ecological Approach to Auditory Event Perception, 1993.
[36] R. Zatorre, et al. When the brain plays music: auditory-motor interactions in music perception and production, 2007, Nature Reviews Neuroscience.
[37] Adrien Merer. Caractérisation acoustique et perceptive du mouvement évoqué par les sons pour le contrôle de la synthèse, 2011.
[38] Pierre Schaeffer. Traité des objets musicaux, 1966.
[39] Perry R. Cook, et al. Human model evaluation in interactive supervised learning, 2011, CHI.
[40] Patrick Susini, et al. The Role of Sound Source Perception in Gestural Sound Description, 2014, TAP.
[41] Rolf Inge Godøy, et al. Gestural-Sonorous Objects: embodied extensions of Schaeffer's conceptual apparatus, 2006, Organised Sound.
[42] G. Rizzolatti, et al. Speech listening specifically modulates the excitability of tongue muscles: a TMS study, 2002, European Journal of Neuroscience.
[43] Doug Van Nort, et al. Instrumental Listening: sonic gesture as design principle, 2009, Organised Sound.
[44] Atau Tanaka, et al. Adaptive Gesture Recognition with Variation Estimation for Interactive Systems, 2014, ACM Trans. Interact. Intell. Syst.
[45] Norbert Schnell, et al. Continuous Realtime Gesture Following and Recognition, 2009, Gesture Workshop.
[46] E. Large. On synchronizing movements to music, 2000.
[47] Jules Françoise, et al. A Hierarchical Approach for the Design of Gesture-to-Sound Mappings, 2012.
[48] Alexander Refsum Jensenius, et al. Exploring Music-Related Gestures by Sound-Tracing - A Preliminary Study, 2006.
[49] Ross Kirk, et al. Mapping Strategies for Musical Performance, 2000.
[50] Mats B. Küssner, et al. Music and shape, 2013, Lit. Linguistic Comput.
[51] M. Merleau-Ponty. Phénoménologie de la perception, 1950.
[52] Marcelo M. Wanderley, et al. New Digital Musical Instruments: Control and Interaction Beyond the Keyboard (Computer Music and Digital Audio Series), 2006.