Culture and Politics of Machine Learning in NIME: A Preliminary Qualitative Inquiry
[1] The Gender Gap and the Computer Music Narrative - On the Under-Representation of Women at Computer Music Conferences, 2021, Array. The Journal of the ICMA.
[2] William Agnew, et al. The Values Encoded in Machine Learning Research, 2021, FAccT.
[3] Koray Tahiroğlu, et al. AI-terity 2.0: An Autonomous NIME Featuring GANSpaceSynth Deep Learning Model, 2021, NIME.
[4] Raul Masu, et al. NIME and the Environment: Toward a More Sustainable NIME Practice, 2021, NIME.
[5] Jon Gillick, et al. What to Play and How to Play it: Guiding Generative Music Models with Multiple Demonstrations, 2021, NIME.
[6] Charles Patrick Martin, et al. A Laptop Ensemble Performance System using Recurrent Neural Networks, 2020, NIME.
[7] Baptiste Caramiaux, et al. Artificial Intelligence in Music and Performance: A Subjective Art-Research Inquiry, 2020, Handbook of Artificial Intelligence for Music.
[8] Andrew P. McPherson, et al. A NIME Of The Times: Developing an Outward-Looking Political Agenda For This Community, 2020, NIME.
[9] Shakir Mohamed, et al. Decolonial AI: Decolonial Theory as Sociotechnical Foresight in Artificial Intelligence, 2020, Philosophy & Technology.
[10] Adnan Marquez-Borbon, et al. Addressing NIME's Prevailing Sociotechnical, Political, and Epistemological Exigencies, 2020, Computer Music Journal.
[11] Rebecca Fiebrink, et al. Reflections on Eight Years of Instrument Creation with Machine Learning, 2020, NIME.
[12] Atau Tanaka, et al. Digital Musical Instruments as Probes: How computation changes the mode-of-being of musical instruments, 2020, Organised Sound.
[13] S. Merz. Race after technology. Abolitionist tools for the new Jim Code, 2020, Ethnic and Racial Studies.
[14] Alexandre Lacoste, et al. Quantifying the Carbon Emissions of Machine Learning, 2019, arXiv.
[15] Sarah Fdili Alaoui, et al. Making an Interactive Dance Piece: Tensions in Integrating Technology in Art, 2019, Conference on Designing Interactive Systems.
[16] Os Keyes, et al. Human-Computer Insurrection: Notes on an Anarchist HCI, 2019, CHI.
[17] Anna Xambó, et al. Who Are the Women Authors in NIME? Improving Gender Balance in NIME Research, 2018, NIME.
[18] Tom Feltwell, et al. "Grand Visions" for Post-Capitalist Human-Computer Interaction, 2018, CHI Extended Abstracts.
[19] Safiya Noble, et al. Algorithms of Oppression, 2018.
[20] Fabio Morreale, et al. Design for longevity: ongoing use of instruments from NIME 2010-14, 2017, NIME.
[21] Ali Momeni, et al. ml.lib: robust, cross-platform, open-source machine learning for Max and Pure Data, 2015, NIME.
[22] Shaowen Bardzell, et al. Feminist HCI: taking stock and outlining an agenda for design, 2010, CHI.
[23] P. Janata, et al. Embodied music cognition and mediation technology, 2009.
[24] V. Braun, et al. Using thematic analysis in psychology, 2006.
[25] Georg Essl, et al. On gender in new music interface technology, 2003, Organised Sound.
[26] Richard E. Boyatzis, et al. Transforming Qualitative Information: Thematic Analysis and Code Development, 1998.
[27] Geoffrey E. Hinton, et al. Glove-TalkII: an adaptive gesture-to-formant interface, 1995, CHI '95.
[28] D. Ihde. Technology and the lifeworld: from garden to earth, 1991.
[29] B. Latour. Technology is Society Made Durable, 1990.
[30] Anıl Çamcı, et al. Latent Drummer: A New Abstraction for Modular Sequencers, 2022, NIME.
[31] J. Jaimovich, et al. Being (A)part of NIME: Embracing Latin American Perspectives, 2022, NIME.
[32] Andrew McPherson, et al. Quantitative evaluation of aspects of embodiment in new digital musical instruments, 2022, NIME.
[33] A. Jensenius, et al. CAVI: A Coadaptive Audiovisual Instrument-Composition, 2022, NIME.
[34] Bob L. Sturm, et al. De-centering the West: East Asian Philosophies and the Ethics of Applying Artificial Intelligence to Music, 2021, ISMIR.
[35] Eduardo Reck Miranda (ed.). Handbook of Artificial Intelligence for Music: Foundations, Advanced Approaches, and Developments for Creativity, 2021.
[36] Christodoulos Benetatos, et al. BachDuet: A Deep Learning System for Human-Machine Counterpoint Improvisation, 2020, NIME.
[37] Jim Tørresen, et al. Parameterized Melody Generation with Autoencoders and Temporally-Consistent Noise, 2019, NIME.
[38] Charles Patrick Martin, et al. A Physical Intelligent Instrument using Recurrent Neural Networks, 2019, NIME.
[39] Joseph A. Paradiso, et al. The gesture recognition toolkit, 2014, J. Mach. Learn. Res.
[40] Paul Dourish. Where the Action Is: The Foundations of Embodied Interaction, 2001.
[41] David Wessel, et al. Real-Time Neural Network Processing of Gestural and Acoustic Signals, 1991, ICMC.