Computational Models of Expressive Music Performance: A Comprehensive and Critical Review
Carlos Eduardo Cancino-Chacón | Maarten Grachten | Werner Goebl | Gerhard Widmer
[1] D. Moelants,et al. Exploring the effect of tempo changes on violinists’ body movements , 2019 .
[2] Maarten Grachten,et al. A Computational Study of the Role of Tonal Tension in Expressive Piano Performance , 2018, ArXiv.
[3] Dominic McIver Lopes,et al. Hearing and Seeing Musical Expression , 2009 .
[4] W. Goebl,et al. Communication for coordination: gesture kinematics and conventionality affect synchronization success in piano duos , 2017, Psychological Research.
[5] W. Goebl,et al. Beating time: How ensemble musicians’ cueing gestures communicate beat position and tempo , 2017, Psychology of music.
[6] Katerina Kosta,et al. Mapping between dynamic markings and performed loudness: a machine learning approach , 2016, Machine Learning and Music Generation.
[7] Sergio Giraldo,et al. A machine learning approach to ornamentation modeling and synthesis in jazz guitar , 2016, Machine Learning and Music Generation.
[8] Martin Bonev,et al. The ACCompanion v0.1: An Expressive Accompaniment System , 2017, ArXiv.
[9] Mark D. Plumbley,et al. Clustering Expressive Timing with Regressed Polynomial Coefficients Demonstrated by a Model Selection Test , 2017, ISMIR.
[10] Ching-Hua Chuan,et al. A Functional Taxonomy of Music Generation Systems , 2017, ACM Comput. Surv..
[11] M. Leman,et al. Introduction : What Is Embodied Music Interaction? , 2017 .
[12] Gerhard Widmer,et al. What were you expecting? Using Expectancy Features to Predict Expressive Performances of Classical Piano Music , 2017, ArXiv.
[13] Marcelo M. Wanderley,et al. Individuality in Piano Performance Depends on Skill Learning , 2017, MOCO.
[14] Carlos Eduardo Cancino Chacón,et al. Temporal Dependencies in the Expressive Timing of Classical Piano Performances , 2017 .
[15] Anders Friberg,et al. Predicting the perception of performed dynamics in music audio with ensemble learning. , 2017, The Journal of the Acoustical Society of America.
[16] Wil M. P. van der Aalst,et al. Business Process Variability Modeling , 2017, ACM Comput. Surv..
[17] Gerhard Widmer,et al. An evaluation of linear and non-linear models of expressive dynamics in classical piano and symphonic music , 2017, Machine Learning.
[18] Sergio Canazza,et al. Algorithms can Mimic Human Piano Performance: The Deep Blues of Music , 2017 .
[19] Sarvapali D. Ramchurn,et al. Algorithms for Graph-Constrained Coalition Formation in the Real World , 2017, TIST.
[20] Gerhard Widmer,et al. Toward Computer-Assisted Understanding of Dynamics in Symphonic Music , 2016, IEEE MultiMedia.
[21] Gerhard Widmer,et al. Getting Closer to the Essence of Music , 2016, ACM Trans. Intell. Syst. Technol..
[22] Marc Leman,et al. On the Role of the Hand in the Expression of Music , 2017, The Hand.
[23] Sander Dieleman,et al. Learning to Create Piano Performances , 2017 .
[24] H. Katayose, et al. Constructing PEDB 2nd Edition: A Music Performance Database with Phrase Information, 2017.
[25] Eita Nakamura,et al. Performance Error Detection and Post-Processing for Fast and Accurate Symbolic Music Alignment , 2017, ISMIR.
[26] Cynthia C. S. Liem,et al. A Formalization of Relative Local Tempo Variations in Collections of Performances , 2017, ISMIR.
[27] Sergio I. Giraldo,et al. A Machine Learning Approach to Discover Rules for Expressive Performance Actions in Jazz Guitar Music , 2016, Front. Psychol..
[28] François Pachet,et al. Maximum entropy models for generation of expressive music , 2016, ArXiv.
[29] S. McAdams,et al. Analysis, Performance, and Tension Perception of an Unmeasured Prelude for Harpsichord , 2016 .
[30] Álvaro Sarasúa,et al. Becoming the Maestro - A Game to Enhance Curiosity for Classical Music , 2016, 2016 8th International Conference on Games and Virtual Worlds for Serious Applications (VS-GAMES).
[31] Stefanie A. Wind,et al. Examining Rater Precision in Music Performance Assessment: An Analysis of Rating Scale Structure Using the Multifaceted Rasch Partial Credit Model , 2016 .
[32] Elaine Chew,et al. Tension ribbons: Quantifying and visualising tonal tension. , 2016 .
[33] Geraint A. Wiggins,et al. Linking melodic expectation to expressive performance timing and perceived musical tension. , 2016, Journal of experimental psychology. Human perception and performance.
[34] E. Chew. Playing with the Edge: Tipping Points and the Role of Tonality , 2016 .
[35] Carlos Eduardo Cancino-Chacón,et al. The Basis Mixer : A Computational Romantic Pianist , 2016 .
[36] M. D. Plumbley, et al. A model selection test on effective factors of the choice of expressive timing clusters for a phrase, 2016.
[37] C. L. Krumhansl. Cognitive Foundations of Musical Pitch, 1990.
[38] Rafael Ramírez,et al. Jazz Ensemble Expressive Performance Modeling , 2016, ISMIR.
[39] Eita Nakamura,et al. Autoregressive Hidden Semi-Markov Model of Symbolic Music Performance for Score Following , 2015, ISMIR.
[40] Roger B. Dannenberg,et al. Spectral Learning for Expressive Interactive Ensemble Music Performance , 2015, ISMIR.
[41] Alan Hanjalic,et al. Comparative Analysis of Orchestral Performance Recordings: An Image-Based Approach , 2015, ISMIR.
[42] Carlos Eduardo Cancino Chacón,et al. An Evaluation of Score Descriptors Combined with Non-linear Models of Expressive Dynamics in Music , 2015, Discovery Science.
[43] Markus Schedl,et al. PHENICX: Innovating the classical music experience , 2015, 2015 IEEE International Conference on Multimedia & Expo Workshops (ICMEW).
[44] Kenneth Sörensen,et al. Generating Fingerings for Polyphonic Piano Music with a Tabu Search Algorithm , 2015, MCM.
[45] Elaine Chew,et al. A Change-Point Approach Towards Representing Musical Dynamics , 2015, MCM.
[46] Mark D. Plumbley,et al. The Clustering of Expressive Timing Within a Phrase in Classical Piano Performances by Gaussian Mixture Models , 2015, CMMR.
[47] Roger B. Dannenberg,et al. Duet interaction: learning musicianship for automatic accompaniment , 2015, NIME.
[48] Sergio Canazza,et al. CaRo 2.0: An Interactive System for Expressive Music Rendering , 2015, Adv. Hum. Comput. Interact..
[49] D. Moelants,et al. The influence of tempo on expressive timing: a multimodal approach , 2015 .
[50] Larry A. Wasserman,et al. A Statistical View on the Expressive Timing of Piano Rolled Chords , 2015, ISMIR.
[51] Sergio Canazza,et al. The Role of Individual Difference in Judging Expressiveness of Computer-Assisted Music Performances by Experts , 2014, ACM Trans. Appl. Percept..
[52] Peter E. Keller,et al. A conceptual review on action-perception coupling in the musicians’ brain: what is it good for? , 2014, Front. Hum. Neurosci..
[53] Rafael Ramirez,et al. The Sense of Ensemble: a Machine Learning Approach to Expressive Performance Modelling in String Quartets , 2014 .
[54] Roger B. Dannenberg,et al. Methods and Prospects for Human–Computer Performance of Popular Music , 2014, Computer Music Journal.
[55] Hirokazu Kameoka,et al. Mixture of Gaussian process experts for predicting sung melodic contour with expressive dynamic fluctuations , 2014, 2014 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP).
[56] Eita Nakamura,et al. A Stochastic Temporal Model of Polyphonic MIDI Performance with Ornaments , 2014, ArXiv.
[57] Yasuyuki Saito,et al. Outer-Product Hidden Markov Model and Polyphonic MIDI Score Following , 2014, ArXiv.
[58] Florian Krebs,et al. An Assessment of Learned Score Features for Modeling Expressive Dynamics in Music , 2014, IEEE Transactions on Multimedia.
[59] Elaine Chew,et al. Practical Implications of Dynamic Markings in the Score: Is Piano Always Piano? , 2014, Semantic Audio.
[60] Tadashi Kitamura,et al. Laminae: A stochastic modeling-based autonomous performance rendering system that elucidates performer characteristics , 2014, ICMC.
[61] Maarten Grachten,et al. Predicting Expressive Dynamics in Piano Performances using Neural Networks , 2014, ISMIR.
[62] Emery Schubert, et al. Open-ended descriptions of computer-assisted interpretations of musical performance: An investigation of individual differences, 2014.
[63] Mark D. Plumbley,et al. Evidence that phrase-level tempo variation may be represented using a limited dictionary , 2014 .
[64] Anders Friberg,et al. Using computational models of music performance to model stylistic variations , 2014 .
[65] Sergio Canazza,et al. Music Systemisers and Music Empathisers - Do they rate expressiveness of computer generated performances the same? , 2014, ICMC.
[66] Anders Friberg,et al. Software tools for automatic music performance , 2014 .
[67] Eita Nakamura,et al. Merged-Output HMM for Piano Fingering of Both Hands , 2014, ISMIR.
[68] Chia-Jung Tsay. Sight over sound in the judgment of music performance , 2013, Proceedings of the National Academy of Sciences.
[69] Anders Friberg,et al. Emotional expression in music: contribution, linearity, and additivity of primary musical cues , 2013, Front. Psychol..
[70] Elaine Chew,et al. Conceptual and Experiential Representations of Tempo: Effects on Expressive Performance Comparisons , 2013, MCM.
[71] Satoru Fukayama,et al. Statistical Approach to Automatic Expressive Rendition of Polyphonic Piano Music , 2013, Guide to Computing for Expressive Music Performance.
[72] Alexis Kirke,et al. An Overview of Computer Systems for Expressive Music Performance , 2013, Guide to Computing for Expressive Music Performance.
[73] Atsuo Takanishi,et al. Anthropomorphic Musical Robots Designed to Produce Physically Embodied Expressive Performances of Music , 2013, Guide to Computing for Expressive Music Performance.
[74] Anders Friberg,et al. Systems for Interactive Control of Computer Generated Music Performance , 2013, Guide to Computing for Expressive Music Performance.
[75] Gerhard Widmer,et al. Expressive Performance Rendering with Probabilistic Models , 2013, Guide to Computing for Expressive Music Performance.
[76] Anders Friberg,et al. Evaluation of Computer Systems for Expressive Music Performance , 2013, Guide to Computing for Expressive Music Performance.
[77] Esteban Maestre,et al. Investigating the relationship between expressivity and synchronization in ensemble performance: an exploratory study , 2013 .
[78] Gerhard Widmer,et al. Linear Basis Models for Prediction and Analysis of Musical Expression , 2012 .
[79] Giovanni De Poli,et al. On Evaluating Systems for Generating Expressive Music Performance: the Rencon Experience , 2012 .
[80] Yann LeCun,et al. Moving Beyond Feature Design: Deep Architectures and Automatic Feature Learning in Music Informatics , 2012, ISMIR.
[81] Henri Ralambondrainy,et al. Score Analyzer: Automatically Determining Scores Difficulty Level for Instrumental e-Learning , 2012, ISMIR.
[82] Jean-Louis Giavitto,et al. Correct Automatic Accompaniment Despite Machine listening or Human errors in Antescofo , 2012, ICMC.
[83] Friedrich Platz,et al. When the Eye Listens: A Meta-analysis of How Audio-visual Presentation Enhances the Appreciation of Music Performance , 2012 .
[84] Elad Liebman,et al. A Phylogenetic Approach to Music Performance Analysis , 2012 .
[85] M. Farbood. A Parametric, Temporal Model of Musical Tension , 2012 .
[86] D. Moelants,et al. The Influence of an Audience on Performers: A Comparison Between Rehearsal and Concert Using Audio, Video and Movement Data , 2012 .
[87] Tetsuya Ogata,et al. A Musical Robot that Synchronizes with a Coplayer Using Non-Verbal Cues , 2012, Adv. Robotics.
[88] C. Raphael,et al. Modeling Piano Interpretation Using Switching Kalman Filter , 2012, ISMIR.
[89] G. Widmer,et al. Expressive Performance Rendering with Probabilistic Model , 2012 .
[90] Florian Krebs,et al. Combining Score And Filter Based Models To Predict Tempo Fluctuations In Expressive Music Performances , 2012 .
[91] Roberto Bresin,et al. Emotion rendering in music: Range and characteristic values of seven musical variables , 2011, Cortex.
[92] Guy Hoffman,et al. Interactive improvisation with a robotic marimba player , 2011, Auton. Robots.
[93] Alan Hanjalic,et al. Expressivity in Musical Timing in Relation to Musical Structure and Interpretation: A Cross-Performance, Audio-Based Approach , 2011, Semantic Audio.
[94] Caroline Palmer,et al. Rate Effects on Timing, Key Velocity, and Finger Kinematics in Piano Performance , 2011, PloS one.
[95] Satoru Fukayama,et al. Polyhymnia: An Automatic Piano Performance System with Statistical Modeling of Polyphonic Expression and Musical Symbol Interpretation , 2011, NIME.
[96] Marilyn Gail Boltz,et al. Illusory Tempo Changes Due to Musical Characteristics , 2011 .
[97] J. Sloboda,et al. Handbook of Music and Emotion: Theory, Research, Applications , 2011 .
[98] Marco Fabiani. Interactive computer-aided expressive music performance : Analysis, control, modification and synthesis , 2011 .
[99] Expressive Performance with Bayesian Networks and Linear Basis Models , 2011 .
[100] Roger B. Dannenberg,et al. Characterizing Tempo Change In Musical Performances , 2011, ICMC.
[101] A. Friberg,et al. An accent-based approach to performance rendering: Music theory meets music psychology , 2011 .
[102] Tadashi Kitamura,et al. Stochastic Modeling of a Musical Performance with Expressive Representations from the Musical Score , 2011, ISMIR.
[103] Alan Hanjalic,et al. Expressive Timing from Cross-Performance and Audio-based Alignment Patterns: An Extended Case Study , 2011, ISMIR.
[104] Gerhard Widmer,et al. The Magaloff Project: An Interim Report , 2010 .
[105] Miguel Molina-Solana,et al. Identifying violin performers by their expressive trends , 2010, Intell. Data Anal..
[106] Marc R. Thompson,et al. Embodied Meter: Hierarchical Eigenmodes in Music-Induced Movement , 2010 .
[107] Christopher Raphael,et al. Music Plus One and Machine Learning , 2010, ICML.
[108] Haruhiro Katayose,et al. "VirtualPhilharmony": A Conducting System with Heuristics of Conducting an Orchestra , 2010, NIME.
[109] Geraint A. Wiggins,et al. On the non-existence of music: Why music theory is a figment of the imagination , 2010 .
[110] S. Sagayama, et al. Performance Rendering for Polyphonic Piano Music with a Combination of Probabilistic Models for Melody and Harmony, 2010.
[111] A. Gabrielsson,et al. The role of structure in the musical expression of emotions , 2010 .
[112] Gerhard Widmer,et al. Evidence for Pianist-specific Rubato Style in Chopin Nocturnes , 2010, ISMIR.
[113] Christopher Raphael. Symbolic and Structural Representation of Melodic Expression , 2009, ISMIR.
[114] Gerhard Widmer,et al. YQX Plays Chopin , 2009, AI Mag..
[115] Gerhard Widmer,et al. Phase-plane Representation and Visualization of Gestural Structure in Expressive Timing , 2009 .
[116] C. Palmer,et al. Synchronization of Timing and Motion 435 , 2022 .
[117] J. Wapnick,et al. Effects of Non-Musical Attributes and Excerpt Duration on Ratings of High-Level Piano Performances , 2009 .
[118] G. Widmer, et al. Chapter 7: On the Use of Computational Methods for Expressive Music Performance, 2009.
[119] Lijuan Peng,et al. A Gestural Interface for Orchestral Conducting Education , 2009, CSEDU.
[120] Gerhard Widmer,et al. Who Is Who in the End? Recognizing Pianists by Their Final Ritardandi , 2009, ISMIR.
[121] Eric Cheng,et al. Quantitative Analysis of Phrasing Strategies in Expressive Performance: Computational Methods and Analysis of Performances of Unaccompanied Bach for Solo Violin , 2008 .
[122] Arshia Cont,et al. Antescofo: Anticipatory Synchronization and control of Interactive parameters in Computer Music , 2008, ICMC.
[123] Shin-ichi Maeda,et al. Gaussian Process Regression for Rendering Music Performance , 2008 .
[124] Haruhiro Katayose,et al. A New Music Database Describing Deviation Information of Performance Expressions , 2008, ISMIR.
[125] Giovanni De Poli,et al. Sense in expressive music performance: Data acquisition, computational studies, and models , 2008 .
[126] Craig Stuart Sapp. Hybrid Numeric/Rank Similarity Metrics for Musical Performance Analysis , 2008, ISMIR.
[127] Miguel Molina-Solana, et al. Using Expressive Trends for Identifying Violin Performers, 2008, ISMIR.
[128] Ching-Hua Chuan,et al. A Dynamic Programming Approach to the Extraction of Phrase Boundaries from Tempo Variations in Expressive Performances , 2007, ISMIR.
[129] Aaron Williamon,et al. Time-Dependent Characteristics of Performance Evaluation , 2007 .
[130] Masataka Goto. Active Music Listening Interfaces Based on Signal Processing , 2007, 2007 IEEE International Conference on Acoustics, Speech and Signal Processing - ICASSP '07.
[131] Esteban Maestre,et al. Performance-Based Interpreter Identification in Saxophone Audio Recordings , 2007, IEEE Transactions on Circuits and Systems for Video Technology.
[132] Craig Stuart Sapp. Comparative Analysis of Multiple Musical Performances , 2007, ISMIR.
[133] Christopher Raphael,et al. A Simple Algorithm for Automatic Generation of Polyphonic Piano Fingerings , 2007, ISMIR.
[134] David P. Helmbold,et al. Modeling, analyzing, and synthesizing expressive piano performance with graphical models , 2006, Machine Learning.
[135] Jie Liu,et al. ESP: roadmaps as constructed interpretations and guides to expressive performance , 2006, AMCMM '06.
[136] Gerhard Widmer,et al. Relational IBL in classical music , 2006, Machine Learning.
[137] Anders Friberg,et al. pDM: An Expressive Sequencer with Real-Time Control of the KTH Music-Performance Rules , 2006, Computer Music Journal.
[138] S. Dixon, et al. Perceptual Smoothness of Tempo in Expressively Performed Music, 2006.
[139] J. Sundberg,et al. Overview of the KTH rule system for musical performance. , 2006 .
[140] Marcus T. Pearce,et al. The construction and evaluation of statistical models of melodic structure in music perception and composition , 2005 .
[141] Efstathios Stamatatos,et al. Automatic identification of music performers with learning ensembles , 2005, Artif. Intell..
[142] Jie Liu,et al. ESP: A Driving Interface for Expression Synthesis , 2005, NIME.
[143] Anders Friberg, et al. Home Conducting - Control the Overall Musical Expression with Gestures, 2005, ICMC.
[144] Gerhard Widmer, et al. The "Air Worm": An Interface for Real-Time Manipulation of Expressive Music Performance, 2005, ICMC.
[145] Henkjan Honing. Timing is Tempo-Specific , 2005, ICMC.
[146] John Shawe-Taylor,et al. Using string kernels to identify famous performers from their playing style , 2004, Intell. Data Anal..
[147] Henkjan Honing,et al. Computational modeling of music cognition: a case study on model selection. , 2006 .
[148] Giovanni De Poli. Methodologies for Expressiveness Modelling of and for Music Performance , 2004 .
[149] Gerhard Widmer,et al. Computational Models of Expressive Music Performance: The State of the Art , 2004 .
[150] Haruhiro Katayose,et al. Rencon 2004: Turing Test for Musical Expression , 2004, NIME.
[151] Eero P. Simoncelli,et al. Image quality assessment: from error visibility to structural similarity , 2004, IEEE Transactions on Image Processing.
[152] Manfred Clynes, et al. Generative Principles of Musical Thought: Integration of Microstructure with Structure, 2004.
[153] Werner Goebl,et al. Visualizing Expressive Performance in Tempo—Loudness Space , 2003, Computer Music Journal.
[154] Gerhard Widmer,et al. Playing Mozart by Analogy: Learning Multi-level Timing and Dynamics Strategies , 2003 .
[155] John Rink. In Respect of Performance: The View from Musicology , 2003 .
[156] Patrik N. Juslin,et al. Five Facets of Musical Expression: A Psychologist's Perspective on Music Performance , 2003 .
[157] A. Gabrielsson. Music Performance Research at the Millennium , 2003 .
[158] Gerhard Widmer,et al. Discovering simple rules in complex data: A meta-learning algorithm and some surprising musical discoveries , 2003, Artif. Intell..
[159] Haruhiro Katayose,et al. After the first year of Rencon , 2003, ICMC.
[160] Peter Desain,et al. Effects of Tempo on the Timing of Simple Musical Rhythms , 2002 .
[161] H. Tekman. Perceptual Integration of Timing and Intensity Variations in the Perception of Musical Accents , 2002, The Journal of general psychology.
[162] John Rink. Musical Performance: List of contributors , 2002 .
[163] John Rink,et al. Musical Performance: A Guide to Understanding , 2002 .
[164] Haruhiro Katayose,et al. RENCON: toward a new evaluation system for performance rendering systems , 2002, ICMC.
[165] Christopher Raphael,et al. Synthesizing Musical Accompaniments With Bayesian belief networks , 2001 .
[166] S. Davies. Philosophical perspectives on music's expressiveness , 2001 .
[167] P. Juslin. Communicating emotion in music performance: A review and a theoretical framework , 2001 .
[168] Anders Friberg,et al. Emotional Coloring of Computer-Controlled Music Performances , 2000, Computer Music Journal.
[169] Johan Sundberg,et al. Generating Musical Performances with Director Musices , 2000, Computer Music Journal.
[170] Gerhard Widmer,et al. Large-scale Induction of Expressive Performance Rules: First Quantitative Results , 2000, ICMC.
[171] J. Sundberg,et al. Does music performance allude to locomotion? A model of final ritardandi derived from measurements of stopping , 1999 .
[172] A. Gabrielsson. The Performance of Music , 1999 .
[173] Roberto Bresin,et al. Artificial neural networks based models for automatic performance of musical scores , 1998 .
[174] B. Repp. Obligatory “expectations” of expressive timing induced by perception of musical structure , 1998, Psychological research.
[175] C. Palmer. Music performance. , 1997, Annual review of psychology.
[176] B. Repp. The Art of Inaccuracy: Why Pianists' Errors Are Difficult to Hear , 1996 .
[177] Gerhard Widmer,et al. Learning expressive performance: The structure‐level approach , 1996 .
[178] C. Palmer. Anatomy of a Performance: Sources of Musical Expression , 1996 .
[179] Emilios Cambouropoulos,et al. Musical Rhythm: A Formal Model for Determining Local Boundaries, Accents and Metre in a Melodic Surface , 1996, Joint International Conference on Cognitive and Systematic Musicology.
[180] Gerhard Widmer,et al. Modeling the rational basis of musical expression , 1995 .
[181] John Rink. The Practice of Performance: Structure and Meaning in Performance, 1995.
[182] Peter Desain,et al. Does expressive timing in music performance scale proportionally with tempo? , 1994 .
[183] Eric Clarke,et al. Imitating and Evaluating Real and Transformed Musical Performances , 1993 .
[184] Robert Rowe,et al. Interactive Music Systems: Machine Listening and Composing , 1992 .
[185] N. Todd. The dynamics of dynamics: A model of musical expression , 1992 .
[186] Eugene Narmour,et al. The Analysis and Cognition of Basic Melodic Structures: The Implication-Realization Model , 1990 .
[187] Roger A. Kendall,et al. The Communication of Musical Expression , 1990 .
[188] David Huron,et al. The Avoidance of Inner-Voice Entries: Perceptual Evidence and Musical Practice , 1989 .
[189] John A. Sloboda,et al. The performance of music , 1986 .
[190] H. C. Longuet-Higgins,et al. The Rhythmic Interpretation of Monophonic Music , 1984 .
[191] Roger B. Dannenberg,et al. An On-Line Algorithm for Real-Time Accompaniment , 1984, ICMC.
[192] F. Lerdahl, R. Jackendoff. A Generative Theory of Tonal Music, 1983.
[193] Johan Sundberg,et al. Musical Performance: A Synthesis-by-Rule Approach , 1983 .
[194] H C Longuet-Higgins,et al. The Perception of Musical Rhythms , 1982, Perception.
[195] R. Rasch,et al. The perceptual onset of musical tones , 1981, Perception & psychophysics.
[196] J. Russell. A circumplex model of affect. , 1980 .
[197] Alf Gabrielsson,et al. Performance of rhythm patterns , 1974 .
[198] Manfred Clynes,et al. Toward a Theory of Man: Precision of Essentic form in Living Communication , 1969 .
[199] Hilla Peretz, et al. Schrödinger's Cat: The rules of engagement, 2003.
[200] Robert O. Gjerdingen,et al. The Psychology of Music , 1972 .
[201] Alfred Binet,et al. Recherches graphiques sur la musique , 1895 .