[Invited Paper] Ambient Music Co-player: Generating Affective Video in Response to Impromptu Music Performance

When a musician performs, the accompanying visual information can significantly affect the performance. In this paper, we present the ambient music co-player (AMP), a system that generates background videos in response to the impromptu performance of a single instrumentalist. The AMP system analyzes the performance to estimate the player's emotional impression and generates a video intended to influence the player based on that estimate. Inspired by the generated video, the player tends to change their performance, which in turn triggers the system to modify the video. The AMP system thus aims to establish an affective loop in which the system continually stimulates the player's performance. The ultimate goal of this study is to make the system act as a "co-player" and to enhance the quality of the playing experience through the interaction between the two. A user evaluation showed that the AMP system could inspire an amateur guitarist through affective video generation and improve his performance compared with playing alone.
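
To make the affective loop concrete, the following is a minimal sketch in Python of how such a feedback cycle could be structured. It is not the paper's implementation: the `capture_frame` and `update_video` callbacks are hypothetical, and the feature-to-affect mapping (RMS energy as an arousal proxy, spectral centroid as a valence proxy) is illustrative only, standing in for whatever performance evaluation the AMP system actually performs.

```python
import numpy as np

def estimate_affect(audio_frame: np.ndarray, sample_rate: int) -> tuple:
    """Map low-level audio features of one buffer to (valence, arousal) in [-1, 1].

    Illustrative mapping only: RMS energy drives arousal; spectral
    centroid serves as a rough "brightness" proxy for valence.
    """
    rms = np.sqrt(np.mean(audio_frame ** 2))
    spectrum = np.abs(np.fft.rfft(audio_frame))
    freqs = np.fft.rfftfreq(len(audio_frame), d=1.0 / sample_rate)
    centroid = float(np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-9))
    arousal = float(np.clip(2.0 * rms - 1.0, -1.0, 1.0))
    valence = float(np.clip(centroid / (sample_rate / 4) - 1.0, -1.0, 1.0))
    return valence, arousal

def render_video_params(valence: float, arousal: float) -> dict:
    """Translate the affect estimate into video-generation parameters.

    The parameter names are placeholders, not the AMP system's API.
    """
    return {
        "color_warmth": 0.5 * (valence + 1.0),  # 0 = cold hues, 1 = warm hues
        "motion_speed": 0.5 * (arousal + 1.0),  # 0 = calm, 1 = energetic
    }

def affective_loop(capture_frame, update_video, sample_rate=44100):
    """Run the feedback cycle: each captured audio buffer updates the
    background video, which in turn may influence the player's next phrases.

    capture_frame: callable returning one audio buffer (ndarray) or None at end.
    update_video:  callable accepting a dict of video parameters.
    """
    while True:
        frame = capture_frame()
        if frame is None:
            break  # performance ended
        valence, arousal = estimate_affect(frame, sample_rate)
        update_video(render_video_params(valence, arousal))
```

The key design point the sketch captures is that the system never waits for a complete piece: affect is re-estimated buffer by buffer, so the video responds quickly enough for the player to perceive it as a reaction to their playing and adjust in turn.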
