An Audio-Driven System For Real-Time Music Visualisation

Computer-generated visualisations can accompany recorded or live music to create novel audiovisual experiences for audiences. We present a system that streamlines the creation of audio-driven visualisations based on audio feature extraction and mapping interfaces. Its architecture comprises three modular software components: a backend (audio plugin), a frontend (3D game-like environment), and middleware (visual mapping interface). We conducted a user evaluation in two stages. Results from the first stage (34 participants) indicate that music visualisations generated with the system complemented the music significantly better than a baseline visualisation. Nine participants took part in the second stage, which involved interactive tasks. Overall, the system yielded an above-average Creativity Support Index (68.1) and a System Usability Scale score of 58.6, suggesting that ease of use can be improved. Thematic analysis revealed that participants enjoyed the system's synchronicity and expressive capabilities, but encountered technical problems and had difficulty understanding the audio feature terminology.
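
The excerpt above does not include the authors' code, so the following is only a minimal illustrative sketch of the feature-to-visual mapping idea the abstract describes, assuming a numpy-based pipeline and hypothetical names (rms_per_frame, map_feature, sphere_scale): frame-wise RMS loudness stands in for the backend's feature extraction, and a linear rescaling stands in for the middleware's mapping to a visual parameter consumed by the frontend. In the actual system such mappings are configured interactively through the mapping interface rather than hard-coded.

```python
import numpy as np

def rms_per_frame(samples: np.ndarray, frame_size: int = 1024, hop: int = 512) -> np.ndarray:
    """Frame-wise RMS loudness of a mono audio buffer (backend-style feature extraction)."""
    frames = [samples[i:i + frame_size]
              for i in range(0, len(samples) - frame_size + 1, hop)]
    return np.array([np.sqrt(np.mean(f ** 2)) for f in frames])

def map_feature(value: float, in_range=(0.0, 0.3), out_range=(0.5, 2.0)) -> float:
    """Middleware-style mapping: clamp and linearly rescale a feature value
    to the range expected by a visual parameter."""
    lo, hi = in_range
    norm = float(np.clip((value - lo) / (hi - lo), 0.0, 1.0))
    return out_range[0] + norm * (out_range[1] - out_range[0])

# Example: drive a hypothetical 'sphere scale' in the 3D frontend from loudness.
audio = np.random.default_rng(0).uniform(-0.2, 0.2, 44100)  # stand-in for one second of audio
for rms in rms_per_frame(audio):
    sphere_scale = map_feature(rms)  # value the frontend would consume each render frame
```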
