Dance2Music: Automatic Dance-driven Music Generation

Dance and music typically go hand in hand. The complexities of dance, of music, and of their synchronisation make them fascinating to study from a computational creativity perspective. While several works have looked at generating dance for a given piece of music, automatically generating music for a given dance remains underexplored, despite its potential applications in creative expression and entertainment. We present early explorations in this direction: a search-based offline approach that generates music after processing the entire dance video, and an online approach that uses a deep neural network to generate music on the fly as the video proceeds. We compare both approaches against a strong heuristic baseline via human studies and report our findings. We have integrated our online approach into a live demo; a video of the demo can be found at https://sites.google.com/view/dance2music/live-demo.
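The abstract mentions a strong heuristic baseline but does not describe it. As a purely illustrative sketch (the function name, movement threshold, and pitch-mapping rule below are our assumptions, not the paper's method), one way such a baseline could turn per-frame pose movement into note events is:

```python
import numpy as np

def heuristic_notes(poses, scale=(60, 62, 64, 65, 67, 69, 71), thresh=0.05):
    """Illustrative heuristic: map pose movement to note events.

    poses: (T, J, 2) array of 2D joint positions over T frames.
    Returns a list of (frame_index, midi_pitch). A note fires when the
    mean joint displacement between consecutive frames exceeds `thresh`;
    larger movements select higher pitches from `scale` (a C-major octave).
    """
    notes = []
    # Mean per-joint displacement between consecutive frames, shape (T-1,).
    motion = np.linalg.norm(np.diff(poses, axis=0), axis=-1).mean(axis=-1)
    top = max(float(motion.max()), 1e-9)  # avoid division by zero
    for t, m in enumerate(motion, start=1):
        if m > thresh:
            idx = min(int(m / top * len(scale)), len(scale) - 1)
            notes.append((t, scale[idx]))
    return notes

# Synthetic "dance": 10 frames, 5 joints, with a burst of motion mid-way.
rng = np.random.default_rng(0)
poses = np.zeros((10, 5, 2))
poses[4:7] += rng.uniform(0.2, 0.4, size=(3, 5, 2))  # large movement burst
events = heuristic_notes(poses)
print(events)  # notes fire only around the motion burst
```

In a real pipeline, `poses` would come from a pose estimator such as OpenPose rather than synthetic data, and the emitted MIDI pitches would be rendered to audio.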
