Interactive Music: Human Motion Initiated Music Generation Using Skeletal Tracking By Kinect

This work experiments with human-motion-initiated music generation. We present a stand-alone system that maps human motions into musical notes in real time. We do this by first detecting the human skeleton from depth images acquired by an infrared range sensor and then exploiting the resulting skeletal tracking. This real-time skeletal tracking is performed with the Microsoft Kinect™ sensor for the Xbox 360 videogame console. An agent's bodily motion is defined by the spatial and temporal arrangement of their skeletal framework over the course of the associated movement. After extracting the skeleton of a performing agent by interfacing the Kinect with an intermediate computer application, we compute various features describing the agent's motion. Features such as the velocity, acceleration, and change in position of the agent's body parts are then used to generate musical notes. Finally, as a participating agent performs a set of movements in front of our system, the system generates musical notes that are continually modulated by the features describing the agent's motion.
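The feature-to-note mapping described above can be illustrated with a minimal sketch. The following is not the paper's actual implementation: joint positions here are synthetic stand-ins for Kinect skeletal-tracking output, and the function names (`velocity`, `speed_to_midi_note`) and the linear speed-to-pitch mapping are illustrative assumptions.

```python
import math

def velocity(p_prev, p_curr, dt):
    """Speed of a joint between two frames, given 3D positions and frame interval dt."""
    return math.dist(p_prev, p_curr) / dt

def speed_to_midi_note(speed, lo=48, hi=84, max_speed=3.0):
    """Hypothetical mapping: clamp joint speed and scale it linearly onto a MIDI pitch range."""
    clamped = min(speed, max_speed)
    return lo + round((hi - lo) * clamped / max_speed)

# Synthetic right-hand positions (metres) over three frames at 30 fps;
# a real system would read these from the Kinect skeleton stream.
frames = [(0.10, 1.20, 2.00), (0.14, 1.25, 2.00), (0.30, 1.40, 2.05)]
dt = 1.0 / 30.0
for prev, curr in zip(frames, frames[1:]):
    print(speed_to_midi_note(velocity(prev, curr, dt)))
```

Acceleration and positional-change features could be mapped analogously, e.g. onto note loudness or duration, so that faster or larger movements produce audibly different output.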