If motion sounds: Movement sonification based on inertial sensor data

In recent years, movement sonification has proven to be an effective aid to motor perception and motor control, displaying physical motion in a rich and direct way. But how should movement sonification be configured to support motor learning? Selecting appropriate movement parameters and transforming them into characteristic motion features is essential for an auditory display to become effective. In this paper, we introduce a real-time sonification framework for all common MIDI environments based on acceleration and orientation data from inertial sensors. We discuss the fundamental processing steps required to transform motion information into meaningful sound. The proposed framework of inertial motion capture, kinematic parameter selection, and kinematic-acoustic mapping provides a basis for mobile real-time movement sonification, a prospectively powerful training tool for rehabilitation and sports with a broad range of possible applications.
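To illustrate the kind of kinematic-acoustic mapping the framework describes, the sketch below maps an acceleration magnitude from an inertial sensor linearly onto MIDI pitch and velocity. This is a minimal illustration, not the paper's actual mapping: the function name, the calibration range, and the linear mapping are assumptions chosen for clarity.

```python
def accel_to_midi(accel_mag, a_min=0.0, a_max=20.0,
                  pitch_lo=48, pitch_hi=84):
    """Map an acceleration magnitude (m/s^2) onto MIDI pitch and velocity.

    Hypothetical linear kinematic-acoustic mapping: stronger motion
    yields a higher and louder note. The sensor range [a_min, a_max]
    would be calibrated per movement task.
    """
    # Clamp into the calibrated range, then normalize to [0, 1].
    a = max(a_min, min(a_max, accel_mag))
    t = (a - a_min) / (a_max - a_min)
    # MIDI note numbers 48-84 span C3-C6; velocity is capped at 127.
    pitch = round(pitch_lo + t * (pitch_hi - pitch_lo))
    velocity = round(30 + t * 97)
    return pitch, velocity
```

In a real-time setting, this function would be called per sensor frame and the resulting (pitch, velocity) pair sent as a MIDI note-on event to any standard MIDI synthesizer.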
