Exploration of the correspondence between visual and acoustic parameter spaces

This paper describes an approach to matching visual and acoustic parameters to produce an animated musical expression. Music may be generated to correspond to animation, as described here; imagery may be created to correspond to music; or both may be developed simultaneously. The approach is intended to provide new tools that facilitate both collaboration between visual artists and musicians and examination of perceptual relationships between visual and acoustic media. As a proof of concept, a complete example is developed using linear fractals as the basis for the animation and arranged rhythmic loops for the music. Because both the visual and acoustic elements of the example are generated from concise specifications, the potential of the approach to create new works through parameter space exploration is highlighted; however, the approach can also be applied to a wide variety of source material. These additional applications are discussed, along with issues encountered in developing the example.
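To give a concrete sense of the kind of correspondence the approach entails, the following sketch maps a small set of linear-fractal parameters onto rhythmic-loop arrangement parameters. It is illustrative only: the parameter names and the particular mapping are assumptions for this sketch, not the implementation described in the paper.

```python
# A minimal sketch of one possible visual-to-acoustic parameter mapping,
# assuming hypothetical fractal parameters (contraction ratio, rotation,
# recursion depth) and hypothetical loop-arrangement parameters
# (tempo, layer count, density). Not the paper's actual mapping.
from dataclasses import dataclass


@dataclass
class FractalParams:
    contraction: float   # contraction ratio of the linear fractal, in (0, 1)
    rotation_deg: float  # rotation applied at each iteration, in degrees
    depth: int           # recursion depth used for the animation


@dataclass
class LoopParams:
    tempo_bpm: float     # tempo of the arranged rhythmic loops
    layers: int          # number of simultaneous loop layers
    density: float       # fraction of beats filled in each loop, 0..1


def map_visual_to_acoustic(f: FractalParams) -> LoopParams:
    """Map fractal parameters to loop-arrangement parameters.

    Illustrative mapping: tighter contraction yields a faster tempo,
    deeper recursion yields more layers, and larger rotation yields
    denser rhythms.
    """
    tempo = 60.0 + 120.0 * (1.0 - f.contraction)       # 60..180 BPM
    layers = max(1, min(f.depth, 8))                    # one layer per level, capped
    density = min(1.0, (f.rotation_deg % 360.0) / 360.0 + 0.25)
    return LoopParams(tempo_bpm=tempo, layers=layers, density=density)


if __name__ == "__main__":
    example = FractalParams(contraction=0.5, rotation_deg=45.0, depth=4)
    print(map_visual_to_acoustic(example))
```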