This paper describes an approach to matching visual and acoustic parameters to produce an animated musical expression. Music may be generated to correspond to animation, as described here; imagery may be created to correspond to music; or both may be developed simultaneously. The approach is intended to provide new tools that facilitate both collaboration between visual artists and musicians and examination of perceptual relationships between visual and acoustic media. As a proof of concept, a complete example is developed using linear fractals as the basis for the animation and arranged rhythmic loops for the music. Because both the visual and acoustic elements of the example are generated from concise specifications, the example accentuates the potential of this approach to create new works through parameter space exploration; however, the approach can also be applied to a wide variety of other source material. These additional applications are discussed, along with issues encountered in developing the example.
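As a minimal sketch of what such a parameter correspondence could look like (not the paper's actual mapping), the snippet below drives one assumed visual parameter (an IFS contraction ratio for a linear fractal) and one assumed acoustic parameter (the tempo of a rhythmic loop) from a single normalized control value. All names and ranges here are illustrative assumptions.

```python
# Hypothetical sketch: a shared, normalized control value mapped onto one
# visual parameter and one acoustic parameter. Parameter names and ranges
# are assumptions for illustration, not taken from the paper.

from dataclasses import dataclass


@dataclass
class ParameterMap:
    """Linear map from a control value in [0, 1] to a target range."""
    low: float
    high: float

    def apply(self, t: float) -> float:
        t = min(max(t, 0.0), 1.0)  # clamp the control value to [0, 1]
        return self.low + t * (self.high - self.low)


# Assumed ranges: contraction ratio of one IFS transform, and loop tempo in BPM.
contraction = ParameterMap(low=0.30, high=0.70)
tempo_bpm = ParameterMap(low=80.0, high=160.0)

# The same control value drives both media, so sweeping it over the course of
# an animation yields a corresponding change in the music.
for frame, t in enumerate([0.0, 0.25, 0.5, 0.75, 1.0]):
    print(f"frame {frame}: contraction={contraction.apply(t):.2f}, "
          f"tempo={tempo_bpm.apply(t):.1f} BPM")
```

In this reading, parameter space exploration amounts to varying the shared control values and observing the coupled visual and musical results; more elaborate mappings (nonlinear, many-to-many) would follow the same pattern.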