A Protocol for Audiovisual Cutting
We explore the extension of an algorithmic composition system for live audio cutting to the realm of video, through a protocol for message passing between separate audio and video applications. The protocol enables fruitful musician-to-video-artist collaboration, with multiple new applications in live performance: the crowd at a gig can be cut up as video in synchrony with the audio cutting, or a musician can be filmed live with both the footage and the output audio stream segmented in lockstep. More abstract mappings are perfectly possible, but we emphasise the ability to reveal the nature of underlying audio cutting algorithms that would otherwise remain concealed from an audience. There are parallel MIDI and OSC realtime implementations, as well as text file generation for non-realtime rendering. A propitious side effect of the protocol is that capabilities in audio cutting can be cheaply brought to bear on video processing.
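To make the message-passing idea concrete, the following is a minimal sketch of how an audio cutting application might notify a separate video application of each cut event over OSC. The address "/bbcut/cut", the host/port values, and the parameter layout (cut index, offset and duration in beats) are assumptions for illustration only, not the protocol specified in the paper.

```python
from pythonosc.udp_client import SimpleUDPClient

VIDEO_APP_HOST = "127.0.0.1"   # machine running the video cutter (assumed)
VIDEO_APP_PORT = 57121         # port the video app listens on (assumed)

client = SimpleUDPClient(VIDEO_APP_HOST, VIDEO_APP_PORT)

def send_cut(cut_index: int, offset_beats: float, duration_beats: float) -> None:
    """Mirror one audio cut as an OSC message so the video segmentation stays locked to it."""
    client.send_message("/bbcut/cut", [cut_index, offset_beats, duration_beats])

# Example: the third cut of the current phrase, starting half a beat in, lasting a quarter beat.
send_cut(3, 0.5, 0.25)
```

In a non-realtime setting, the same cut data could instead be appended to a text file, one event per line, and read back when rendering the video offline.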