Synchronization in Multimedia Languages for Distributed Systems

The rising popularity of multimedia content on the web has led to the development of special-purpose languages for multimedia authoring and presentation. Examples of such languages include SMIL [1], VRML [2], and MPEG-4 [3]. These languages support the description of a multimedia presentation containing multiple media sources, including both natural and synthetic media, as well as media stored in files or streamed live over the network. They provide a mechanism for specifying the layout of the media on the screen, together with a set of primitives for synchronizing the various elements of the presentation. For example, in SMIL we can specify that two video clips be displayed in parallel or that one audio clip start when another clip finishes (a sketch appears below). Some of these languages also allow a limited amount of user interaction. A SMIL 2.0 presentation might allow a user to choose a soundtrack in one of several languages by clicking on a particular area of the presentation; this is accomplished by incorporating the events defined in a scripting language such as JavaScript.

While these languages are well suited to describing multimedia presentations on the Web, they are of limited use for creating more general distributed multimedia applications, since general-purpose computation is available only in the form of scripting languages of limited power. To support the construction of larger-scale applications, approaches such as combining special multimedia libraries with a general-purpose language, as in the case of Java and JMF [4], or extending middleware such as CORBA [5], have been proposed. Besides lacking certain characteristics essential for the development of advanced distributed multimedia applications, which we note below, the use of libraries and/or middleware to achieve synchronization and perform other media-related services results in a less well-specified approach than directly extending an existing general-purpose language with multimedia constructs whose semantics are precisely defined. The latter is the approach we follow in the work on multimedia languages described here. The language we want to design should support general-purpose computation; the multimedia constructs whose semantics we describe should therefore be added to an existing general-purpose language such as C, C++, or Java. This is the approach taken by the reactive language Esterel [6].

Reactivity is a very important property for a multimedia language. A reactive system is one that responds to stimuli from the environment [7]. In a multimedia system such stimuli might include user interactions as well as, for example, events generated by the contents of a media stream. The multimedia system must be able to interact with the environment within a short, predefined time period. In this context, the difference between reactive systems and interactive systems is that, while both may interact with the environment, the latter are not subject to such a time constraint. The environment interacts with the multimedia system through the generation of signals. A signal can be either synchronous (e.g., the reading of a sensing device) or asynchronous (e.g., the recognition of a particular face in a video stream).
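As a concrete illustration of the SMIL constructs mentioned above, the following sketch uses the par and seq time containers for parallel and sequential playback, and the SMIL 2.0 event-based begin attribute (inside an excl container) to start one of two soundtracks when the user clicks the corresponding image. The clip names and element identifiers are hypothetical, and the head/layout part of the document is omitted for brevity:

    <smil>
      <body>
        <!-- two video clips displayed in parallel -->
        <par>
          <video src="clipA.mpg"/>
          <video src="clipB.mpg"/>
        </par>
        <!-- the second audio clip starts when the first one finishes -->
        <seq>
          <audio src="intro.mp3"/>
          <audio src="main.mp3"/>
        </seq>
        <!-- SMIL 2.0: choose a soundtrack by clicking an on-screen image -->
        <excl>
          <audio src="track_en.mp3" begin="btnEnglish.activateEvent"/>
          <audio src="track_fr.mp3" begin="btnFrench.activateEvent"/>
        </excl>
        <img id="btnEnglish" src="en.png"/>
        <img id="btnFrench" src="fr.png"/>
      </body>
    </smil>

Here the value btnEnglish.activateEvent refers to the activation (click) event of the element with that id, an event model akin to the one used by scripting languages such as JavaScript.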
Our approach to multimedia languages greatly increases the power and flexibility of synchronization by providing synchronization constructs that can be applied not just between media streams, but also between media streams and (synchronous or asynchronous) signals; indeed, in our approach multimedia streams are just a particular type of signal. The rest of this paper is organized as follows. Section 2 describes the fundamental concepts of signals and streams. Section 3 introduces the various synchronization mechanisms. Section 4 describes the language constructs. Section 5 discusses related research, and Section 6 describes future research.

References

[1] Rob Gordon et al. Essential JMF: Java Media Framework, 1998.

[2] Hari Kalva et al. MPEG-4 systems and applications, MULTIMEDIA '99, 1999.

[3] Davide Sangiorgi et al. Communicating and Mobile Systems: the π-calculus, 2000.

[4] Robin Milner et al. Communication and Concurrency, PHI Series in Computer Science, 1989.

[5] Cham Athwal et al. Synchronised multimedia for engineering and scientific analysis, Multimedia Systems, 2003.

[6] A. Bansal et al. TANDEM – Transmitting Asynchronous Non Deterministic and Deterministic Events in Multimedia Systems over the Internet, 2004.

[7] Gérard Berry et al. The foundations of Esterel, in Proof, Language, and Interaction, 2000.

[8] Robin Milner et al. Communicating and Mobile Systems: the Pi-calculus, 1999.

[9] Leslie Lamport et al. Time, clocks, and the ordering of events in a distributed system, CACM, 1978.

[10] Klara Nahrstedt et al. An XML-based Quality of Service Enabling Language for the Web, J. Vis. Lang. Comput., 2002.

[11] Gérard Berry et al. The Esterel Synchronous Programming Language: Design, Semantics, Implementation, Sci. Comput. Program., 1992.

[12] Thomas A. Henzinger et al. An interleaving model for real-time, Proceedings of the 5th Jerusalem Conference on Information Technology ("Next Decade in Information Technology"), 1990.

[13] Wang Yi et al. CCS + Time = An Interleaving Model for Real Time Systems, ICALP, 1991.

[14] N. D. Georganas et al. Guest Editorial: Synchronization Issues in Multimedia Communications, 1996.

[15] Robin Milner et al. A Calculus of Communicating Systems, Lecture Notes in Computer Science, 1980.

[16] Mitsuru Ishizuka et al. Journal of Visual Languages, 2002.

[17] Ahmed Karmouch et al. Multimedia teleorchestra with independent sources: Part 1 – temporal modeling of collaborative multimedia scenarios, Multimedia Systems, 2005.

[18] Simon J. Thompson et al. Modeling Reactive Multimedia: Events and Behaviors, Multimedia Tools and Applications, 2004.