EQUIP: a Software Platform for Distributed Interactive Systems

EQUIP is a new software platform designed and engineered to support the development and deployment of distributed interactive systems, such as mixed reality user interfaces that combine distributed input and output devices to create a coordinated experience. EQUIP emphasises: cross-language development (currently C++ and Java), modularisation, extensibility, interactive performance, and heterogeneity of devices (from handheld devices to large servers and visualisation machines) and networks (including both wired and wireless technologies). A key element of EQUIP is its shared data service, which combines ideas from tuplespaces, general event systems and collaborative virtual environments. This data service provides a uniquely balanced treatment of state and event-based communication. It also supports distributed computation (through remote class loading) as well as passive data distribution. EQUIP has already been used in several projects within the EQUATOR Interdisciplinary Research Collaboration (IRC) in the UK, and is freely available in source form (currently known to work on Windows, IRIX and MacOS-X platforms).

INTRODUCTION

The development of novel interactive devices and the deployment of mobile communication infrastructures have fuelled a growing focus on ubiquitous interactive systems that support people within real world environments. These systems place digital information in physical spaces [28], focusing on the delivery of information to users through a heterogeneous collection of devices ranging from handheld and wearable computers to large embedded displays. The majority of these systems have exploited a sense of location as a contextual cue to drive the interaction. An equally significant trend has been the growth in the number and diversity of collaborative virtual environments to manage cooperative interaction [2, 12, 26].
Just as ubiquitous computing environments exploit real world location, these systems exploit a sense of location within a virtual world as a contextual cue for interaction. However, despite significant similarities, these two research approaches have often tended to be seen in opposition to each other, with ubiquitous computing embedding computers within the world of users, and virtual environments embedding users within a computer generated world [16]. As part of our ongoing research we are exploring the advantages to be gained through the convergence of these approaches, allowing a collaborative virtual environment to be overlaid on top of a shared physical space. A number of key advantages motivate our desire to combine the physical and virtual to support interactive systems:

- The ability to exploit the coextensive virtual world as a 'behind the scenes' resource for coordinating and managing devices and interaction in the physical space.

- The opportunity to develop applications that span the physical and digital realms, for example those that require collaboration between field operatives and control-room personnel.

- The chance to support new kinds of interactive experience, combining elements from virtual worlds (e.g. rich media content, high interactivity) with varied modes of access over extended geographical areas and periods of time (e.g. across a city, over a period of days or weeks).

Our ultimate goal is to develop a rich interactive experience that combines physical and digital space, with digital interaction becoming increasingly interwoven with everyday interaction in the physical world. This paper presents the EQUIP platform [9], developed to support the merging of physical and virtual environments as part of the EQUATOR Interdisciplinary Research Collaboration (IRC) in the UK [8]. EQUIP is freely available (including source) for other practitioners to make use of [9]. The rest of this paper gives an overview of EQUIP and its key elements before
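To make the hybrid state/event model concrete, the following is a minimal sketch of a shared data service in which every published item is simultaneously retained as state (so late joiners can query it) and delivered as an event (so current subscribers are notified). This is an illustration of the general idea the paper attributes to EQUIP's data service; all class and method names here (DataSpaceSketch, publish, subscribe, query) are hypothetical and are not EQUIP's actual API.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Consumer;

// Illustrative sketch only: a data service combining tuplespace-style
// state with event-style notification. Not the real EQUIP interface.
public class DataSpaceSketch {
    private final Map<String, List<Object>> items = new HashMap<>();
    private final Map<String, List<Consumer<Object>>> listeners = new HashMap<>();

    // Publish an item: it is retained as state AND pushed as an event.
    public synchronized void publish(String type, Object item) {
        items.computeIfAbsent(type, k -> new ArrayList<>()).add(item);
        for (Consumer<Object> l : listeners.getOrDefault(type, List.of())) {
            l.accept(item);
        }
    }

    // Event-based access: subscribers see items published from now on.
    public synchronized void subscribe(String type, Consumer<Object> listener) {
        listeners.computeIfAbsent(type, k -> new ArrayList<>()).add(listener);
    }

    // State-based access: a late-joining client can query everything
    // retained so far, without having been subscribed at publish time.
    public synchronized List<Object> query(String type) {
        return new ArrayList<>(items.getOrDefault(type, List.of()));
    }

    public static void main(String[] args) {
        DataSpaceSketch space = new DataSpaceSketch();
        List<Object> seen = new ArrayList<>();
        space.subscribe("position", seen::add);  // live event consumer
        space.publish("position", "device-1 @ (3,4)");
        System.out.println(seen.size());                    // 1: delivered as event
        System.out.println(space.query("position").size()); // 1: also queryable as state
    }
}
```

The point of the combined treatment is that a single publish operation serves both consumers who need notification (e.g. a display reacting to a tracked device) and consumers who need current state (e.g. a visualisation machine joining mid-session), which in a pure event system would require the application to rebuild state itself.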

[1] Keith Cheverst et al. The Role of Connectivity in Supporting Context-Sensitive Applications, 1999, HUC.

[2] David Gelernter et al. Generative communication in Linda, 1985, TOPL.

[3] Michael Zyda et al. NPSNET: Flight Simulation Dynamic Modeling Using Quaternions, 1992, Presence: Teleoperators & Virtual Environments.

[4] Gordon S. Blair et al. Limbo: a tuple space based platform for adaptive mobile applications, 1997.

[5] Steve Benford et al. The augurscope: a mixed reality interface for outdoors, 2002, CHI.

[6] Bill N. Schilit et al. An overview of the PARCTAB ubiquitous computing experiment, 1995, IEEE Wirel. Commun..

[7] Ian Taylor et al. Temporal links: recording and replaying virtual environments, 2000, ACM Multimedia.

[8] Steve Pettifer et al. GNU/MAVERIK: a micro-kernel for large-scale virtual environments, 1999, VRST '99.

[9] Gregory D. Abowd et al. Providing architectural support for building context-aware applications, 2000.

[10] Ian Taylor et al. Unearthing Virtual History: Using Diverse Interfaces to Reveal Hidden Virtual Worlds, 2001, UbiComp.

[11] Ian Taylor et al. Shared visiting in EQUATOR city, 2002, CVE '02.

[12] Chris Greenhalgh et al. Inside MASSIVE-3: flexible support for data consistency and world structuring, 2000, CVE '00.

[13] P. Milgram et al. A Taxonomy of Mixed Reality Visual Displays, 1994.

[14] Norbert A. Streitz et al. i-LAND: an interactive landscape for creativity and innovation, 1999, CHI '99.

[15] Richard C. Waters et al. Locales: supporting large multiuser virtual environments, 1996, IEEE Computer Graphics and Applications.

[16] Steve Benford et al. Mixed-Reality Interfaces to Immersive Projection Systems, 2002.

[17] Yvonne Rogers et al. Things aren't what they seem to be: innovation through technology inspiration, 2002, DIS '02.

[18] Michael Zyda et al. Bamboo: a portable system for dynamically extensible, real-time, networked, virtual environments, 1998, Proceedings, IEEE 1998 Virtual Reality Annual International Symposium.