Implementation of god-like interaction techniques for supporting collaboration between outdoor AR and indoor tabletop users

This paper presents a new interaction metaphor we term "god-like interaction": a metaphor for improved communication of situational and navigational information between outdoor users, equipped with mobile augmented reality systems, and indoor users, equipped with tabletop projector display systems. Physical objects on an indoor table surface are captured by a series of cameras, the data is sent over a wireless network, and the objects are then reconstructed at a real-world location for outdoor augmented reality users. Our god-like interaction metaphor allows users to communicate information through physical props as well as natural gestures. We have constructed a system that implements this metaphor, together with a series of novel applications that facilitate collaboration between indoor and outdoor users. We have also extended a well-known video-based rendering algorithm to make it suitable for outdoor wireless networks of limited bandwidth. Finally, the paper describes the limitations of, and lessons learned during, the design and construction of the hardware that supports this research.
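The capture-and-reconstruct pipeline described above is typically built on silhouette-based shape recovery (a visual hull), in which a voxel survives only if it projects inside the foreground silhouette of every calibrated camera viewing the table. The sketch below is a minimal, hypothetical illustration of that carving step, not the paper's actual algorithm; the function name, the affine projection matrices, and the synthetic silhouette masks are all assumptions introduced for this example.

```python
import numpy as np

def carve_visual_hull(silhouettes, projections, grid_pts):
    """Keep the voxels whose projection lies inside every camera's silhouette.

    silhouettes -- list of 2D boolean masks (True = foreground)
    projections -- list of 3x4 camera projection matrices
    grid_pts    -- (N, 3) array of voxel centre positions
    """
    inside = np.ones(len(grid_pts), dtype=bool)
    homog = np.hstack([grid_pts, np.ones((len(grid_pts), 1))])
    for mask, P in zip(silhouettes, projections):
        # Project homogeneous voxel centres into this camera's image plane.
        h = homog @ P.T
        u = np.round(h[:, 0] / h[:, 2]).astype(int)
        v = np.round(h[:, 1] / h[:, 2]).astype(int)
        ok = (0 <= u) & (u < mask.shape[1]) & (0 <= v) & (v < mask.shape[0])
        # A voxel projecting outside the image, or outside the
        # silhouette, cannot belong to the object and is carved away.
        hit = np.zeros(len(grid_pts), dtype=bool)
        hit[ok] = mask[v[ok], u[ok]]
        inside &= hit
    return inside

# Two toy affine cameras: one looking down the z axis, one down the y axis.
P_front = np.array([[10.0, 0, 0, 5], [0, 10.0, 0, 5], [0, 0, 0, 1]])
P_top   = np.array([[10.0, 0, 0, 5], [0, 0, 10.0, 5], [0, 0, 0, 1]])
```

In the full system, the silhouette masks (rather than the reconstructed geometry) are what would be shipped over the bandwidth-limited wireless link, with reconstruction performed on the receiving side.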