3D User Interfaces for Collaborative Work

Desktop environments have proven to be a powerful user interface and have served as the de facto standard human-computer interaction paradigm for over 20 years. However, there is a rising demand for 3D applications dealing with complex datasets, which exceed the possibilities provided by traditional devices and two-dimensional displays. These domains require more immersive and intuitive interfaces. To gain users' acceptance, however, technology-driven solutions that require inconvenient instrumentation, e.g., stereo glasses or tracked gloves, should be avoided. Autostereoscopic display environments equipped with tracking systems enable users to experience 3D virtual environments more naturally, for instance via gestures, without annoying devices. Currently, however, these approaches are applied only to specially designed or adapted applications and lack universal usability. Although such systems provide enough space to support multiple users, additional costs and inconvenient instrumentation hinder the acceptance of these user interfaces. In this chapter we introduce new collaborative 3D user interface concepts for such setups in which only minimal instrumentation of the user is required, so that the strategies can be easily integrated into everyday working environments. To this end, we propose an interaction system and framework that allow displaying and interacting with both monoscopic and stereoscopic content in parallel. Furthermore, the setup enables multiple users to view the same data simultaneously. The challenges of combining mouse-, keyboard-, and gesture-based input paradigms in such an environment are pointed out, and novel interaction strategies are introduced.