Principles, interactions and devices for real-world immersive modeling

Abstract

Building a real-world immersive 3D modeling application is hard. In spite of the many supposed advantages of working in the virtual world, users quickly tire of waving their arms about and the resulting models remain simplistic at best. The dream of creation at the speed of thought has largely remained unfulfilled due to numerous factors: the lack of suitable menu and system controls, the inability to perform precise manipulations, the lack of numeric input, challenges with ergonomics, and difficulties with maintaining user focus and preserving immersion. Our research focuses on building virtual world applications that can go beyond the demo and be used to do real-world work. The goal is to develop interaction techniques that support the richness and complexity required to build complex 3D models, yet minimize expenditure of user energy and maximize user comfort. We present an approach that combines the natural and intuitive power of virtual reality (VR) interaction, the precision and control of 2D touch surfaces, and the richness of a commercial modeling package. We discuss the benefits of collocating 2D touch with 3D bimanual spatial input, the challenges in designing a custom controller to achieve this collocation, and the new avenues that this collocation creates. We describe our Touch Where You Can technique, which adapts the user interface to support a wide array of hand sizes, minimizing the ergonomic impact on the user. Finally, we demonstrate new interface designs that are better suited to the thumbs-only touch interactions favored by our system.
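The abstract's Touch Where You Can idea, adapting touch-target placement to each user's hand size, can be illustrated with a minimal sketch. This is a hypothetical implementation for illustration only, not the paper's actual algorithm: the function name, parameters, and the choice of a quarter-circle arc at 80% of measured reach are all assumptions.

```python
import math

def layout_thumb_reachable(buttons, anchor, reach_radius):
    """Place touch targets along an arc inside the thumb's measured reach.

    Hypothetical sketch: buttons is a list of labels, anchor is the (x, y)
    thumb pivot on the touch surface, and reach_radius is the user's
    measured comfortable reach in the same units. Returns a dict mapping
    each label to a position inside the reach circle.
    """
    n = len(buttons)
    # Spread targets over a quarter arc at 80% of maximum reach, so every
    # press stays comfortably inside the user's measured envelope.
    r = 0.8 * reach_radius
    placements = {}
    for i, label in enumerate(buttons):
        theta = (math.pi / 2) * (i / max(n - 1, 1))
        placements[label] = (anchor[0] + r * math.cos(theta),
                             anchor[1] + r * math.sin(theta))
    return placements
```

A calibration step (e.g., asking the user to sweep their thumb across the surface once) would supply `reach_radius` per user, after which all thumbs-only controls are laid out within that envelope.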
