DualCAD: Integrating Augmented Reality with a Desktop GUI and Smartphone Interaction

Head-Mounted Displays (HMDs) combined with input devices offering three or more degrees of freedom (DoF) enable rapid manipulation of stereoscopic 3D content. However, such input is typically performed with the hands in midair and therefore lacks precision and stability. Moreover, recent consumer-grade HMDs suffer from limited angular resolution and/or limited field of view compared to a desktop monitor. We present DualCAD, a system that implements two solutions to these problems. First, the user may freely switch at runtime between an augmented reality HMD mode and a traditional desktop mode with precise 2D mouse input and an external desktop monitor. Second, while in the augmented reality HMD mode, the user holds a smartphone, tracked with 6 DoF, in their non-dominant hand, allowing it to serve both as a complementary high-resolution display and as an alternative input device for stylus or multitouch input. We present two novel bimanual interaction techniques that leverage these properties of the smartphone, and report initial user feedback.
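To make the described architecture concrete, here is a minimal sketch, in Python, of the kind of controller such a dual-mode system implies: a runtime switch between the AR HMD mode and the desktop mode, plus routing of the smartphone's 6-DoF pose. All names (DualModeSession, on_phone_pose, and so on) are illustrative assumptions, not the paper's actual implementation.

```python
from dataclasses import dataclass
from enum import Enum, auto


class Mode(Enum):
    AR_HMD = auto()   # stereoscopic AR rendering; midair + smartphone input
    DESKTOP = auto()  # external monitor; precise 2D mouse input


@dataclass
class Pose6DoF:
    position: tuple[float, float, float]            # (x, y, z) in world space
    orientation: tuple[float, float, float, float]  # unit quaternion (x, y, z, w)


class DualModeSession:
    """Hypothetical controller that routes input and output by current mode.

    The shared 3D scene model persists across switches; only the display
    target and the active input mapping change.
    """

    def __init__(self) -> None:
        self.mode = Mode.DESKTOP

    def toggle_mode(self) -> None:
        # Runtime switch between the AR HMD mode and the desktop mode.
        self.mode = Mode.DESKTOP if self.mode is Mode.AR_HMD else Mode.AR_HMD

    def on_phone_pose(self, pose: Pose6DoF) -> None:
        # In HMD mode, the 6-DoF-tracked phone held in the non-dominant hand
        # doubles as a high-resolution display and a stylus/multitouch input
        # surface; in desktop mode its pose is ignored.
        if self.mode is Mode.AR_HMD:
            self._render_phone_view(pose)

    def _render_phone_view(self, pose: Pose6DoF) -> None:
        ...  # draw the scene region aligned with the phone's screen
```

In use, toggle_mode() would fire when the user dons or removes the HMD, and on_phone_pose() would be fed by whatever 6-DoF tracking the system employs; both hooks are assumptions for illustration.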
