TrackCap: Enabling Smartphones for 3D Interaction on Mobile Head-Mounted Displays

The latest generation of consumer head-mounted displays (HMDs) includes self-contained inside-out tracking of head motion, which makes them suitable for mobile applications. However, 3D tracking of input devices is either not supported at all or requires keeping the device in sight, so that it can be observed by a sensor mounted on the HMD. Both limitations make natural interaction cumbersome in mobile applications. TrackCap, a novel approach to 3D tracking of input devices, turns a conventional smartphone into a precise 6DOF input device for an HMD user. The device can be conveniently operated both inside and outside the HMD's field of view, and it provides additional 2D input and output capabilities.
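To make the "6DOF input device" notion concrete, the sketch below shows how a tracked pose (rotation plus translation) maps a point from device space into world space. This is a generic illustration, not the paper's method: the names `rot_z` and `apply_pose` and the row-major matrix layout are assumptions for the example.

```python
import math

# Hypothetical sketch (not from the paper): applying a 6DOF pose.
# A tracked input device reports a rotation R (3x3) and a translation
# t = (x, y, z); world = R * p + t maps a device-space point to world space.

def rot_z(deg):
    """Rotation matrix (3x3, row-major) about the z axis."""
    a = math.radians(deg)
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def apply_pose(R, t, p):
    """Transform a single 3D point p by the pose (R, t)."""
    return tuple(sum(R[i][j] * p[j] for j in range(3)) + t[i] for i in range(3))

# Example: device rotated 90 degrees about z, offset one unit along x.
R = rot_z(90.0)
t = (1.0, 0.0, 0.0)
print(apply_pose(R, t, (1.0, 0.0, 0.0)))  # approximately (1.0, 1.0, 0.0)
```

In practice a tracker such as TrackCap would supply `R` and `t` each frame; the same composition then places rays, cursors, or grabbed objects relative to the HMD's world coordinates.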
