In-Place Natural and Effortless Navigation for Large Industrial Scenarios

Here we address the problem of navigating virtual environments presented on fixed displays (e.g., projections and TVs) using natural gestures. Gesture metaphors have proven to be a powerful tool for human-computer interaction, with examples ranging from smartphones to state-of-the-art research projects such as HoloDesk from Microsoft Research. However, when gestures are used for navigation in virtual environments, a specific limitation arises with respect to the user's movement in real space. The gestures should give the user a way to turn the virtual camera without losing sight of the screen. Moreover, the user must be able to cover long distances in the virtual environment without crossing real-world boundaries and without becoming fatigued.

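To make the two requirements above concrete, the following is a minimal, hypothetical sketch of an in-place navigation loop: a lateral hand offset turns the virtual camera while the user keeps facing the screen, and a forward torso lean drives rate-controlled translation, so arbitrarily long virtual distances require no physical walking. The SkeletonFrame fields, the lean-to-speed and hand-to-yaw mappings, and all tuning constants are illustrative assumptions and not the specific technique proposed in the paper.

import math
from dataclasses import dataclass

# Hypothetical skeleton frame: only the joints this sketch needs.
# In practice these values would come from a depth-camera SDK;
# the field names here are assumptions for illustration.
@dataclass
class SkeletonFrame:
    torso_lean_forward: float   # radians, positive = leaning toward the screen
    hand_offset_x: float        # metres, dominant hand left/right of the shoulder

@dataclass
class VirtualCamera:
    x: float = 0.0
    z: float = 0.0
    yaw: float = 0.0            # radians, 0 = facing the screen's forward axis

# Tuning constants (illustrative values, not from the paper).
LEAN_DEADZONE = 0.08              # ignore small postural sway
HAND_DEADZONE = 0.10
MAX_SPEED = 3.0                   # m/s in the virtual environment
MAX_TURN_RATE = math.radians(60)  # rad/s

def update_camera(cam: VirtualCamera, frame: SkeletonFrame, dt: float) -> None:
    """Rate-controlled, in-place navigation:
    - a lateral hand offset drives camera yaw,
    - a forward lean drives translation along the current view direction,
    so the user never has to physically walk or turn away from the screen."""
    # Turn: hand offset beyond the dead zone maps linearly to a yaw rate.
    turn = frame.hand_offset_x
    if abs(turn) > HAND_DEADZONE:
        cam.yaw += math.copysign(min(abs(turn), 0.5) / 0.5, turn) * MAX_TURN_RATE * dt

    # Move: forward lean beyond the dead zone maps to speed along the view direction.
    lean = frame.torso_lean_forward
    if lean > LEAN_DEADZONE:
        speed = min(lean, 0.4) / 0.4 * MAX_SPEED
        cam.x += math.sin(cam.yaw) * speed * dt
        cam.z += math.cos(cam.yaw) * speed * dt

if __name__ == "__main__":
    cam = VirtualCamera()
    # Simulate 2 s of leaning forward while holding the hand slightly to the right.
    for _ in range(120):
        update_camera(cam, SkeletonFrame(torso_lean_forward=0.3, hand_offset_x=0.2), dt=1 / 60)
    print(f"x={cam.x:.2f} m, z={cam.z:.2f} m, yaw={math.degrees(cam.yaw):.1f} deg")

In a sketch like this, the key design choice is rate control: posture is mapped to velocity rather than position, which is what keeps the user inside the real workspace and limits fatigue, since holding a small lean or hand offset is less tiring than repeatedly stepping or walking in place.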