Transitional Augmented Reality navigation for live captured scenes

Augmented Reality (AR) applications require knowledge about the real-world environment in which they are used. This knowledge is often gathered while developing the AR application and stored for later use. Consequently, changes to the real world lead to a mismatch between the previously recorded data and the current environment. New capturing techniques based on dense Simultaneous Localization and Mapping (SLAM) not only allow users to capture real-world scenes at run-time, but also enable them to capture changes of the world. However, instead of using previously recorded and prepared scenes, users must interact with an unprepared environment. In this paper, we present a set of new interaction techniques that support users in handling captured real-world environments. The techniques present virtual viewpoints of the scene based on a scene analysis and provide natural transitions between the AR view and the virtual viewpoints. We demonstrate our approach with a SLAM-based prototype that allows us to capture a real-world scene, and we describe example applications of our system.
