Pseudo-transparent tablet based on 3D feature tracking

This demonstration shows geometrically consistent image rendering that realizes pseudo-transparency in tablet-based augmented reality. The rendering method is based on a homography estimated by feature tracking with the on-board rear camera and on face detection with the on-board front camera. This configuration is the most practical one for typical off-the-shelf tablets, since it requires neither special hardware nor a specially prepared environment. Although local misalignment in images rectified by a homography is an unavoidable artifact in non-planar scenes, the rendered images remain globally consistent with the real scene. This is the first demonstration in which such pseudo-transparency can be experienced in an unprepared environment.
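
A minimal sketch of the two-camera pipeline is given below, assuming OpenCV, Haar-cascade face detection on the front camera, RANSAC homography estimation from tracked rear-camera feature correspondences, and a simple translational head-coupling term. All function names, parameters, and the coupling gain are illustrative assumptions, not the authors' implementation.

import cv2
import numpy as np

# Front-camera face detector (standard OpenCV Haar cascade, used here only
# as a stand-in for whatever face detector the tablet provides).
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def estimate_eye_offset(front_frame):
    """Approximate the user's viewpoint from front-camera face detection.

    Returns the offset of the face centre from the image centre, in pixels.
    A full system would convert this to a 3D eye position using the detected
    face size and the front camera's intrinsics.
    """
    gray = cv2.cvtColor(front_frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, 1.2, 5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # largest face
    cx, cy = x + w / 2.0, y + h / 2.0
    return np.array([cx - gray.shape[1] / 2.0, cy - gray.shape[0] / 2.0])

def estimate_scene_homography(pts_ref, pts_cur):
    """Homography of the dominant scene plane from tracked rear-camera features.

    pts_ref / pts_cur are Nx2 arrays of corresponding points in a reference
    frame and the current frame; RANSAC rejects outlier correspondences.
    """
    H, _ = cv2.findHomography(pts_cur.astype(np.float64),
                              pts_ref.astype(np.float64),
                              cv2.RANSAC, 3.0)
    return H

def render_pseudo_transparent(rear_frame, H_scene, eye_offset, gain=0.5):
    """Warp the rear-camera image so the display approximates a window.

    The scene homography globally aligns the image with the reference view;
    the eye offset adds a viewpoint-dependent shift so the rendered view
    follows the user's head. The simple translational coupling (scaled by
    `gain`) is an assumption for illustration; local misalignment remains
    wherever the scene is not planar.
    """
    h, w = rear_frame.shape[:2]
    H = np.eye(3) if H_scene is None else H_scene
    if eye_offset is not None:
        T = np.array([[1.0, 0.0, -gain * eye_offset[0]],
                      [0.0, 1.0, -gain * eye_offset[1]],
                      [0.0, 0.0, 1.0]])
        H = T @ H
    return cv2.warpPerspective(rear_frame, H, (w, h))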
