A touchable virtual screen interaction system with a handheld Kinect camera

In recent years, augmented reality (AR) has been a research hotspot. AR emphasizes merging virtual objects with the real scene under the correct perspective relationship. However, most existing AR systems simply overlay virtual objects on the real scene image, so that if a real hand moves in front of a virtual object, the virtual object still covers the hand, producing an incorrect occlusion relationship. Moreover, most systems only present virtual information and prevent the user from touching or otherwise interacting with the virtual objects. In this paper, we build an AR system that achieves both mutual occlusion and interaction. Instead of using a structure-from-motion method, we use the depth images of a Kinect camera to integrate a 3D volume of the scene and to track the camera pose. We then propose a layer rendering method that uses the depth relationship between real and virtual surfaces to implement mutual-occlusion fusion. Finally, we realize collision detection and interaction through a fast voxel detection method: we define a neighbor cube for every virtual surface point and run the detection in parallel on the GPU.
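As an illustration of the depth-based layer rendering, a minimal per-pixel compositing sketch in CUDA is given below. The buffer names, the metre-valued depth convention, and the handling of invalid (zero) depth are assumptions for the sketch, not details taken from the paper; the idea is only that a virtual fragment is drawn where it lies in front of the reconstructed real surface.

```cuda
#include <cuda_runtime.h>
#include <cfloat>

// Per-pixel occlusion compositing (illustrative sketch).
// realColor/realDepth: Kinect view of the real scene (depth in metres, 0 = invalid).
// virtColor/virtDepth: virtual screen rendered with the tracked camera pose
//                      (FLT_MAX where no virtual fragment exists).
__global__ void composeWithOcclusion(const uchar4* realColor,
                                     const float*  realDepth,
                                     const uchar4* virtColor,
                                     const float*  virtDepth,
                                     uchar4*       outColor,
                                     int width, int height)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height) return;

    int idx = y * width + x;
    float dReal = realDepth[idx];
    float dVirt = virtDepth[idx];

    // Show the virtual fragment only where it is closer to the camera than the
    // real surface (e.g. behind the user's hand it stays hidden).
    bool virtVisible = (dVirt < FLT_MAX) && (dReal <= 0.0f || dVirt < dReal);
    outColor[idx] = virtVisible ? virtColor[idx] : realColor[idx];
}
```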
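The neighbor-cube collision test can likewise be sketched as a per-point kernel over a dense TSDF volume. The voxel-grid layout, the NEIGHBOR_RADIUS, and the SURFACE_EPS threshold below are illustrative choices, not the paper's exact parameters; the sketch only shows the principle that a virtual surface point collides with the real scene when a near-zero TSDF value appears in its small voxel neighborhood.

```cuda
#include <cuda_runtime.h>
#include <math.h>

#define NEIGHBOR_RADIUS 1      // half-size of the neighbor cube, in voxels (assumed)
#define SURFACE_EPS     0.5f   // |TSDF| below this counts as real surface (assumed)

// surfacePoints: virtual surface points already transformed into voxel coordinates.
// tsdf:          dense volume of size dim^3, indexed as tsdf[(z*dim + y)*dim + x].
// collisionFlags[i] is set to 1 if point i touches the reconstructed real surface.
__global__ void detectCollisions(const float3* surfacePoints, int numPoints,
                                 const float* tsdf, int dim,
                                 int* collisionFlags)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= numPoints) return;

    int vx = (int)floorf(surfacePoints[i].x);
    int vy = (int)floorf(surfacePoints[i].y);
    int vz = (int)floorf(surfacePoints[i].z);

    int hit = 0;
    // Scan the small cube of voxels around the virtual surface point; a
    // near-zero TSDF value means the real surface passes through that voxel.
    for (int dz = -NEIGHBOR_RADIUS; dz <= NEIGHBOR_RADIUS && !hit; ++dz)
        for (int dy = -NEIGHBOR_RADIUS; dy <= NEIGHBOR_RADIUS && !hit; ++dy)
            for (int dx = -NEIGHBOR_RADIUS; dx <= NEIGHBOR_RADIUS && !hit; ++dx) {
                int x = vx + dx, y = vy + dy, z = vz + dz;
                if (x < 0 || y < 0 || z < 0 || x >= dim || y >= dim || z >= dim)
                    continue;
                if (fabsf(tsdf[(z * dim + y) * dim + x]) < SURFACE_EPS)
                    hit = 1;
            }
    collisionFlags[i] = hit;
}
```

Because each thread inspects only a fixed-size neighborhood, the cost of the test grows with the number of virtual surface points rather than with the volume size, which is what makes a per-point GPU launch attractive here.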
