Real-time cloud-based 3D reconstruction and interaction with a stereo smartphone

In this demonstration, we present a novel cloud-based system that enables real-time 3D reconstruction and interaction for casual users with stereo smartphones. The client is an Android application that captures real-world scenes, displays the reconstructed 3D model, and supports interaction. The server receives the captured data from the client and reconstructs the 3D model in real time. While users capture data, both the reconstructed scene and the marked uncertainty regions are visualized on the phone, allowing users to continuously refine the reconstructed model through interaction. Because our system is built on smartphones, it is accessible to casual users, whereas many existing systems require extra or expensive specialized equipment. We also propose several methods to continuously repair the model through interaction and to address the challenges imposed by the phone's hardware limitations.
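To make the client-server split described above concrete, the following is a minimal sketch of what the phone-side loop could look like; the type and method names (StereoFrame, ModelUpdate, StereoCamera, ServerLink, PhoneDisplay) are illustrative assumptions rather than the demo's actual API.

```kotlin
// Hypothetical sketch of the client-side capture/upload/render loop described above.
// All names here are assumptions for illustration, not the system's real interfaces.

data class StereoFrame(val left: ByteArray, val right: ByteArray, val timestampMs: Long)
data class ModelUpdate(val mesh: ByteArray, val uncertaintyMask: ByteArray)

interface StereoCamera { fun capture(): StereoFrame }   // stereo capture on the phone
interface ServerLink {                                   // connection to the cloud server
    fun upload(frame: StereoFrame)                       // send captured stereo data
    fun latestModel(): ModelUpdate?                      // fetch latest reconstruction, if any
}
interface PhoneDisplay {                                 // on-device visualization
    fun render(update: ModelUpdate)                      // show model plus marked uncertainty regions
}

// Client loop: capture, upload, and visualize the current reconstruction so the user
// can steer further capture toward uncertain regions and keep refining the model.
fun runClient(camera: StereoCamera, server: ServerLink, display: PhoneDisplay, running: () -> Boolean) {
    while (running()) {
        val frame = camera.capture()
        server.upload(frame)                              // reconstruction itself runs server-side
        server.latestModel()?.let { display.render(it) }
    }
}
```

The point this sketch illustrates is the design choice emphasized above: the computationally heavy reconstruction runs in the cloud, while the phone only captures, uploads, and renders, which is what keeps the approach within the limits of commodity smartphone hardware.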
