Streaming mobile augmented reality on mobile phones

Continuous recognition and tracking of objects in live video captured on a mobile device enables real-time user interaction. We demonstrate a streaming mobile augmented reality system with one second of end-to-end latency. User interest is inferred automatically from camera movements, so the user never has to press a button. The system identifies and tracks book and CD covers in real time in the phone's viewfinder: efficient motion estimation runs at 30 frames per second on the phone, while fast search through a database of 20,000 images runs on a server.
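The abstract does not spell out how user interest is inferred from camera movements. A minimal sketch of one plausible trigger rule, assuming per-frame motion magnitudes are already available from the on-phone motion estimator (the function name, threshold, and frame count below are illustrative, not from the paper): fire a server query once the camera has been held nearly still for a short run of frames.

```python
def should_query(motions, still_thresh=2.0, still_frames=15):
    """Hypothetical interest trigger: return True once the camera has been
    nearly still (motion magnitude below `still_thresh` pixels) for
    `still_frames` consecutive frames; otherwise return False.

    `motions` is an iterable of frame-to-frame motion magnitudes, e.g. the
    length of the dominant translation estimated between viewfinder frames.
    """
    run = 0
    for m in motions:
        # extend the stillness run, or reset it on significant motion
        run = run + 1 if m < still_thresh else 0
        if run >= still_frames:
            return True
    return False


# Example: camera pans, then is held still for half a second at 30 fps.
panning_then_still = [12.0] * 10 + [0.5] * 15
always_panning = [12.0] * 30
```

At 30 fps, 15 still frames correspond to holding the phone steady for about half a second, which matches the button-free interaction the abstract describes; the real system may combine this with other cues.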
