A mobile location search system with active query sensing

How should a second query be taken once the first query fails in mobile location search based on visual recognition? In this demo, we describe a mobile search system with a unique Active Query Sensing (AQS) function that intelligently guides the mobile user in taking a successful second query. The suggestion is built on a scalable visual matching system covering over 0.3 million street-view reference images of New York City, where each location is associated with multiple surrounding views and a panorama. During online search, once the initial query fails, the system performs online analysis and suggests that the mobile user turn to the most discriminative viewing angle to take the second visual query, from which the search performance is expected to improve greatly. The AQS suggestion combines offline salient view discovery with online viewing angle prediction and an intelligent turning decision. Our experiments show that AQS can improve mobile location search with a performance gain as high as 100%, reducing the failure rate to only 12% after the second visual query.
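The core of the turning suggestion can be illustrated with a minimal sketch: given offline saliency scores for each viewing angle at the candidate location (how discriminative that view is in the reference index) and the user's current heading, pick the angle that maximizes discriminability, optionally penalized by how far the user must turn. The function name, the score dictionary, and the `turn_cost` penalty are illustrative assumptions, not the paper's actual formulation.

```python
import math

def suggest_next_view(current_heading_deg, view_scores, turn_cost=0.0):
    """Suggest the viewing angle for the second query (illustrative sketch).

    view_scores: dict mapping a heading in degrees -> offline saliency score,
    i.e., how uniquely that view matches its true location in the index.
    turn_cost: optional penalty per half-turn (180 degrees) of required turning.
    Returns the heading with the best penalized score.
    """
    best_angle, best_value = None, -math.inf
    for angle, score in view_scores.items():
        # Angular distance on a circle, normalized to [0, 180] degrees.
        turn = abs((angle - current_heading_deg + 180) % 360 - 180)
        value = score - turn_cost * (turn / 180.0)
        if value > best_value:
            best_angle, best_value = angle, value
    return best_angle
```

With `turn_cost=0` this reduces to picking the globally most salient view; a nonzero penalty trades discriminability against the effort of turning, mirroring the "intelligent turning decision" described above.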
