Multi-sensor fusion for interactive visual computing in mixed environments

Mobile Augmented Reality, an emerging application for handheld devices, explores more natural interactions between real and virtual environments. To achieve accurate system response and real-time object manipulation, extensive efforts have been made to estimate six-degree-of-freedom (6-DoF) pose and to extract robust features for tracking. However, many challenges remain in delivering a rich user experience. To allow a seamless transition between outdoor and indoor service, we investigated and integrated several sensing techniques: GPS, wireless, Inertial Measurement Units (IMUs), and optical tracking. A parallel tracking and matching scheme is presented to address the speed-accuracy tradeoff. Two prototypes, fine-scale mirror-world navigation and context-aware troubleshooting, have been developed to demonstrate the suitability of our approach.
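To make the fusion idea concrete, the sketch below shows one common way such sensors are combined: a complementary filter that blends a high-rate but drifting IMU heading estimate with a low-rate absolute fix (e.g. from GPS or an optical marker). This is a hypothetical illustration under assumed names and parameters, not the scheme proposed in the paper.

```python
# Minimal complementary-filter sketch (illustrative only, not the
# paper's actual fusion algorithm).

def complementary_fuse(imu_yaw, absolute_yaw, alpha=0.98):
    # alpha near 1 trusts the smooth IMU estimate in the short term;
    # the small (1 - alpha) share of the absolute fix gradually
    # removes accumulated drift.
    return alpha * imu_yaw + (1.0 - alpha) * absolute_yaw

# Toy loop: the heading estimate starts 10 degrees off; repeated
# absolute fixes at the true heading (90 degrees) pull it back.
fused = 80.0
for _ in range(300):
    fused = complementary_fuse(fused, 90.0)
print(fused)  # converges toward 90.0
```

In a real system the IMU term would be the previous estimate propagated by gyroscope rates, and alpha would be tuned to the relative noise of the two sources; a Kalman filter generalizes this weighting.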