An Integrated MEMS IMU/Camera System for Pedestrian Indoor Navigation Using Smartphones

Pedestrian navigation remains a challenge in the navigation field, especially in indoor environments. Previous research has demonstrated that the integration of a camera and an IMU can provide navigation information indoors. In this research, we employ the iPhone 4 from Apple Inc. as the platform for pedestrian indoor navigation. The iPhone 4 contains a 5-megapixel camera, a three-axis accelerometer, and a three-axis gyroscope. The Allan variance method is employed in this paper to characterize the various error terms of both the accelerometer and the gyroscope, generating new knowledge of smartphone inertial sensor errors. A tightly coupled integration algorithm is implemented using an extended Kalman filter to fuse the image data and the inertial data and derive the optimal navigation solution. The vision aiding information is retrieved from successive images captured by the iPhone 4 camera at a fixed frame rate, such as 25 Hz. A standard computer vision algorithm is adopted for image feature detection and matching. An indoor field test is conducted to evaluate the navigation performance of the proposed algorithm. The results demonstrate a significant improvement in navigation performance compared to the stand-alone INS.
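
As an illustration of the sensor characterization step mentioned above, the following is a minimal sketch of an overlapping Allan variance computation for a single gyroscope axis. The sampling rate, data-record length, and simulated noise parameters are assumptions for illustration only and are not taken from the paper.

```python
import numpy as np

def allan_variance(rates, fs, taus):
    """Overlapping Allan variance of a rate signal (e.g. gyro output).

    rates : 1-D array of sensor samples
    fs    : sampling frequency in Hz (assumed, e.g. 100 Hz)
    taus  : iterable of averaging times in seconds
    """
    theta = np.cumsum(rates) / fs          # integrate rate to angle
    n = len(theta)
    results = []
    for tau in taus:
        m = int(tau * fs)                  # samples per cluster
        if m < 1 or 2 * m >= n:
            continue
        # overlapping second difference of the integrated signal
        d = theta[2 * m:] - 2.0 * theta[m:n - m] + theta[:n - 2 * m]
        avar = np.sum(d ** 2) / (2.0 * tau ** 2 * (n - 2 * m))
        results.append((tau, avar))
    return results

# Example: characterize a simulated gyro axis (constant bias + white noise)
fs = 100.0                                  # assumed sampling rate in Hz
rng = np.random.default_rng(0)
gyro = 0.01 + 0.05 * rng.standard_normal(int(fs * 3600))  # 1 hour of data
taus = np.logspace(-1, 2, 30)               # averaging times from 0.1 s to 100 s
for tau, avar in allan_variance(gyro, fs, taus)[:5]:
    print(f"tau = {tau:6.2f} s, Allan deviation = {np.sqrt(avar):.5f}")
```

The vision aiding step relies on detecting and matching features between successive frames. The sketch below uses OpenCV's ORB detector with a brute-force Hamming matcher as a stand-in for the "standard computer vision algorithm" referred to above; the abstract does not name a specific detector, so this particular choice is an assumption.

```python
import cv2

def match_features(img_prev, img_curr, max_matches=100):
    """Detect and match ORB features between two consecutive grayscale frames."""
    orb = cv2.ORB_create(nfeatures=500)
    kp1, des1 = orb.detectAndCompute(img_prev, None)
    kp2, des2 = orb.detectAndCompute(img_curr, None)
    if des1 is None or des2 is None:
        return []
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    # Return pixel coordinate pairs (previous frame, current frame)
    return [(kp1[m.queryIdx].pt, kp2[m.trainIdx].pt) for m in matches[:max_matches]]
```

The tightly coupled fusion itself reduces to a standard extended Kalman filter prediction/update cycle in which the matched image features drive the measurement update. A generic sketch of that cycle is shown below; the state layout, transition matrix, and measurement model are placeholders, not the specific filter derived in the paper.

```python
import numpy as np

def ekf_predict(x, P, F, Q):
    """Propagate the error state and covariance with the linearized INS model F."""
    x = F @ x
    P = F @ P @ F.T + Q
    return x, P

def ekf_update(x, P, z, h, H, R):
    """Correct the state with a vision-derived measurement z and Jacobian H."""
    y = z - h(x)                          # innovation
    S = H @ P @ H.T + R                   # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P
```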