Non-deceiving features in fused optical flow gyroscopes

Standard gyroscopes found in mobile phones tend to have a bias in their output angular rate. These phones are also equipped with high-resolution cameras and multimedia-ready processors capable of computing optical flow from captured images. To reduce the gyroscope bias, and thereby obtain an integrated angle that drifts less, camera-based optical flow values can aid the regular gyroscope readings to produce a robust output. In this paper an adaptive algorithm is presented that fuses optical flow angular rates with the gyroscope output while eliminating deceiving tracked features on the camera images, caused by moving objects in the camera's field of view. Measurements were recorded with a robotic arm using an Android phone. Further simulations were run in MATLAB to investigate the limits of the algorithm.
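To make the idea concrete, the following is a minimal sketch (not the paper's actual algorithm) of how per-feature optical-flow angular rates might aid a biased gyroscope sample while rejecting deceiving features; the function name and the `reject_thresh` and `bias_gain` parameters are illustrative assumptions, not values from the paper.

```python
import numpy as np

def fuse_gyro_optical_flow(gyro_rate, flow_rates, bias_est,
                           reject_thresh=0.2, bias_gain=0.01):
    """Fuse one gyroscope sample with per-feature optical-flow angular rates.

    gyro_rate  : raw gyroscope angular rate [rad/s]
    flow_rates : angular rates implied by individual tracked features [rad/s]
    bias_est   : current gyroscope bias estimate [rad/s]
    Returns (corrected_rate, updated_bias, kept_mask).
    """
    flow_rates = np.asarray(flow_rates, dtype=float)
    corrected = gyro_rate - bias_est

    # Reject "deceiving" features: those whose implied angular rate deviates
    # too far from the bias-corrected gyro rate (e.g. features tracked on
    # moving objects rather than the static scene).
    kept = np.abs(flow_rates - corrected) < reject_thresh
    if not kept.any():
        # No trustworthy features in this frame: fall back to the gyro alone.
        return corrected, bias_est, kept

    flow_rate = np.median(flow_rates[kept])

    # Slowly pull the bias estimate toward the gyro/flow discrepancy,
    # so the integrated angle drifts less over time.
    bias_est += bias_gain * ((gyro_rate - flow_rate) - bias_est)
    return gyro_rate - bias_est, bias_est, kept
```

In this sketch the rejection step plays the role of eliminating deceiving features, and the slow bias update is what reduces the drift of the integrated angle; the paper's adaptive algorithm may differ in both respects.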