Fuzzy Logic Based Sensor Fusion for Accurate Tracking

Accuracy and tracking update rate play a vital role in determining the quality of Augmented Reality (AR) and Virtual Reality (VR) applications. Applications such as soldier training, gaming, simulation, and virtual conferencing need high-accuracy tracking with an update frequency above 20 Hz for an immersive experience. Current research techniques combine multiple sensors, such as cameras, infrared sensors, magnetometers, and Inertial Measurement Units (IMUs), to achieve this goal. In this paper, we develop and validate a novel algorithm for accurate positioning and tracking using inertial and vision-based sensing. The inertial sensing uses accelerometers and gyroscopes to measure angular rates and accelerations in the body-fixed frame and computes orientations and positions by integration. The vision-based sensing uses a camera and image-processing techniques to compute position and orientation. The sensor fusion algorithm proposed in this work exploits the complementary characteristics of these two independent systems to compute an accurate tracking solution and to minimize errors due to sensor noise, drift, and the different update rates of the camera and IMU. The algorithm is computationally efficient, is implemented on low-cost hardware, and is capable of update rates up to 100 Hz. The position and orientation accuracy of the sensor fusion is within 6 mm and 1.5°. By using fuzzy rule sets and adaptive filtering of the data, we reduce the computational requirements below those of conventional methods such as Kalman filtering. We have compared the accuracy of this sensor fusion algorithm with a commercial infrared tracking system: the accuracy of this COTS IMU-and-camera fusion approach matches that of the commercial tracking system at a fraction of the cost.
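The fusion scheme the abstract describes, fast IMU integration corrected by slower camera pose fixes, with a fuzzy rule set deciding how strongly to trust each source, can be illustrated with a minimal single-axis sketch. This is not the paper's actual algorithm; the membership breakpoints (`small`, `large`) and gain values are hypothetical, and a real implementation would work on full quaternion attitude and 3-D position rather than one angle:

```python
def fuzzy_gain(innovation_deg, small=0.5, large=5.0,
               gain_small=0.10, gain_large=0.01):
    """Piecewise-linear (triangular-membership style) rule set:
    IF |innovation| is SMALL THEN gain is HIGH (camera consistent; correct IMU drift)
    IF |innovation| is LARGE THEN gain is LOW  (likely vision outlier; trust the IMU)
    Intermediate discrepancies interpolate linearly between the two gains."""
    x = abs(innovation_deg)
    if x <= small:
        return gain_small
    if x >= large:
        return gain_large
    t = (x - small) / (large - small)
    return gain_small + t * (gain_large - gain_small)

def fuse_step(prev_angle, gyro_rate, dt, cam_angle=None):
    """One complementary-filter step for a single orientation angle (degrees).
    The gyro is integrated every cycle (e.g. 100 Hz); the camera correction is
    applied only on cycles where a frame is available (camera is slower)."""
    pred = prev_angle + gyro_rate * dt      # dead-reckoned IMU prediction
    if cam_angle is None:
        return pred                          # no camera frame this cycle
    innovation = cam_angle - pred            # camera vs. IMU discrepancy
    return pred + fuzzy_gain(innovation) * innovation
```

Because the correction reduces to a table lookup and one multiply-add per axis, this style of fuzzy-weighted blending avoids the covariance propagation of a Kalman filter, which is the computational saving the abstract alludes to.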
