Inertial Guided Visual Sample Consensus-based wearable orientation estimation for body motion tracking

This paper presents a novel orientation estimation scheme for human body motion tracking based on an Inertial Guided Visual SAmple Consensus (IGVSAC) strategy. Unlike traditional vision-based orientation estimation methods, which remove outliers among putative image-pair correspondences with hypothesize-and-verify models such as the costly RANSAC, our approach exploits the motion prior (i.e., rotation and translation) deduced from a quick-response Inertial Measurement Unit (IMU) as the initial body pose, helping the visual sensor identify hidden outliers and thereby avoiding the main drawback of sample-and-consensus models. Moreover, the IGVSAC algorithm maintains estimation accuracy even when a large proportion of the correspondences are outliers. In turn, the orientation estimated by the visual sensor corrects the IMU estimates through a feedback-control scheme, which addresses the IMU's inherent long-term drift. Extensive experiments verify the effectiveness and robustness of the IGVSAC algorithm, and comparisons with the highly accurate VICON optical motion tracking system show that our orientation estimation scheme is well suited to capturing human body joints.
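The central idea, replacing RANSAC's randomly sampled pose hypotheses with the single pose predicted by the IMU, can be illustrated with a minimal sketch. This is not the authors' implementation: the function name `igvsac_inliers`, the epipolar-residual inlier test, and the threshold value are assumptions made here for illustration only, using the standard essential-matrix constraint for calibrated cameras.

```python
import numpy as np

def skew(v):
    """3x3 skew-symmetric matrix so that skew(v) @ u == np.cross(v, u)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def igvsac_inliers(pts1, pts2, R_imu, t_imu, thresh=1e-3):
    """Classify putative correspondences as inliers using the IMU-predicted
    pose (R_imu, t_imu) as the hypothesis, instead of random sampling.

    pts1, pts2 : (N, 2) normalized (calibrated) image coordinates.
    Returns a boolean inlier mask of shape (N,).
    """
    # Essential matrix implied by the IMU prior: E = [t]_x R.
    E = skew(t_imu / np.linalg.norm(t_imu)) @ R_imu
    x1 = np.hstack([pts1, np.ones((len(pts1), 1))])  # homogeneous coords
    x2 = np.hstack([pts2, np.ones((len(pts2), 1))])
    # Algebraic epipolar residual |x2^T E x1| for each correspondence;
    # correspondences consistent with the prior pose give residuals near zero.
    resid = np.abs(np.einsum('ni,ij,nj->n', x2, E, x1))
    return resid < thresh

# Synthetic check: two views of random 3D points, with the first 10
# matches deliberately corrupted to act as outliers.
rng = np.random.default_rng(0)
X = rng.uniform([-1, -1, 4], [1, 1, 8], (50, 3))      # points in front of both cameras
theta = 0.1                                           # small rotation about z
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0, 0.0, 1.0]])
t = np.array([0.5, 0.0, 0.0])
X2 = X @ R.T + t                                      # points in second camera frame
pts1 = X[:, :2] / X[:, 2:]                            # normalized projections
pts2 = X2[:, :2] / X2[:, 2:]
pts2[:10] += rng.uniform(0.05, 0.2, (10, 2))          # corrupt first 10 matches
mask = igvsac_inliers(pts1, pts2, R, t)
```

Because the hypothesis comes directly from the IMU, a single verification pass separates inliers from outliers, whereas RANSAC must repeatedly sample minimal sets and re-verify; the paper's feedback correction of IMU drift would then refine the pose from the surviving inliers.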