Marker position estimation in uncontrolled environments for augmented reality

Optical tracking information in manufacturing can provide valuable support and time savings for autonomous operations, but adverse environmental conditions limit the performance of vision systems. This work proposes a method for estimating object position in a semi-controlled environment where lighting conditions change dynamically. The method incorporates regression analysis that combines light measurements with an augmented reality (AR) system. AR overlays virtual objects on the real environment: an AR application uses a video camera to capture an image containing a marker and places a virtual object on it, giving the user an enriched view of the scene. Positioning a virtual object requires a tracking system that estimates the marker's position with respect to the camera coordinate frame. Most research on tracking systems for AR is conducted under controlled conditions, yet marker tracking is sensitive to variations in lighting in real environments. To address this problem, a regression-based method is proposed that takes lighting conditions into account and yields a better estimate of the marker position. The approach improves the accuracy of marker position estimation under different lighting conditions; experimental data obtained in a laboratory setting with varying light conditions are fitted by the model with an accuracy of 99%.
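The abstract does not specify the exact regression model, so the following is a minimal sketch of the general idea only: fit a regression that maps the measured illuminance and the raw marker-distance estimate from the tracker to a corrected distance. The calibration arrays (`lux`, `raw_distance`, `true_distance`), the feature construction, and the use of scikit-learn's `LinearRegression` are illustrative assumptions, not the paper's actual model.

```python
# Minimal sketch (assumed): correct a marker-distance estimate using a
# regression on measured illuminance. Calibration data below are hypothetical.
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical calibration measurements collected in the lab:
# illuminance (lux), raw distance reported by the marker tracker (mm),
# and ground-truth distance (mm).
lux = np.array([120.0, 300.0, 550.0, 800.0, 1100.0])
raw_distance = np.array([498.0, 503.0, 510.0, 521.0, 535.0])
true_distance = np.full_like(raw_distance, 500.0)

# Features: raw estimate, illuminance, and their interaction term.
X = np.column_stack([raw_distance, lux, raw_distance * lux])
model = LinearRegression().fit(X, true_distance)

def corrected_distance(raw_mm: float, lux_now: float) -> float:
    """Apply the lighting-aware correction to a new tracker reading."""
    features = np.array([[raw_mm, lux_now, raw_mm * lux_now]])
    return float(model.predict(features)[0])

print(corrected_distance(512.0, 650.0))
```

In use, the corrected estimate would be fed back to the AR renderer in place of the raw tracker output; the choice of features and regressor here is only one plausible way to realize the lighting-aware correction described above.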
