A low-complexity yet accurate calibration method for automotive augmented reality head-up displays

Automotive augmented reality head-up displays (AR-HUDs) superimpose driving-related information onto the real world within the driver's direct line of sight. A key prerequisite for an immersive AR experience is highly precise calibration. State-of-the-art methods require large targets and considerable space in front of the vehicle, or complex special-purpose equipment, which is inconvenient in both factories and workshops. In this paper, we propose a low-complexity yet accurate calibration method that uses only a small sheet of patterned paper as the target, laid directly on the windscreen. The full field of view (FOV) can be calibrated, with optical distortion corrected by extracted warping maps. Changing driver viewpoints are handled by interpolating both the projection parameters and the distortion models. The angular reprojection error stays within 0.04°, while the runtime is limited to at most 1 minute per viewpoint. Our method is highly applicable in the automotive industry thanks to both its reduced target complexity and its competitive reprojection errors. Moreover, owing to the reduced effort and simplified equipment, our method opens a way for customers to recalibrate their AR-HUDs themselves.
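The abstract mentions three building blocks: pre-distorting HUD content with extracted warping maps, interpolating calibration data between driver viewpoints, and reporting the residual as an angular reprojection error. The sketch below illustrates these ideas in minimal NumPy; all function names are hypothetical, the interpolation is plain linear blending, and the warp uses nearest-neighbour sampling — the paper's actual pipeline may differ in all of these choices.

```python
import numpy as np

def interp_params(p0, p1, t):
    """Linearly blend calibration parameters (e.g. projection
    parameters or distortion-map entries) between two calibrated
    viewpoints; t in [0, 1] is the position of the current driver
    viewpoint between them. Hypothetical helper, not the paper's
    exact interpolation scheme."""
    return (1.0 - t) * p0 + t * p1

def apply_warp(image, map_x, map_y):
    """Resample an image through a per-pixel warping map using
    nearest-neighbour lookup: output pixel (y, x) is taken from
    input position (map_y[y, x], map_x[y, x]). Such a map can
    pre-distort HUD content so it appears undistorted after the
    combiner optics."""
    h, w = image.shape[:2]
    xs = np.clip(np.rint(map_x).astype(int), 0, w - 1)
    ys = np.clip(np.rint(map_y).astype(int), 0, h - 1)
    return image[ys, xs]

def angular_error(deg_per_px, reproj_err_px):
    """Convert a reprojection error in display pixels into degrees,
    given the display's angular resolution (degrees per pixel)."""
    return deg_per_px * reproj_err_px
```

As a plausibility check on the reported accuracy: a display resolving 0.01° per pixel and a 4 px reprojection residual would give `angular_error(0.01, 4.0) == 0.04`, i.e. the 0.04° bound quoted in the abstract (the 0.01°/px figure is an assumed example value, not from the paper).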
