Saliency-difference-based objective evaluation method for a superimposed screen of the HUD with various backgrounds

The head-up display (HUD) is an emerging device that projects information onto a transparent screen. HUDs have been used in airplanes and vehicles and are usually placed directly in front of the operator's view. In a vehicle, the driver sees not only the information on the HUD but also the background (the driving environment) through it. However, because the HUD is transparent, the projected information may interfere with colors in the background. For example, a red message on the HUD becomes less noticeable when it overlaps the red brake lights of the vehicle ahead. As a first step toward solving this issue, a way to evaluate the mutual interference between the information on the HUD and the background is needed. This paper therefore proposes a saliency-based evaluation method: the interference is evaluated by cutting the HUD region out of a saliency map of a measured image (HUD superimposed on the background) and comparing it with the HUD image itself.
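The comparison described above can be sketched in code. This is a minimal illustration under stated assumptions, not the paper's implementation: `saliency_map` is a simplified single-channel variant of frequency-tuned saliency (Achanta et al.), with a box blur standing in for the Gaussian blur and grayscale standing in for the Lab color space; the function and parameter names (`hud_visibility`, `hud_box`) are hypothetical.

```python
import numpy as np

def saliency_map(img):
    """Simplified frequency-tuned saliency: per-pixel absolute distance
    between a lightly blurred copy of the image and the image mean.
    (Grayscale stand-in for the Lab-space formulation.)"""
    h, w = img.shape
    padded = np.pad(img.astype(float), 1, mode="edge")
    # 3x3 box blur (assumption: stand-in for the small Gaussian blur)
    blurred = sum(padded[i:i + h, j:j + w]
                  for i in range(3) for j in range(3)) / 9.0
    return np.abs(blurred - img.mean())

def hud_visibility(measured, hud_template, hud_box):
    """Cut the HUD region out of the measured image's saliency map and
    average the saliency over the pixels where the HUD image actually
    draws content. A low score suggests the background is masking the
    projected information (i.e., strong mutual interference)."""
    y0, y1, x0, x1 = hud_box
    region = saliency_map(measured)[y0:y1, x0:x1]
    content_mask = hud_template > 0  # pixels carrying HUD information
    return float(region[content_mask].mean())
```

As a usage sketch, a bright HUD message placed over a dark scene scores high, while the same message over an equally bright scene (e.g., red text over red brake lights) scores near zero, flagging the interference the paper targets.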
