The effect of sound on visual gaze when looking at videos

This paper analyzes the effect of sound on visual gaze when looking at videos, with the aim of improving the prediction of eye positions. First, an audio-visual experiment was designed with two groups of participants, one under an audio-visual (AV) condition and one under a visual-only (V) condition, to test the effect of sound. We classify the sounds into three classes: on-screen speech, non-speech, and no sound. Using statistical methods, we observe that the effect of sound differs depending on the sound class. We then compared the experimental data with a visual saliency model, which shows that adding sound to video decreases the prediction accuracy of a visual saliency model that lacks a sound pathway. Finally, the result of manually locating the coordinates of the sound source suggests a viable sound pathway for future work.