Joint analysis of simultaneous EEG and eye tracking data for video images

Purpose: Technological innovation has made it possible to examine how a film cues particular reactions in its viewers. The purpose of this paper is to capture and interpret visual perception and attention through the simultaneous use of eye tracking and electroencephalography (EEG).

Design/methodology/approach: The authors developed a method for the joint analysis of EEG and eye-tracking data. To this end, an algorithm was implemented that captures and interprets visual perception and attention from the two simultaneously recorded signals. All parameters were measured as a function of the relationship between the tested signals, which in turn allowed hypotheses to be validated more accurately through appropriately selected calculations.

Findings: The results of this study revealed a coherence between the EEG and eye-tracking signals that is of particular relevance for human perception.

Practical implications: Eye tracking provides a powerful real-time measure of the viewer's region of interest, while EEG provides data on the viewer's emotional state while watching the film; together, they support the capture and interpretation of visual perception and attention.

Originality/value: The approach in this paper is distinct from similar studies in that it integrates eye-tracking and EEG technologies, and it provides a method for building a fully functional video-introspection system.
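The abstract does not specify how the relationship between the two signals is quantified. As a minimal sketch of one standard approach, the snippet below estimates magnitude-squared coherence between an EEG channel and an eye-tracking feature using scipy.signal.coherence; the sampling rate, choice of signals (gaze velocity, alpha band), and window length are illustrative assumptions, not the authors' pipeline.

```python
# Minimal sketch (not the authors' method): estimate spectral coherence
# between an EEG channel and a simultaneously recorded eye-tracking trace.
# Assumption (hypothetical): both signals are already time-aligned and
# resampled to a common rate fs.
import numpy as np
from scipy.signal import coherence

fs = 256.0                      # common sampling rate in Hz (assumed)
t = np.arange(0, 60, 1 / fs)    # 60 s of simultaneous recording

# Placeholder signals; replace with real EEG (e.g., an occipital channel)
# and an eye-tracking feature such as gaze velocity or pupil diameter.
eeg = np.random.randn(t.size)
gaze_velocity = np.random.randn(t.size)

# Magnitude-squared coherence per frequency bin: values near 1 indicate
# a strong linear relationship between the two signals at that frequency.
freqs, Cxy = coherence(eeg, gaze_velocity, fs=fs, nperseg=int(2 * fs))

# Mean coherence in the alpha band (8-12 Hz), a band often associated
# with visual attention, as one summary of EEG/eye-tracking coupling.
alpha = (freqs >= 8) & (freqs <= 12)
print(f"Mean alpha-band coherence: {Cxy[alpha].mean():.3f}")
```

In practice, the two recordings would first have to be synchronized on shared trigger events and resampled to the common rate assumed here, since eye trackers and EEG amplifiers typically run at different sampling frequencies.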
