Interacting with target tracking algorithms in a gaze-enhanced motion video analysis system

Motion video analysis is a challenging task, particularly when real-time analysis is required. Providing suitable assistance to the human operator is therefore an important issue. Given that the use of customized video analysis systems is increasingly established, one supporting measure is to provide system functions that perform subtasks of the analysis. Recent progress in the development of automated image exploitation algorithms allows, e.g., real-time moving target tracking. Another supporting measure is a user interface that strives to reduce the perceptual, cognitive, and motor load of the human operator, for example by incorporating the operator’s visual focus of attention. A gaze-enhanced user interface can help here. This work extends prior work on automated target recognition, segmentation, and tracking algorithms as well as on the benefits of a gaze-enhanced user interface for interaction with moving targets. We also propose a prototypical system design that aims to combine the qualities of the human observer’s perception with those of the automated algorithms in order to improve the overall performance of a real-time video analysis system. In this contribution, we address two novel issues in gaze-based interaction with target tracking algorithms. The first issue extends the gaze-based triggering of a target tracking process, e.g., investigating how best to relaunch tracking after track loss. The second issue addresses the initialization of tracking algorithms without motion segmentation, where the operator has to provide the system with the object’s image region in order to start the tracking algorithm.
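The two interaction issues named above can be illustrated with a minimal sketch. The following controller is hypothetical (class and method names are ours, not from the paper, and the tracker itself is abstracted away): a confirmed gaze fixation seeds the tracker with an image region centred on the gaze point, which covers initialization without motion segmentation, and on track loss the tracker is relaunched from the operator's current gaze position.

```python
from dataclasses import dataclass
from enum import Enum, auto


class State(Enum):
    IDLE = auto()
    TRACKING = auto()


@dataclass
class Box:
    """Axis-aligned image region (top-left corner, width, height)."""
    x: int
    y: int
    w: int
    h: int


class GazeTrackingController:
    """Hypothetical sketch of gaze-based tracker control.

    A gaze-triggered fixation initializes tracking with a fixed-size
    region centred on the gaze point; when the tracker reports a loss,
    tracking is relaunched from the current gaze position.
    """

    def __init__(self, box_size: int = 64):
        self.box_size = box_size
        self.state = State.IDLE
        self.box = None

    def _box_at(self, gx: int, gy: int) -> Box:
        # Centre a box_size x box_size region on the gaze point.
        half = self.box_size // 2
        return Box(gx - half, gy - half, self.box_size, self.box_size)

    def on_gaze_trigger(self, gx: int, gy: int) -> Box:
        # Operator confirms a fixation (e.g., via key press):
        # seed the tracker with a gaze-centred image region.
        self.box = self._box_at(gx, gy)
        self.state = State.TRACKING
        return self.box

    def on_tracker_update(self, ok: bool, box, gaze) -> Box:
        # Called once per frame with the tracker's result. On failure,
        # relaunch immediately from the operator's current gaze point.
        if ok:
            self.box = box
            return self.box
        gx, gy = gaze
        return self.on_gaze_trigger(gx, gy)
```

In a real system the fixed-size region would be handed to a generic appearance-based tracker (e.g., a correlation-filter tracker) as its initial bounding box; the relaunch policy is precisely the design variable the first issue investigates.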
