Bridging the gap between eye tracking and crowdsourcing

Visual attention is a key feature of the human visual system (HVS). Every day, when watching videos, viewing images, or browsing the Internet, people are confronted with more information than they can process, and they analyze only part of what is in front of them. In parallel, crowdsourcing has become a particularly active topic, allowing subjective experiments to be scaled to a large crowd that is diverse in terms of nationality, social background, age, etc. This paper describes a novel framework that aims to bridge these two fields by providing a new way of measuring the user's experience in a subjective crowdsourcing experiment. The study goes beyond self-reported methods and provides a new kind of information in the crowdsourcing context: visual attention. The results show that visual attention can be estimated in a non-intrusive manner, without self-reported methods or specialized equipment, with a precision of 14.1% on the horizontal axis and 17.9% on the vertical axis. This accuracy is sufficient for many kinds of measurements that can only be carried out efficiently in non-controlled environments.
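
To make the idea of non-intrusive, equipment-free attention estimation concrete, the following Python sketch shows one way such a measurement could be approximated on a participant's own webcam using standard OpenCV Haar cascades. This is only an illustrative assumption, not the authors' framework: the cascade files, the offset heuristic, and the function name estimate_gaze_offset are all hypothetical choices made for the example.

# Minimal sketch (not the paper's pipeline): a crude webcam-based proxy for
# visual attention using standard OpenCV Haar cascades. The offset heuristic
# below is an illustrative assumption, not the authors' method.
import cv2

# Pre-trained cascades shipped with the opencv-python package.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def estimate_gaze_offset(frame):
    """Return (dx, dy): normalized eye-center offset within the face box.

    Values near 0 suggest the participant is looking roughly straight ahead;
    the sign indicates a left/right and up/down bias. This is only a coarse
    proxy for attention, not a calibrated gaze estimate.
    """
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]                      # use the first detected face
    roi = gray[y:y + h, x:x + w]
    eyes = eye_cascade.detectMultiScale(roi, scaleFactor=1.1, minNeighbors=5)
    if len(eyes) == 0:
        return None
    # Average the eye-box centers and express them relative to the face center.
    cx = sum(ex + ew / 2 for ex, ey, ew, eh in eyes) / len(eyes)
    cy = sum(ey + eh / 2 for ex, ey, ew, eh in eyes) / len(eyes)
    dx = (cx - w / 2) / w
    dy = (cy - h / 2) / h
    return dx, dy

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)                  # participant's webcam
    ok, frame = cap.read()
    cap.release()
    if ok:
        print(estimate_gaze_offset(frame))

In a crowdsourcing setting, such per-frame offsets would typically be logged alongside the stimulus timeline in the browser-based task, which is the kind of measurement the paper argues can only be run at scale outside the laboratory.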
