The Influence of Different AOI Models in Source Code Comprehension Analysis

In source code comprehension research, eye movement data are frequently analyzed to find patterns and test hypotheses. For such an analysis, fixations are often assigned to objects of a stimulus, such as tokens, code lines, or code regions, through areas of interest (AOIs). There are multiple ways to define an AOI, and the choice can have a broad impact on the analysis of the eye movement data. In this paper, we analyze the impact of choosing between two different AOI models, selected as methodological extremes. We propose one AOI model and compare it to the AOI model from the EMIP dataset. Within the data, we found indications that one AOI model captures fewer AOI transitions than the other. A qualitative investigation showed that some of these AOI transitions can be important for understanding a participant's viewing and comprehension strategy. We therefore argue that every researcher should report the chosen AOI model and the particular AOI definitions used in a study, so that others can interpret and reproduce the results. Additionally, we suggest a simple algorithm to test which AOI model configuration captures the majority of data points.
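The paper's own algorithm is not reproduced here; the following is a minimal sketch of the underlying idea, assuming rectangular AOIs and fixations given as (x, y) coordinates. The `AOI`, `hit_rate`, and `prefer_model` names are illustrative, not from the paper.

```python
from dataclasses import dataclass

@dataclass
class AOI:
    """Axis-aligned rectangular area of interest in stimulus coordinates."""
    left: float
    top: float
    right: float
    bottom: float

    def contains(self, x: float, y: float) -> bool:
        return self.left <= x <= self.right and self.top <= y <= self.bottom


def hit_rate(fixations: list[tuple[float, float]], aois: list[AOI]) -> float:
    """Fraction of fixations that fall inside at least one AOI of the model."""
    if not fixations:
        return 0.0
    hits = sum(1 for (x, y) in fixations
               if any(a.contains(x, y) for a in aois))
    return hits / len(fixations)


def prefer_model(fixations, model_a: list[AOI], model_b: list[AOI]) -> str:
    """Pick the AOI model configuration that captures more data points."""
    return "A" if hit_rate(fixations, model_a) >= hit_rate(fixations, model_b) else "B"
```

Comparing the hit rates of two candidate AOI configurations on the same fixation data gives a quick, quantitative basis for choosing the model that discards the least information.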
