Using eye movement data to infer human behavioral intentions

Behavior-directed intentions can be revealed by biological signals that precede behavior. This study used eye movement data to infer human behavioral intentions. Participants viewed pictures while operating under different intentions that required cognitive search or affective appraisal. Intentions regarding the pictures were either non-specific or specific; specific intentions were either cognitive or affective; and affective intentions were to evaluate either the positive or the negative emotions expressed by the individuals depicted. The affective task group made more fixations and had a larger average pupil size than the cognitive task group, and the positive appreciation group made more and, on average, shorter fixations than the negative appreciation group. However, support vector machine classifiers achieved only low classification accuracy, owing to large inter-individual variance and the psychological factors underlying intentions. Classification accuracy improved when individual repeated-measures data were used, which helped infer participants' self-selected intentions.

Highlights:
- We study eye movement patterns to understand human behavior-directed intention.
- Human intention can be classified into non-specific and specific intention.
- Specific intention can be classified into cognitive and affective intention.
- Affective intention can be classified into positively and negatively appreciating intention.
- An agent's behavior-directed intention can be inferred from his or her eye movement patterns.
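Since the study reports support vector machine classification over summary eye-movement features, a minimal sketch of such a pipeline may help make the setup concrete. This is an illustrative assumption, not the authors' implementation: the feature set (fixation count, mean fixation duration, mean pupil size), the synthetic data, and all parameter values below are hypothetical.

```python
# Hypothetical sketch: classifying intention type (cognitive vs. affective)
# from summary eye-movement features with an SVM. The data here are
# synthetic placeholders, not the study's dataset.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# One row per trial: [fixation count, mean fixation duration (ms),
# mean pupil size (mm)]; label 0 = cognitive intention, 1 = affective.
X = rng.normal(loc=[20.0, 250.0, 3.5], scale=[5.0, 60.0, 0.4], size=(80, 3))
y = rng.integers(0, 2, size=80)

# Standardization matters for RBF-kernel SVMs, since fixation counts,
# durations, and pupil sizes live on very different scales.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))

# Cross-validated accuracy on pooled trials. To exploit repeated measures
# as the study suggests, one such model could be fit per participant.
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```

Fitting one model per participant, rather than pooling across participants, is one way to sidestep the inter-individual variance the abstract identifies as a limiting factor.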
