Using synchronized eye and motion tracking to determine high-precision eye-movement patterns during object-interaction tasks.

This study explores the role that vision plays in sequential object interactions. We used a head-mounted eye tracker and upper-limb motion capture to quantify visual behavior while participants performed two standardized functional tasks. Because the eye and motion data were recorded simultaneously, we could segment the visual data precisely using the movement data, yielding a consistent, functionally well-resolved data set of real-world object-interaction tasks. Our results show that participants spend nearly the full duration of a trial fixating objects relevant to the task; they spend little time fixating their own hand when reaching toward an object, and only slightly more time (though still very little) fixating the object in their hand while transporting it. A consistent spatial and temporal pattern of fixations was found across participants. In brief, participants fixate an object to be picked up at least half a second before the hand arrives at it and remain fixated on the object until they begin to transport it, at which point they shift their gaze directly to the object's drop-off location, where they remain fixated until the object is successfully released. This pattern provides additional evidence of a common system for the integration of vision and object interaction in humans, and is consistent with theoretical frameworks that hypothesize the distribution of attention to future action targets as part of eye- and hand-movement preparation. Our results thus further the understanding of how visual attention is allocated during the planning of object interactions both inside and outside the field of view.
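
The movement-based segmentation can be sketched as follows. This is a minimal illustration, not the authors' actual pipeline: hand speed derived from the motion capture is thresholded to find movement onset and offset, and the synchronized gaze samples are then labeled by movement phase. The function names, the 100 Hz shared sampling rate, and the 0.05 m/s speed threshold are all illustrative assumptions.

```python
import numpy as np

FS = 100.0               # shared sampling rate in Hz (assumed, after resampling)
SPEED_THRESHOLD = 0.05   # m/s; hand counts as "moving" above this (assumed)

def hand_speed(positions):
    """Frame-to-frame hand speed (m/s) from an (N, 3) array of positions."""
    velocity = np.diff(positions, axis=0) * FS
    speed = np.linalg.norm(velocity, axis=1)
    return np.concatenate([[0.0], speed])      # pad back to length N

def movement_phases(positions):
    """Label each frame 0 (hand still) or 1 (hand moving) by thresholding speed."""
    return (hand_speed(positions) > SPEED_THRESHOLD).astype(int)

def segment_gaze(gaze_targets, phases):
    """Split synchronized gaze-target labels into contiguous movement phases.

    gaze_targets: length-N labels per frame, e.g. 'object', 'hand', 'dropoff'
    phases:       length-N array from movement_phases()
    Returns a list of (phase, {gaze label: frame count}) per contiguous run,
    from which per-phase fixation proportions can be computed.
    """
    segments = []
    start = 0
    for i in range(1, len(phases) + 1):
        if i == len(phases) or phases[i] != phases[start]:
            labels, counts = np.unique(gaze_targets[start:i], return_counts=True)
            segments.append((int(phases[start]), dict(zip(labels, counts.tolist()))))
            start = i
    return segments

# Toy usage: 1 s with the hand still, then 1 s of reaching, with gaze on the
# target object throughout; prints one 'still' and one 'moving' segment.
hand = np.concatenate([np.zeros((100, 3)),
                       np.cumsum(np.full((100, 3), 0.002), axis=0)])
gaze = np.array(['object'] * 200)
print(segment_gaze(gaze, movement_phases(hand)))
```

Thresholding raw speed is the simplest possible event detector; a real pipeline would typically smooth the speed signal and enforce a minimum phase duration before accepting a segment boundary.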
