Object Properties Influence Visual Guidance of Motor Actions

The dynamic nature of the real world poses challenges for predicting where best to allocate gaze during object interactions. The same object may require different visual guidance depending on its current or upcoming state. Here, we explore how object properties (material and shape) and object state (whether an object is full of liquid, or is to be set down in a crowded location) influence visual supervision while setting objects down, an element of object interaction that has been relatively neglected in the literature. In a liquid-pouring task, we asked participants to move empty glasses to a filling station; to leave them empty, half fill them, or completely fill them with water; and then move them again to a tray. During the first putdown, when the glasses were all empty, visual guidance was determined only by the type of glass being set down: the more unwieldy champagne flutes were more likely to be guided than other types of glasses. However, once the glasses had been filled, glass type no longer mattered; instead, material and fill level predicted whether the glasses were set down with visual supervision, with full containers made of glass more likely to be guided than empty plastic ones. The key finding is that the visual system responds flexibly to dynamic changes in object properties, likely based on predictions of the risk associated with setting an object down unsupervised by vision. The factors that govern these mechanisms can vary within the same object as it changes state.
