Disentangling the contributions of grasp and action representations in the recognition of manipulable objects

There is increasing evidence that the action properties of manipulable objects can play a role in object recognition, as objects with similar action properties can facilitate each other’s recognition [Helbig et al. Exp Brain Res 174:221–228, 2006]. However, it is unclear whether this modulation is driven by the actions involved in using an object or by the grasps the object affords, because these factors have been confounded in previous studies. Here, we attempted to disentangle the relative contributions of action and grasp properties by using a priming paradigm in which action similarity and grasp similarity between two objects were varied orthogonally. We found that target tools with grasp properties similar to those of the prime tool were named more accurately than those with dissimilar grasps. However, naming accuracy was not affected by the similarity of action properties between the prime and target tools. This suggests that knowledge about how an object is used is not automatically accessed when identifying a manipulable object. Rather, what is automatically accessed are the transformations necessary to interact directly with the object, that is, the manner in which one grasps it.
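
To make the orthogonal manipulation concrete, the following is a minimal Python sketch (not the authors' materials or analysis code) of how prime–target pairs can be crossed on grasp similarity and action similarity, and how naming accuracy can be aggregated per cell of the resulting 2 × 2 design. All object pairings and trial outcomes in the sketch are hypothetical placeholders.

```python
# Minimal sketch of a 2 x 2 priming design: prime-target pairs are crossed on
# grasp similarity and action similarity, and naming accuracy is scored per cell.
# All item names and trial data are hypothetical placeholders, not the study's stimuli.

from itertools import product
from collections import defaultdict

# Hypothetical prime-target pairs illustrating the orthogonal manipulation.
# Each entry: (prime, target, grasp_similar, action_similar)
pairs = [
    ("screwdriver", "key",    True,  True),   # similar grasp, similar action
    ("screwdriver", "dart",   True,  False),  # similar grasp, dissimilar action
    ("screwdriver", "wrench", False, True),   # dissimilar grasp, similar action
    ("screwdriver", "hammer", False, False),  # dissimilar grasp, dissimilar action
]

# Hypothetical naming responses: 1 = target named correctly, 0 = naming error.
trials = [
    {"pair": pairs[0], "correct": 1},
    {"pair": pairs[1], "correct": 1},
    {"pair": pairs[2], "correct": 0},
    {"pair": pairs[3], "correct": 0},
]

# Aggregate naming accuracy in each cell of the 2 x 2 design.
cell_totals = defaultdict(lambda: [0, 0])  # (grasp_sim, action_sim) -> [n_correct, n_trials]
for trial in trials:
    _, _, grasp_sim, action_sim = trial["pair"]
    cell_totals[(grasp_sim, action_sim)][0] += trial["correct"]
    cell_totals[(grasp_sim, action_sim)][1] += 1

for grasp_sim, action_sim in product([True, False], repeat=2):
    n_correct, n = cell_totals[(grasp_sim, action_sim)]
    accuracy = n_correct / n if n else float("nan")
    print(f"grasp similar={grasp_sim!s:5}  action similar={action_sim!s:5}  accuracy={accuracy:.2f}")
```

The grasp-priming effect reported in the abstract would appear here as higher accuracy in the two grasp-similar cells, independent of whether the action-similar factor is true or false.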

[1] R. Ellis, et al. Action priming by briefly presented objects, 2004, Acta Psychologica.

[2] J. Hermsdörfer, et al. Grasping tools: Effects of task and apraxia, 2009, Neuropsychologia.

[3] G. Vingerhoets, et al. Conceptual and physical object qualities contribute differently to motor affordances, 2009, Brain and Cognition.

[4] F. Osiurak, et al. Grasping the affordances, understanding the reasoning: Toward a dialectical theory of human tool use, 2010, Psychological Review.

[5] I. M. Harris, et al. Repetition blindness reveals differences between the representations of manipulable and nonmanipulable objects, 2012, Journal of Experimental Psychology: Human Perception and Performance.

[6] J. Decety, et al. Does visual perception of object afford action? Evidence from a neuroimaging study, 2002, Neuropsychologia.

[7] F. Osiurak, et al. Different constraints on grip selection in brain-damaged patients: Object use versus object transport, 2008, Neuropsychologia.

[8] L. Buxbaum, et al. Distinctions between manipulation and function knowledge of objects: Evidence from functional magnetic resonance imaging, 2005, Brain Research: Cognitive Brain Research.

[9] A. Caramazza, et al. Unconscious processing dissociates along categorical lines, 2008, Proceedings of the National Academy of Sciences.

[10] D. G. Pelli, et al. ECVP '07 Abstracts, 2007, Perception.

[11] L. Buxbaum, et al. Knowledge of object manipulation and object function: Dissociations in apraxic and nonapraxic subjects, 2002, Brain and Language.

[12] R. Ellis, et al. Manual asymmetries in visually primed grasping, 2006, Experimental Brain Research.

[13] M. Kiefer, et al. Action observation can prime visual object recognition, 2009, Experimental Brain Research.

[14] Actions speak louder, 1979, Hospital Progress.

[15] G. Vingerhoets, et al. Neural correlates of pantomiming familiar and unfamiliar tools: Action semantics versus mechanical problem solving?, 2011, Human Brain Mapping.

[16] K. James. Review: The Ecological Approach to Visual Perception by James J. Gibson, 1981.

[17] G. Gigli, et al. Degraded semantic knowledge and accurate object use, 2007, Cortex.

[18] F. Osiurak, et al. Re-examining the gesture engram hypothesis: New perspectives on apraxia of tool use, 2011, Neuropsychologia.

[19] G. Goldenberg. Apraxia and the parietal lobes, 2009, Neuropsychologia.

[20] M. Brett, et al. Actions speak louder than functions: The importance of manipulability and action in tool representation, 2003, Journal of Cognitive Neuroscience.

[21] F. Osiurak, et al. Object utilization and object usage: A single-case study, 2008, Neurocase.

[22] J. Gibson. The Ecological Approach to Visual Perception, 1979.

[23] D. G. Pelli. The VideoToolbox software for visual psychophysics: Transforming numbers into movies, 1997, Spatial Vision.

[24] G. Króliczak, et al. A common network in the left cerebral hemisphere represents planning of tool use pantomimes and familiar intransitive gestures at the hand-independent level, 2009, Cerebral Cortex.

[25] S. Anderson, et al. Attentional processes link perception and action, 2002, Proceedings of the Royal Society of London, Series B: Biological Sciences.

[26] U. Noppeney. The neural systems of tool and action semantics: A perspective from functional imaging, 2008, Journal of Physiology-Paris.

[27] M. Graf, et al. The role of action representations in visual object recognition, 2006, Experimental Brain Research.

[28] G. Tieri, et al. Where does an object trigger an action? An investigation about affordances in space, 2010, Experimental Brain Research.

[29] L. Buxbaum. Ideomotor apraxia: A call to action, 2001, Neurocase.

[30] R. Ellis, et al. Micro-affordance: The potentiation of components of action by seen objects, 2000, British Journal of Psychology.

[31] D. H. Brainard. The Psychophysics Toolbox, 1997, Spatial Vision.

[32] B. Z. Mahon, et al. The role of the dorsal visual processing stream in tool identification, 2010, Psychological Science.

[33] W. Davis. The Ecological Approach to Visual Perception, 2012.

[34] A. Martin, et al. Representation of manipulable man-made objects in the dorsal stream, 2000, NeuroImage.

[35] A. Mack, et al. Potentiation of action by undetected affordant objects, 2008.