Effects of supervised and unsupervised categorization on visual and haptic object representations
The way we perceive objects can be shaped by how we learn to categorize them, for example by increasing the salience of diagnostic dimensions, but most evidence for this comes from unimodal studies in vision. Does category learning have comparable effects when objects are perceived by touch? In this study, subjects learned to categorize a set of 25 novel 3D objects that varied parametrically in shape and texture, using either vision or touch, and then provided similarity ratings on the objects. Multidimensional scaling (MDS) was used to construct a perceptual stimulus space and to estimate the relative importance of the shape and texture dimensions for each subject. The effects of categorization were quantified by comparing dimension weights when similarity ratings were performed 1) without prior categorization experience, 2) after an unsupervised categorization task, and 3) after a supervised task in which subjects learned groupings based on texture. In preliminary results (5 subjects per condition), the significant differences between unimodal dimension weights found in condition 1 (shape dominating for vision; shape and texture weighted equally for touch) were not observed in conditions 2 and 3. These findings suggest that both externally and internally guided categorization could serve as a mechanism for reducing initial differences in unimodal representations of novel objects.
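To illustrate the kind of analysis described above, the sketch below shows one way to derive dimension weights from similarity ratings: embed a precomputed dissimilarity matrix with metric MDS and then relate the recovered coordinates to the known parametric shape and texture levels. This is a minimal, hypothetical example; the 25-object design with five shape and five texture levels, the simulated ratings, and the regression-based weighting are assumptions for illustration and not the authors' exact procedure.

```python
import numpy as np
from sklearn.manifold import MDS
from sklearn.linear_model import LinearRegression

# Hypothetical inputs: a 25x25 matrix of pairwise similarity ratings
# (higher = more similar) for objects crossing 5 shape x 5 texture levels.
rng = np.random.default_rng(0)
shape_vals = np.repeat(np.arange(5), 5)      # parametric shape level per object
texture_vals = np.tile(np.arange(5), 5)      # parametric texture level per object
ratings = (10
           - np.abs(shape_vals[:, None] - shape_vals[None, :])
           - np.abs(texture_vals[:, None] - texture_vals[None, :])
           + rng.normal(0, 0.3, (25, 25)))
ratings = (ratings + ratings.T) / 2          # enforce symmetry

# Convert similarities to dissimilarities and embed in a 2D perceptual space.
dissim = ratings.max() - ratings
np.fill_diagonal(dissim, 0.0)
mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(dissim)

# Estimate how strongly each physical dimension predicts the recovered space:
# regress MDS coordinates on shape and texture levels and compare the summed
# magnitudes of the fitted coefficients (an illustrative proxy for
# per-subject dimension weights).
X = np.column_stack([shape_vals, texture_vals])
coefs = np.abs(LinearRegression().fit(X, coords).coef_)   # shape (2 dims, 2 predictors)
shape_w, texture_w = coefs[:, 0].sum(), coefs[:, 1].sum()
print(f"shape weight ~ {shape_w:.2f}, texture weight ~ {texture_w:.2f}")
```

In practice, comparing such weights across the three conditions (no categorization, unsupervised, supervised) would indicate whether category learning shifts the relative salience of shape and texture in each modality.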