Comparing haptic, visual, and computational similarity-based maps of novel, 3D objects

Similarity has been proposed as an organizational principle for representing objects in the brain [1, 2]. But how does similarity vary as a function of perceptual modality? Here, we investigated this question by parametrically varying two object properties, shape and texture, gathering similarity ratings between pairs of objects, and using these ratings to obtain modality-specific stimulus maps. Comparing the map obtained from visual similarity ratings against the map obtained from haptic ratings revealed differences in the weighting of shape and texture across the two modalities. We then compared these perceptual maps against maps derived from several computational measures of similarity, to identify features or computations that may explain the perceptual similarities.
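The abstract does not specify how the stimulus maps are derived from the pairwise ratings; a standard approach for this kind of data is multidimensional scaling (MDS). As an illustration only, the sketch below implements classical (Torgerson) MDS with numpy, embedding objects into a low-dimensional map from a symmetric dissimilarity matrix. The example data (`pts`, `dissim`) are hypothetical stand-ins for rated object pairs, not the study's stimuli.

```python
import numpy as np

def classical_mds(dissim, n_dims=2):
    """Classical (Torgerson) MDS: embed objects in n_dims dimensions
    from a symmetric dissimilarity matrix with a zero diagonal."""
    n = dissim.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    B = -0.5 * J @ (dissim ** 2) @ J             # double-centered Gram matrix
    eigvals, eigvecs = np.linalg.eigh(B)
    order = np.argsort(eigvals)[::-1][:n_dims]   # keep the largest eigenvalues
    # Coordinates: eigenvectors scaled by sqrt of (non-negative) eigenvalues
    coords = eigvecs[:, order] * np.sqrt(np.clip(eigvals[order], 0.0, None))
    return coords

# Hypothetical example: 6 objects with latent 2D (e.g. shape/texture) values;
# their Euclidean distances play the role of pairwise dissimilarity ratings.
rng = np.random.default_rng(0)
pts = rng.random((6, 2))
dissim = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
coords = classical_mds(dissim, n_dims=2)
print(coords.shape)
```

Because the hypothetical dissimilarities here are exact Euclidean distances between 2D points, the recovered map matches the latent configuration up to rotation and reflection; real similarity ratings are noisier, and non-metric MDS variants are often used instead.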