Maximum Entropy Inference and Stimulus Generalization
Abstract Maximum entropy inference is a method for estimating a probability distribution from limited information expressed in terms of the moments of that distribution. This paper presents a maximum entropy characterization of Shepard's theory of generalization. Shepard's theory assumes that an object has an important consequence for an individual only if it falls within a connected set, called the consequential region, in the individual's representational space. This assumption yields a generalization probability that decays exponentially with an appropriate psychological distance metric (either city-block or Euclidean, depending on the correlational structure among the extensions of the consequential region along the stimulus dimensions). In this note we show that a generalization function similar to the one derived by Shepard (1987) can be obtained by applying maximum entropy inference to limited information about the interstimulus distance between two objects sharing a common consequence. In particular, we show that the different shapes of equal-generalization contours can be interpreted as optimal utilization, in the maximum entropy sense, of the correlational structure of the stimulus dimensions, paralleling the explanation given by Shepard's theory.
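To make the maximum entropy step concrete, the following is a minimal sketch of the standard calculation under an assumed constraint on the mean interstimulus distance; the symbols mu and lambda are illustrative and are not taken from the paper. Maximizing the entropy of a distance distribution p(d) on [0, infinity) subject to normalization and a fixed mean,

\[
\max_{p}\; H[p] = -\int_{0}^{\infty} p(d)\,\ln p(d)\,\mathrm{d}d
\quad\text{s.t.}\quad
\int_{0}^{\infty} p(d)\,\mathrm{d}d = 1,
\qquad
\int_{0}^{\infty} d\,p(d)\,\mathrm{d}d = \mu,
\]

gives, via Lagrange multipliers, the exponential form

\[
p(d) \;\propto\; e^{-\lambda d}, \qquad \lambda = 1/\mu,
\]

which is the exponential decay of generalization with psychological distance that Shepard's theory predicts.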
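As a hypothetical numerical illustration (not code from the paper; the function generalization, the rate parameter lam, and the probe points are assumptions), the Python snippet below evaluates an exponential generalization gradient under both metrics, showing that equal-generalization contours are diamonds under the city-block metric and circles under the Euclidean metric:

    import numpy as np

    def generalization(x, x0, metric="city_block", lam=1.0):
        # Exponential generalization gradient g = exp(-lam * d(x, x0)).
        # metric: "city_block" (separable dimensions) or "euclidean" (integral dimensions).
        diff = np.abs(np.asarray(x, dtype=float) - np.asarray(x0, dtype=float))
        d = diff.sum() if metric == "city_block" else np.sqrt((diff ** 2).sum())
        return np.exp(-lam * d)

    # Three probe points at the same city-block distance (1.0) from the origin:
    x0 = (0.0, 0.0)
    for p in [(1.0, 0.0), (0.5, 0.5), (0.0, 1.0)]:
        g_cb = generalization(p, x0, "city_block")
        g_eu = generalization(p, x0, "euclidean")
        print(p, round(g_cb, 3), round(g_eu, 3))

All three probe points yield the same generalization value under the city-block metric (they lie on one diamond-shaped contour), while under the Euclidean metric the diagonal point (0.5, 0.5) is closer and hence generalizes more, reflecting the dependence of contour shape on how distances combine across dimensions.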