Slow and Smooth: A Bayesian theory for the combination of local motion signals in human vision

In order to estimate the motion of an object, the visual system needs to combine multiple local measurements, each of which carries some degree of ambiguity. We present a model of motion perception whereby measurements from different image regions are combined according to a Bayesian estimator: the estimated motion maximizes the posterior probability, assuming a prior favoring slow and smooth velocities. In reviewing a large number of previously published phenomena, we find that the Bayesian estimator predicts a wide range of psychophysical results. This suggests that the seemingly complex set of illusions arises from a single computational strategy that is optimal under reasonable assumptions.

Copyright © Massachusetts Institute of Technology, 1998. This report describes research done at the Center for Biological and Computational Learning and the Department of Brain and Cognitive Sciences of the Massachusetts Institute of Technology. Support for the Center is provided in part by a grant from the National Science Foundation under contract ASC-9217041. The work was also supported by NEI grant R01 EY11005 to E. H. Adelson.
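To make the estimator concrete, the following is a minimal numerical sketch (in Python/NumPy) of a Bayesian velocity estimate for a single translating patch: each local measurement contributes a gradient constraint with Gaussian noise, and a zero-mean Gaussian prior on velocity plays the role of the "slow" preference (the spatial smoothness term is omitted here). The parameter names and values (sigma_noise, sigma_prior, the simulated contrasts) are illustrative assumptions, not the report's own formulation or notation.

import numpy as np


def map_velocity(Ix, Iy, It, sigma_noise=0.05, sigma_prior=0.3):
    """MAP estimate of a single translational velocity (u, v).

    Each measurement supplies a noisy gradient constraint
    Ix*u + Iy*v + It ~ 0 (noise std sigma_noise); the prior on (u, v) is
    zero-mean Gaussian with std sigma_prior, i.e. it favors slow speeds.
    Both sigmas are illustrative values, not taken from the report.
    """
    G = np.stack([Ix, Iy], axis=1)                      # N x 2 spatial gradients
    # The posterior is Gaussian; its mode solves a ridge-regression system:
    # (G^T G / sn^2 + I / sp^2) v = -G^T It / sn^2.
    A = G.T @ G / sigma_noise**2 + np.eye(2) / sigma_prior**2
    b = -G.T @ It / sigma_noise**2
    return np.linalg.solve(A, b)


# A vertical grating drifting rightward at speed 1: all spatial gradients are
# horizontal (the aperture problem), so the estimate lies along the normal
# direction.  Lowering the contrast weakens the likelihood, and the slow
# prior pulls the estimated speed below the true value.
rng = np.random.default_rng(0)
for contrast in (0.5, 0.02):
    Ix = np.full(200, contrast)
    Iy = np.zeros(200)
    It = -Ix * 1.0 + 0.02 * rng.standard_normal(200)
    print(contrast, map_velocity(Ix, Iy, It))

With the high-contrast input the estimated horizontal speed stays close to 1; with the low-contrast input it is pulled appreciably toward zero, the same contrast-dependent slowing reported in several of the experiments the report reviews.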
