Bayesian Morphology: Fast Unsupervised Bayesian Image Analysis

We consider the problems of image segmentation and classification, and of image restoration when the true image is made up of a small number of (unordered) colors. Our emphasis is on both performance and speed; speed has become increasingly important for analyzing large images and multispectral images with many bands, processing large image databases, real-time or near real-time image analysis, and the online analysis of video. Bayesian image analysis (Geman and Geman 1984) provides an elegant solution to these problems, but it is computationally expensive, and the solutions it provides may be sensitive to unrealistic global properties of the models on which it is based. The ICM algorithm (Besag 1986) is faster and based on the local properties of the models underlying Bayesian image analysis; parameter estimation is performed iteratively via pseudo-likelihood. Mathematical morphology (Matheron 1975) is faster still and is widely considered to perform well, but it lacks a statistical basis; method selection (analogous to parameter estimation) is done in a rather ad hoc manner. We propose Bayesian morphology, a synthesis of these methods that attempts to combine the speed of mathematical morphology with the principled statistical basis of ICM. The key observation is that when the original image is discrete (or if an initial segmentation has been carried out), then, assuming a Potts model for the true scene and channel transmission noise, (1) the ICM algorithm is equivalent to a form of mathematical morphology; and (2) the segmentation is insensitive to the precise values of the model parameters. Unlike in standard Bayesian image analysis and ICM, it is feasible to do maximum likelihood estimation of the parameters in this setting. For grey-level or multispectral images, we propose an initial segmentation based on the EM algorithm for a mixture model of the marginal distribution of the pixels.
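As a rough illustration of the initial-segmentation step (a sketch, not the paper's code), the following fits a two-component one-dimensional Gaussian mixture to pixel grey levels by EM and returns the hard classification; the function name, initialization, and two-class restriction are assumptions for illustration.

```python
# Illustrative sketch (not the paper's implementation): EM for a
# two-component 1-D Gaussian mixture of pixel grey levels; the hard
# classification it returns serves as an initial segmentation.
import math
import statistics

def em_mixture(pixels, n_iter=50):
    """Fit a 2-component Gaussian mixture by EM; return hard labels (0/1)."""
    x = list(pixels)
    xs = sorted(x)
    mu = [xs[len(xs) // 4], xs[(3 * len(xs)) // 4]]   # quartile initialization
    sd = [statistics.pstdev(x) + 1e-6] * 2
    w = [0.5, 0.5]
    resp = []
    for _ in range(n_iter):
        # E-step: posterior probability that each pixel came from each component
        resp = []
        for xi in x:
            d = [w[k] / sd[k] * math.exp(-0.5 * ((xi - mu[k]) / sd[k]) ** 2)
                 for k in range(2)]
            s = d[0] + d[1]
            resp.append([d[0] / s, d[1] / s])
        # M-step: re-estimate weights, means, and standard deviations
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(x)
            mu[k] = sum(r[k] * xi for r, xi in zip(resp, x)) / nk
            sd[k] = math.sqrt(sum(r[k] * (xi - mu[k]) ** 2
                                  for r, xi in zip(resp, x)) / nk) + 1e-6
    return [0 if r[0] >= r[1] else 1 for r in resp]
```

The paper's multispectral case would replace the one-dimensional densities with multivariate ones, but the E- and M-steps have the same shape.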
The resulting algorithm is much faster than ICM, with gains that increase for more bands and larger images, and has good performance in experiments and for real examples.
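To make the claimed equivalence concrete, here is a minimal sketch (not the paper's code) of one ICM sweep for a binary image under a Potts prior with interaction parameter beta and channel-flip noise with flip probability p. Because the local update compares a log-likelihood term against a neighbor-agreement term, it keeps the observed color unless enough 4-neighbors disagree, i.e. it behaves as a morphological majority (median) filter; the function name and synchronous update scheme are assumptions for illustration.

```python
# Illustrative sketch (not the paper's implementation): one ICM sweep
# for a binary image under a Potts prior and channel-flip noise.
import math

def icm_sweep(observed, current, beta, p):
    """One synchronous ICM update; images are lists of lists of 0/1."""
    h, w = len(current), len(current[0])
    out = [row[:] for row in current]
    for i in range(h):
        for j in range(w):
            nbrs = [current[i2][j2]
                    for i2, j2 in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                    if 0 <= i2 < h and 0 <= j2 < w]
            best, best_score = 0, -math.inf
            for c in (0, 1):
                # log-likelihood of the observed pixel given true color c
                loglik = math.log(1 - p) if observed[i][j] == c else math.log(p)
                # Potts prior term: reward agreement with neighbors
                score = loglik + beta * sum(1 for n in nbrs if n == c)
                if score > best_score:
                    best, best_score = c, score
            out[i][j] = best
    return out
```

Because the decision depends only on the observed color and the count of agreeing neighbors, the sweep can be tabulated as a fixed neighborhood rule, which is what allows a fast morphological implementation.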

[1] G. Celeux et al., Regularized Gaussian Discriminant Analysis through Eigenvalue Decomposition, 1996.

[2] A. Raftery et al., Model-based Gaussian and Non-Gaussian Clustering, 1993.

[3] J. Besag, Statistical Analysis of Non-Lattice Data, 1975.

[4] Charles J. Geyer et al., Reweighting Monte Carlo Mixtures, 1991.

[5] Xavier Descombes et al., Fine Structures Preserving Markov Model for Image Processing, 1995.

[6] F. Papangelou, Gibbs Measures and Phase Transitions (de Gruyter Studies in Mathematics 9), 1990.

[7] H. Heijmans, Morphological Image Operators, 1994.

[8] R. H. Swendsen and J.-S. Wang, Nonuniversal Critical Dynamics in Monte Carlo Simulations, 1987, Physical Review Letters.

[9] Stuart Geman and Donald Geman, Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images, 1984, IEEE Transactions on Pattern Analysis and Machine Intelligence.

[10] J. Besag, On the Statistical Analysis of Dirty Pictures, 1986.

[11] John K. Goutsias, Unilateral Approximation of Gibbs Random Field Images, 1991, CVGIP: Graphical Models and Image Processing.

[12] Azriel Rosenfeld et al., Scene Labeling by Relaxation Operations, 1976, IEEE Transactions on Systems, Man, and Cybernetics.

[13] Jean Serra, Image Analysis and Mathematical Morphology, 1983.

[14] Basilis Gidas et al., Parameter Estimation for Gibbs Distributions from Partially Observed Data, 1992.

[15] Hideki Noda et al., Blind Restoration of Degraded Binary Markov Random Field Images, 1996, CVGIP: Graphical Models and Image Processing.

[16] J. Besag et al., Bayesian Image Restoration, with Two Applications in Spatial Statistics, 1991.

[17] W. Qian et al., Stochastic Relaxations and EM Algorithms for Markov Random Fields, 1992.

[18] A. Raftery et al., Detecting Features in Spatial Point Processes with Clutter via Model-based Clustering, 1998.

[19] Hans-Otto Georgii, Gibbs Measures and Phase Transitions, 1988.

[20] Håkon Tjelmeland et al., Markov Random Fields with Higher-order Interactions, 1998.

[21] C. Geyer et al., Constrained Monte Carlo Maximum Likelihood for Dependent Data, 1992.

[22] Josiane Zerubia et al., Estimation of Markov Random Field Prior Parameters Using Markov Chain Monte Carlo Maximum Likelihood, 1999, IEEE Transactions on Image Processing.

[23] Gérard Govaert et al., Gaussian Parsimonious Clustering Models, 1995, Pattern Recognition.

[24] Robert G. Fovell et al., Consensus Clustering of U.S. Temperature and Precipitation Data, 1997.

[25] G. Matheron, Random Sets and Integral Geometry, 1976.

[26] C. Ji et al., A Consistent Model Selection Procedure for Markov Random Fields Based on Penalized Pseudolikelihood, 1996.