Capturing contextual dependencies in medical imagery using hierarchical multi-scale models

In this paper we summarize our results for two classes of hierarchical multi-scale models that exploit contextual information for the detection of structure in mammographic imagery. The first model, the hierarchical pyramid neural network (HPNN), is a discriminative model that can integrate information either coarse-to-fine or fine-to-coarse for microcalcification and mass detection. The second model, the hierarchical image probability (HIP) model, captures short-range and contextual dependencies through a combination of coarse-to-fine factoring and a set of hidden variables. The HIP model, being a generative model, has broad utility, and we present results for classification, synthesis, and compression of mammographic mass images. The two models demonstrate the utility of the hierarchical multi-scale framework for computer-assisted detection and diagnosis.
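To make the coarse-to-fine context propagation concrete, the following is a minimal, illustrative sketch (not the authors' implementation) of the idea behind the HPNN: small per-level networks operate on an image pyramid, and the hidden-unit outputs at a coarse level are upsampled and supplied as extra context inputs to the network at the next finer level. The layer sizes, logistic nonlinearity, block-averaging pyramid, and random weights are assumptions made purely for illustration.

```python
# Illustrative sketch of HPNN-style coarse-to-fine context propagation.
# All architectural details here are assumptions, not the published model.
import numpy as np

rng = np.random.default_rng(0)

def gaussian_pyramid(img, levels=3):
    """Build a simple pyramid by 2x2 block averaging (a stand-in for a
    Gaussian pyramid); pyr[0] is the finest level, pyr[-1] the coarsest."""
    pyr = [img]
    for _ in range(levels - 1):
        h, w = pyr[-1].shape
        pyr.append(pyr[-1][:h - h % 2, :w - w % 2]
                   .reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3)))
    return pyr

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def hpnn_like_forward(pyr, n_hidden=4):
    """Run small per-pixel networks coarse-to-fine, passing the upsampled
    hidden-unit maps of each level as context inputs to the finer level."""
    context = None
    hidden = None
    for img in reversed(pyr):                      # coarsest -> finest
        h, w = img.shape
        if context is not None:                    # upsample coarse hidden units
            context = np.repeat(np.repeat(context, 2, axis=1), 2, axis=2)[:, :h, :w]
        n_ctx = 0 if context is None else context.shape[0]
        W = rng.normal(scale=0.1, size=(n_hidden, 1 + n_ctx))  # random demo weights
        inputs = img[None] if context is None else np.concatenate([img[None], context])
        hidden = sigmoid(np.einsum('hc,cij->hij', W, inputs))
        context = hidden
    # a final unit maps the finest-level hidden maps to a detection probability map
    w_out = rng.normal(scale=0.1, size=n_hidden)
    return sigmoid(np.einsum('h,hij->ij', w_out, hidden))

prob_map = hpnn_like_forward(gaussian_pyramid(rng.random((64, 64))))
print(prob_map.shape)  # (64, 64) pixel-wise "detection" probabilities
```

In the fine-to-coarse variant described in the abstract, the direction of propagation is simply reversed, with fine-level hidden units pooled and passed upward as context for the coarser networks.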
