Mixtures of Gaussians and Minimum Relative Entropy Techniques for Modeling Continuous Uncertainties

Problems of probabilistic inference and decision making under uncertainty commonly involve continuous random variables. Often these are discretized to a few points, to simplify assessments and computations. An alternative approximation is to fit analytically tractable continuous probability distributions. This approach has potential simplicity and accuracy advantages, especially if variables can be transformed first. This paper shows how a minimum relative entropy criterion can drive both transformation and fitting, illustrating with a power and logarithm family of transformations and mixtures of Gaussian (normal) distributions, which allow use of efficient influence diagram methods. The fitting procedure in this case is the well-known EM algorithm. The selection of the number of components in a fitted mixture distribution is automated with an objective that trades off accuracy and computational cost.
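The abstract's fitting step — estimating a Gaussian mixture by the EM algorithm — can be sketched in a few lines. The following is a minimal illustration for a one-dimensional mixture using only the Python standard library; the function name, component count, and initialization scheme are illustrative assumptions, not the paper's actual procedure (which also selects the number of components and applies a power/log transformation first).

```python
import math
import random

def em_gaussian_mixture(data, k=2, iters=200, seed=0):
    """Fit a k-component 1-D Gaussian mixture to `data` by EM.

    Returns (weights, means, variances). Illustrative sketch only:
    random initialization, fixed iteration count, no model selection.
    """
    rng = random.Random(seed)
    n = len(data)
    weights = [1.0 / k] * k                      # equal mixing weights
    means = rng.sample(data, k)                  # means at random data points
    grand_mean = sum(data) / n
    var0 = sum((x - grand_mean) ** 2 for x in data) / n
    variances = [var0] * k                       # shared initial variance

    for _ in range(iters):
        # E-step: responsibilities r[i][j] = P(component j | x_i)
        resp = []
        for x in data:
            dens = [
                w * math.exp(-(x - m) ** 2 / (2 * v)) / math.sqrt(2 * math.pi * v)
                for w, m, v in zip(weights, means, variances)
            ]
            s = sum(dens)
            resp.append([d / s for d in dens])
        # M-step: re-estimate parameters from the responsibilities
        for j in range(k):
            nj = sum(r[j] for r in resp)
            weights[j] = nj / n
            means[j] = sum(r[j] * x for r, x in zip(resp, data)) / nj
            variances[j] = sum(r[j] * (x - means[j]) ** 2
                               for r, x in zip(resp, data)) / nj
            variances[j] = max(variances[j], 1e-6)  # guard against collapse
    return weights, means, variances
```

On well-separated synthetic data (e.g. samples drawn around two distinct modes), the recovered means approach the true mode locations; each EM iteration is guaranteed not to decrease the data likelihood, which is why the algorithm is the standard choice for mixture fitting.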
