Learning and Updating of Uncertainty in Dirichlet Models

In this paper we analyze the problem of learning and updating of uncertainty in Dirichlet models, where updating refers to determining the conditional distribution of a single variable when some evidence is known. We first obtain the most general family of prior-posterior distributions that is conjugate to a Dirichlet likelihood, and we identify those hyperparameters that are influenced by the data values. Next, we describe some methods for assessing the prior hyperparameters, and we give a numerical method, based on the posterior mode, for estimating the Dirichlet parameters in a Bayesian context. We also give formulas for updating uncertainty, that is, for determining the conditional probabilities of single variables when the values of other variables are known. A time-series approach is presented for dealing with cases in which the samples are not identically distributed, that is, in which the Dirichlet parameters change from sample to sample; this typically occurs when the population is observed at different times. Finally, two examples illustrate the learning and updating processes and the time-series approach.
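The conjugate learning step described above can be sketched in a few lines. The following is a minimal illustration (not the paper's implementation; function names and the sample data are hypothetical): a Dirichlet prior combined with multinomial counts yields a Dirichlet posterior whose hyperparameters are the prior values plus the observed counts, and the posterior mode gives a point estimate of the category probabilities.

```python
# Minimal sketch of Dirichlet-multinomial conjugate updating.
# Names and numbers are illustrative, not taken from the paper.

def dirichlet_posterior(alpha, counts):
    """Conjugacy: posterior hyperparameters are prior alpha plus counts."""
    return [a + n for a, n in zip(alpha, counts)]

def posterior_mode(alpha):
    """Mode of Dirichlet(alpha), defined when every alpha_i > 1:
    mode_i = (alpha_i - 1) / (sum(alpha) - K)."""
    s, k = sum(alpha), len(alpha)
    return [(a - 1) / (s - k) for a in alpha]

def predictive_prob(alpha):
    """Posterior mean, i.e. the predictive probability of each category."""
    s = sum(alpha)
    return [a / s for a in alpha]

prior = [2.0, 2.0, 2.0]    # symmetric Dirichlet prior (hypothetical)
counts = [10, 3, 7]        # observed multinomial counts (hypothetical)
post = dirichlet_posterior(prior, counts)
mode = posterior_mode(post)
mean = predictive_prob(post)
```

The posterior-mode estimate used here corresponds to the Bayesian point estimate mentioned in the abstract; the posterior mean is the standard predictive probability under the Dirichlet-multinomial model.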
