Bayesian estimation of a decision using information theory

The problem of revising a decision maker's information on the basis of information supplied by expert sources is considered. The basic model assumes that the information of both the decision maker and the expert sources takes the form of probability mass functions. The modeling approach is Bayesian estimation, which relies on Kullback entropy and Shannon entropy to measure information and produces a unique solution. The model incorporates not only information about the statistical dependence of the expert sources but also information about the quality and importance of the individual sources, expressed as a rank ordering. The results show that the effects of dependence and rank ordering of the expert sources on the final decision cannot, in general, be isolated. In a special case where this isolation is possible, the effect of rank ordering decreases as the correlation coefficient increases from -1 to +1, and the effect of correlation never exceeds the effect of rank ordering. A sensitivity analysis explores further properties of the model related to the influence of the decision maker and the expert sources. Extensions of the basic model to group decision making, group consensus, and mean value information are presented.
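
To make the style of aggregation rule concrete, the following minimal sketch (in Python) combines expert probability mass functions through a weighted logarithmic (geometric) opinion pool, the pooling form that minimizes a weighted sum of Kullback entropies to the expert distributions. The weights, which stand in for a rank ordering of the sources, and the example pmfs are illustrative assumptions, not the paper's exact solution.

import numpy as np

def log_opinion_pool(pmfs, weights):
    # Weighted logarithmic (geometric) opinion pool.
    # pmfs    : (k, n) array; each row is one expert's probability mass
    #           function over the same n outcomes.
    # weights : (k,) nonnegative weights summing to 1; here they stand in
    #           for a rank ordering of the experts (illustrative choice).
    # The pooled pmf q is proportional to prod_i p_i ** w_i, which
    # minimizes sum_i w_i * KL(q || p_i), a standard minimum
    # cross-entropy aggregate.
    pmfs = np.asarray(pmfs, dtype=float)
    w = np.asarray(weights, dtype=float)
    pooled = np.prod(pmfs ** w[:, None], axis=0)  # geometric combination
    return pooled / pooled.sum()                  # renormalize to a pmf

# Two experts over three outcomes; the higher-ranked expert gets more weight.
experts = [[0.6, 0.3, 0.1],
           [0.2, 0.5, 0.3]]
rank_weights = [0.7, 0.3]  # hypothetical rank-derived weights
print(log_opinion_pool(experts, rank_weights))

Under this pooling form, shifting weight toward the higher-ranked expert pulls the pooled pmf toward that expert's distribution; modeling statistical dependence between the experts, as the paper does, requires additional structure beyond this independent-weight sketch.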
