Conditioning in Dempster-Shafer Theory: Prediction vs. Revision

We recall two methods for conditioning belief functions due to Dempster: Dempster conditioning, which applies Bayesian conditioning to the plausibility function, and a second rule that performs a sensitivity analysis on a conditional probability. While the first is dedicated to revising a belief function, the second is tailored to prediction problems in which the belief function serves as a statistical model. We question the use of Dempster conditioning for prediction tasks in Smets' generalized Bayes theorem approach to modeling statistical evidence, and we propose a modified version of it that is more informative than the other conditioning rule.
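To make the contrast concrete, the following is a minimal Python sketch of the two standard conditioning rules mentioned above: Dempster conditioning, Pl(A|B) = Pl(A∩B)/Pl(B), and the sensitivity-analysis rule, which yields the lower and upper conditional probabilities induced by all probability measures compatible with the belief function. The frame, mass values, and function names are illustrative assumptions, not taken from the paper.

```python
# Illustrative sketch (assumed frame and mass values, not from the paper).
frame = frozenset({"a", "b", "c"})
m = {
    frozenset({"a"}): 0.3,
    frozenset({"a", "b"}): 0.4,
    frame: 0.3,
}

def bel(A, m):
    """Belief: total mass committed to focal sets included in A."""
    return sum(v for F, v in m.items() if F <= A)

def pl(A, m):
    """Plausibility: total mass of focal sets intersecting A."""
    return sum(v for F, v in m.items() if F & A)

def dempster_conditioning(A, B, m):
    """Dempster conditioning: Bayesian conditioning applied to the plausibility function."""
    return pl(A & B, m) / pl(B, m)

def sensitivity_conditioning(A, B, m):
    """Lower/upper conditional probabilities obtained by sensitivity analysis
    over the probabilities compatible with (bel, pl)."""
    comp = frame - A
    lower = bel(A & B, m) / (bel(A & B, m) + pl(comp & B, m))
    upper = pl(A & B, m) / (pl(A & B, m) + bel(comp & B, m))
    return lower, upper

A = frozenset({"a"})
B = frozenset({"a", "b"})
print("Dempster conditioning  Pl(A|B) =", dempster_conditioning(A, B, m))
print("Sensitivity analysis   [P_(A|B), P^(A|B)] =", sensitivity_conditioning(A, B, m))
```

On this example the sensitivity-analysis rule returns an interval rather than a single value, which illustrates the informativeness gap discussed in the abstract.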
