Evidence Combination Based on Credal Belief Redistribution for Pattern Classification

Evidence theory, also called belief function theory, provides an efficient tool for representing and combining uncertain information in pattern classification. In some applications, evidence combination can be interpreted as classifier fusion. The sources of evidence corresponding to multiple classifiers usually exhibit different classification qualities, and they are often discounted with different weights before combination. To achieve the best possible fusion performance, a new credal belief redistribution (CBR) method is proposed to revise such evidence. The rationale of CBR is to transfer belief from one class not only to other singleton classes but also to the associated disjunctions of classes (i.e., meta-classes). Because a classifier's accuracy can also vary from object to object, each piece of evidence is revised according to prior knowledge mined from its training neighbors. If the selected neighbors are relatively close to the evidence, a large amount of belief is discounted for redistribution; otherwise, only a small fraction of belief enters the redistribution procedure. An imprecision matrix estimated from these neighbors is then used to redistribute the discounted belief. This matrix expresses the likelihood of misclassification, i.e., the probability that a test pattern belongs to a class different from the one assigned to it by the classifier. In CBR, the discounted belief is divided into two parts: one part is transferred between singleton classes, while the other is cautiously committed to the associated meta-classes. Modeling partial imprecision in this way efficiently reduces the chance of misclassification. The revised pieces of evidence are finally combined with the Dempster–Shafer rule to reduce uncertainty and further improve classification accuracy. The effectiveness of CBR is extensively validated on several real datasets from the UCI repository and critically compared with that of other related fusion methods.
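
To make the pipeline concrete, the Python sketch below illustrates the two core operations described above: discounting and redistributing the belief produced by a classifier, and then combining the revised mass functions with the Dempster–Shafer rule. It is a minimal illustration, not the paper's exact CBR formulation: the fixed discount factor `beta`, the even split of the discounted mass between the competing singleton and its meta-class, and the toy imprecision matrix are all assumptions introduced here for the example.

```python
def dempster_combine(m1, m2):
    """Combine two mass functions (dict: frozenset -> mass) with Dempster's rule."""
    combined, conflict = {}, 0.0
    for b, mb in m1.items():
        for c, mc in m2.items():
            inter = b & c
            if inter:
                combined[inter] = combined.get(inter, 0.0) + mb * mc
            else:
                conflict += mb * mc  # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence cannot be combined")
    return {a: v / (1.0 - conflict) for a, v in combined.items()}


def redistribute(m, imprecision, beta=0.3):
    """Discount a fraction `beta` of each singleton mass and redistribute it.

    `imprecision[i][j]` plays the role of the imprecision matrix: an estimate of
    the probability that a pattern labeled i actually belongs to j. Half of each
    redistributed share goes to the other singleton {j}, and half is cautiously
    committed to the meta-class {i, j}.
    """
    revised = {a: (1.0 - beta) * v for a, v in m.items()}
    for a, v in m.items():
        (i,) = tuple(a)  # the input evidence is assumed Bayesian (singletons only)
        others = {j: p for j, p in imprecision[i].items() if j != i}
        total = sum(others.values()) or 1.0
        for j, p in others.items():
            share = beta * v * p / total
            revised[frozenset({j})] = revised.get(frozenset({j}), 0.0) + 0.5 * share
            meta = frozenset({i, j})
            revised[meta] = revised.get(meta, 0.0) + 0.5 * share
    return revised


# Two classifiers provide Bayesian evidence over the frame {"a", "b", "c"}.
m1 = {frozenset({"a"}): 0.7, frozenset({"b"}): 0.2, frozenset({"c"}): 0.1}
m2 = {frozenset({"a"}): 0.6, frozenset({"b"}): 0.3, frozenset({"c"}): 0.1}

# Toy neighbor-based imprecision matrix (diagonal = estimated accuracy).
imprecision = {
    "a": {"a": 0.80, "b": 0.15, "c": 0.05},
    "b": {"a": 0.10, "b": 0.80, "c": 0.10},
    "c": {"a": 0.05, "b": 0.15, "c": 0.80},
}

fused = dempster_combine(redistribute(m1, imprecision), redistribute(m2, imprecision))
print(sorted(fused.items(), key=lambda kv: -kv[1])[:3])
```

In the fused output, mass committed to meta-classes such as {a, b} quantifies the partial imprecision that the redistribution preserves instead of forcing a hard choice between the two classes.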
