On the Use of Min-Based Revision Under Uncertain Evidence for Possibilistic Classifiers

Possibilistic networks, which are compact representations of possibility distributions, are powerful tools for representing and reasoning with uncertain and incomplete knowledge. Depending on the operator on which conditioning is based, there are two possibilistic settings: quantitative and qualitative. This paper deals with qualitative possibilistic network classifiers under uncertain inputs. More precisely, we first present and analyze Jeffrey's rule for revising possibility distributions by uncertain observations in the qualitative setting. Then, we propose an efficient algorithm for revising possibility distributions encoded by naive possibilistic networks for classification purposes. This algorithm consists of a series of efficient and equivalent transformations of the initial naive possibilistic classifier.

Keywords— Min-based possibilistic networks, classification under uncertain inputs
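To make the qualitative setting concrete, the following is a minimal sketch of min-based conditioning and a Jeffrey-style revision of a possibility distribution by uncertain evidence. The function names, the toy distribution, and the encoding of worlds as strings are illustrative assumptions, not taken from the paper; the revised degree of each world is computed as the minimum of the target degree of its evidence event and its min-conditioned degree.

```python
# Illustrative sketch (not the paper's algorithm): min-based conditioning
# and qualitative Jeffrey-style revision of a possibility distribution,
# represented here as a dict from worlds to possibility degrees.

def poss(pi, event):
    """Possibility measure Pi(event) = max of pi over the event's worlds."""
    return max(pi[w] for w in event)

def condition_min(pi, event):
    """Min-based conditioning pi(. |m event), assuming Pi(event) > 0."""
    m = poss(pi, event)
    out = {}
    for w, p in pi.items():
        if w not in event:
            out[w] = 0.0      # worlds outside the event are impossible
        elif p == m:
            out[w] = 1.0      # best worlds inside the event get degree 1
        else:
            out[w] = p        # other worlds keep their original degree
    return out

def jeffrey_min(pi, evidence):
    """Qualitative Jeffrey-style revision: `evidence` is a list of
    (event, lam) pairs over a partition of the worlds; the revised
    degree of w in event_i is min(lam_i, pi(w |m event_i))."""
    out = {}
    for event, lam in evidence:
        cond = condition_min(pi, event)
        for w in event:
            out[w] = min(lam, cond[w])
    return out

# Toy example over four worlds (a, b are two boolean attributes).
pi = {"ab": 1.0, "a~b": 0.7, "~ab": 0.4, "~a~b": 0.2}
A = {"ab", "a~b"}           # worlds where a holds
notA = {"~ab", "~a~b"}      # worlds where a does not hold
revised = jeffrey_min(pi, [(A, 0.3), (notA, 1.0)])
```

In this toy run the revised distribution satisfies the evidence constraints: the possibility of `A` becomes the prescribed 0.3 and the possibility of `notA` becomes 1.0, so the result stays normalized.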