The relative performance of different methods for classifier learning varies across domains. Some recent Instance Based Learning (IBL) methods, such as IB1-MVDM* [10], use similarity measures based on conditional class probabilities. These probabilities are also a key component of Naive Bayes methods. Given this commonality of approach, it is of interest to consider how the differences between the two methods relate to their relative performance in different domains. Here we interpret Naive Bayes in an IBL-like framework, identifying differences between Naive Bayes and IB1-MVDM* within that framework. Experiments on variants of IB1-MVDM* that lie between it and Naive Bayes in the framework are conducted on sixteen domains. The results strongly suggest that the relative performance of Naive Bayes and IB1-MVDM* is linked to the extent to which each class can be satisfactorily represented by a single instance in the IBL framework. However, this is not the only factor that appears significant.
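The abstract does not specify IB1-MVDM* itself, but the similarity measure it alludes to is based on the value difference metric, in which the distance between two values of an attribute is computed from the conditional class probabilities those values induce. A minimal sketch, assuming the common formulation delta(v1, v2) = sum over classes c of |P(c|v1) - P(c|v2)| with probabilities estimated from raw class counts (the starred variant and its weighting scheme are not reconstructed here):

```python
from collections import defaultdict

def mvdm_value_distance(examples, attr_index, v1, v2):
    """Value difference between two values of one attribute:
    sum over classes c of |P(c|v1) - P(c|v2)|, with the conditional
    probabilities estimated from class counts in `examples`
    (each example is a (attribute-values, class-label) pair).
    """
    counts = defaultdict(lambda: defaultdict(int))  # value -> class -> count
    for attrs, cls in examples:
        counts[attrs[attr_index]][cls] += 1

    classes = {cls for _, cls in examples}

    def p(c, v):
        total = sum(counts[v].values())
        return counts[v][c] / total if total else 0.0

    return sum(abs(p(c, v1) - p(c, v2)) for c in classes)

# Toy data: one attribute, two classes.
data = [(["red"], "pos"), (["red"], "pos"), (["red"], "neg"),
        (["blue"], "neg"), (["blue"], "neg"), (["green"], "pos")]
print(mvdm_value_distance(data, 0, "red", "blue"))  # 2/3 + 2/3 = 4/3
```

Two values are close under this metric when they make similar predictions about the class, which is exactly the quantity Naive Bayes estimates per attribute; this shared reliance on P(c|v) is the commonality the abstract builds on.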