Associative classification is a promising technique for generating highly precise classifiers. Previous works propose several clever techniques to prune the huge set of generated rules, with the twofold aim of selecting a small set of high-quality rules and reducing the chance of overfitting. In this paper, we argue that pruning should be reduced to a minimum and that the availability of a large rule base may improve the precision of the classifier without affecting its performance. In L³ (Live and Let Live), a new algorithm for associative classification, a lazy pruning technique iteratively discards only those rules that yield exclusively wrong classifications of training cases. Classification is performed in two steps. Initially, rules which have already correctly classified at least one training case, sorted by confidence, are considered. If the case is still unclassified, the remaining rules (unused during the training phase) are considered, again sorted by confidence. Extensive experiments on 26 databases from the UCI machine learning repository show that L³ improves classification precision with respect to previous approaches.
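The two-step classification described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: rules are assumed to be (item set, label, confidence) tuples, and the partition into "used" rules (those that correctly classified at least one training case) and "spare" rules is taken from the abstract; all names are hypothetical.

```python
def classify(case, used_rules, spare_rules):
    """Return the label of the first matching rule, consulting the
    used rules (level I) before the spare rules (level II).

    case        -- frozenset of items describing the test case
    used_rules  -- rules that correctly classified >= 1 training case
    spare_rules -- remaining rules, untouched during training
    Each rule is a (condition, label, confidence) tuple.
    """
    for rules in (used_rules, spare_rules):
        # within each level, consider rules in decreasing confidence order
        for condition, label, confidence in sorted(rules, key=lambda r: -r[2]):
            if condition <= case:  # rule body is a subset of the case's items
                return label
    return None  # the case remains unclassified
```

A spare rule fires only when no used rule covers the case, which is how a large, lightly pruned rule base can raise coverage without disturbing the high-quality rules consulted first.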