In Rough Set theory, the minimal reduct of features is a criterion for selecting the best feature subset based on its ability to discriminate objects. In this paper, a multi-class feature ensemble learning algorithm based on feature reducts is presented. The algorithm maintains a weight distribution over the training set, which is used to compute a minimal approximate reduct of the features in each iteration. Weak classifiers are constructed from the minimal approximate reduct, and the weight distribution is updated according to the training examples that have been misclassified. The ensemble classifier is constructed by a weighted vote of all weak classifiers. Test results on several datasets show that the algorithm achieves high predictive accuracy and strong generalization ability.
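The loop described above can be sketched as an AdaBoost-style procedure. This is a simplified binary-class illustration, not the paper's algorithm: the `weighted_reduct` function below is a hypothetical weighted-discernibility heuristic standing in for the rough-set minimal approximate reduct computation, and the weak learners are plain decision stumps.

```python
import math

def stump_predict(x, feature, threshold, polarity):
    """Decision stump: predict +polarity if x[feature] >= threshold."""
    return polarity if x[feature] >= threshold else -polarity

def train_stump(X, y, w, features):
    """Find the stump over the given features minimizing weighted error."""
    best = None
    for f in features:
        for t in sorted({row[f] for row in X}):
            for pol in (1, -1):
                err = sum(wi for xi, yi, wi in zip(X, y, w)
                          if stump_predict(xi, f, t, pol) != yi)
                if best is None or err < best[0]:
                    best = (err, f, t, pol)
    return best

def weighted_reduct(X, y, w, k):
    """Hypothetical stand-in for the minimal approximate reduct: score each
    feature by how often it discerns differently-labelled example pairs,
    weighting each pair by the current boosting weights; keep the top k."""
    n = len(X[0])
    scores = [0.0] * n
    for i in range(len(X)):
        for j in range(i + 1, len(X)):
            if y[i] != y[j]:
                for f in range(n):
                    if X[i][f] != X[j][f]:
                        scores[f] += w[i] * w[j]
    return sorted(range(n), key=lambda f: -scores[f])[:k]

def boost(X, y, rounds=5, k=2):
    """Boosting loop: recompute the feature subset from the weight
    distribution each round, fit a weak classifier on it, and upweight
    misclassified examples (labels are +1/-1)."""
    n = len(X)
    w = [1.0 / n] * n
    ensemble = []
    for _ in range(rounds):
        feats = weighted_reduct(X, y, w, k)
        err, f, t, pol = train_stump(X, y, w, feats)
        err = max(err, 1e-10)
        if err >= 0.5:          # weak learner no better than chance: stop
            break
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, f, t, pol))
        # AdaBoost-style update: raise weights of misclassified examples
        w = [wi * math.exp(-alpha * yi * stump_predict(xi, f, t, pol))
             for xi, yi, wi in zip(X, y, w)]
        z = sum(w)
        w = [wi / z for wi in w]
    return ensemble

def predict(ensemble, x):
    """Weighted vote of all weak classifiers."""
    s = sum(a * stump_predict(x, f, t, p) for a, f, t, p in ensemble)
    return 1 if s >= 0 else -1
```

A multi-class version, as in the paper, would replace the +1/-1 labels and the two-sided vote with a per-class weighted vote; the structure of the loop is otherwise the same.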