The idea of using a team of weak learners to learn a dataset has proved successful in the literature. In this paper, we explore a recursive, incremental approach to ensemble learning. Patterns are clustered according to the output space of the problem, i.e., natural clusters are formed from the patterns belonging to each class. This yields a combinatorial optimization problem, which is solved using evolutionary algorithms that identify the "easy" and the "difficult" clusters in the system. Removing the easy patterns then allows focused learning of the more complicated ones: neural networks are added incrementally to the ensemble, each concentrating on successively more difficult examples, so the problem becomes recursively simpler. Overfitting is avoided by using a set of validation patterns together with a pattern distributor. We also propose an algorithm that uses the pattern distributor to determine the optimal number of recursions, and hence the optimal number of weak learners, for the problem. We show that the generalization accuracy of the proposed algorithm is always better than that of the underlying weak learner, and empirical studies show generally good performance compared with other state-of-the-art methods.
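The recursive loop described above can be sketched as follows. This is only an illustrative toy, not the paper's method: the weak learner here is a nearest-centroid classifier rather than a neural network, "easy" patterns are simply those the current stage already classifies correctly rather than evolutionarily selected clusters, and the final vote is a crude stand-in for the pattern distributor. All function names are hypothetical.

```python
def train_weak_learner(patterns, labels):
    """Nearest-centroid stand-in for the paper's neural-network weak learner."""
    groups = {}
    for x, y in zip(patterns, labels):
        groups.setdefault(y, []).append(x)
    centroids = {y: tuple(sum(c) / len(c) for c in zip(*xs))
                 for y, xs in groups.items()}
    def predict(x):
        # assign x to the class with the nearest centroid
        return min(centroids, key=lambda y: sum((a - b) ** 2
                                                for a, b in zip(x, centroids[y])))
    return predict

def recursive_ensemble(patterns, labels, max_stages=5):
    """Each stage learns the remaining data; the correctly classified
    ("easy") patterns are removed so the next stage focuses on the hard
    ones. The paper instead picks the number of stages with a validation
    set and the pattern distributor; here we just cap it at max_stages."""
    stages = []
    remaining = list(zip(patterns, labels))
    for _ in range(max_stages):
        if not remaining:
            break  # every pattern is handled by some stage
        xs, ys = zip(*remaining)
        learner = train_weak_learner(xs, ys)
        stages.append(learner)
        # keep only the patterns this stage still gets wrong
        remaining = [(x, y) for x, y in remaining if learner(x) != y]
    return stages

def predict(stages, x):
    """Majority vote over the stages (the paper's pattern distributor
    routes each pattern to a stage instead of polling all of them)."""
    votes = [stage(x) for stage in stages]
    return max(set(votes), key=votes.count)

# tiny illustrative dataset
X = [(0.0, 0.0), (0.1, 0.2), (1.0, 1.0), (0.9, 1.1), (0.0, 1.0)]
y = [0, 0, 1, 1, 1]
stages = recursive_ensemble(X, y)
```

Because each stage trains only on the patterns its predecessors got wrong, later stages see a strictly smaller, harder subset of the data, which is the sense in which the problem "becomes recursively simpler".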