Mixtures of Heterogeneous Experts

No single machine learning algorithm is the most accurate for all problems, owing to the effect of each algorithm's inductive bias. Research has shown that combining experts of the same type, referred to as a mixture of homogeneous experts, can increase the accuracy of ensembles by reducing the adverse effect of an algorithm's inductive bias. However, the predictive power of a mixture of homogeneous experts is still limited by the inductive bias of the single algorithm that makes up the mixture. For this reason, combinations of different machine learning algorithms, referred to as mixtures of heterogeneous experts, have been proposed to exploit the strengths of different algorithms and to reduce the adverse effects of their inductive biases. This paper presents a mixture of heterogeneous experts and compares its performance with that of a number of mixtures of homogeneous experts on a set of classification problems. The results indicate that a mixture of heterogeneous experts aggregates the advantages of its constituent experts, increasing predictive accuracy. The mixture of heterogeneous experts not only outperformed all homogeneous ensembles on two of the datasets, but also achieved the best overall accuracy rank across the datasets.
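To make the idea concrete, the following is a minimal sketch of a heterogeneous ensemble built with scikit-learn's VotingClassifier. It illustrates the general technique of combining base learners with different inductive biases, not the paper's specific aggregation method; the choice of base learners, hyperparameters, and the iris dataset are assumptions made purely for demonstration.

```python
# Sketch only: a heterogeneous ensemble via scikit-learn's VotingClassifier,
# compared against a homogeneous ensemble of a single algorithm type.
# This is an illustration of the general idea, not the paper's exact method.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_iris(return_X_y=True)

# Heterogeneous experts: each base learner has a different inductive bias.
experts = [
    ("lr", make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))),
    ("knn", make_pipeline(StandardScaler(), KNeighborsClassifier())),
    ("nb", GaussianNB()),
    ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
]

# Soft voting averages the experts' predicted class probabilities.
mixture = VotingClassifier(estimators=experts, voting="soft")

print("heterogeneous mixture:",
      cross_val_score(mixture, X, y, cv=5).mean())

# Baseline: a homogeneous ensemble built from one algorithm type.
homogeneous = RandomForestClassifier(n_estimators=100, random_state=0)
print("homogeneous (random forest):",
      cross_val_score(homogeneous, X, y, cv=5).mean())
```

Soft voting is used here so that the mixture combines the experts' probability estimates rather than their hard labels; hard majority voting (voting="hard") is the simpler alternative when some base learners do not expose calibrated probabilities.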
