Complexity Approximation of Classification Task for Large Dataset Ensemble Artificial Neural Networks

In this paper, an operational and complexity analysis model for ensemble Artificial Neural Network (ANN) multiple-classifier systems is investigated. The main idea is that the complexity and computational burden of classifying a large dataset can be moderated by partitioning the data into parallel tasks and combining the resulting classifiers to enhance overall capability. The complexities of a single ANN and of an ensemble ANN are obtained from estimates of the upper bounds of the converged functional error under dataset partitioning. The estimates, derived using the Apriori method, show that such an ensemble ANN approach is feasible: problems with a high number of inputs and classes can be solved with time complexity \( O(n^k) \) for some \( k \), i.e., in polynomial time. This result is consistent with the good performance achieved when a diversity rule is applied together with a reordering technique. In conclusion, an ensemble of heterogeneous ANN classifiers is practical and relevant to the theoretical and experimental study of combiners for ensemble ANN classifier systems on large datasets.
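The partition-train-combine scheme summarized above can be illustrated with a minimal sketch, not the authors' implementation: the training data is split into disjoint partitions, a small ANN is trained on each partition (the step that could run in parallel), and member predictions are combined by majority vote. The synthetic dataset, network sizes, and voting combiner below are illustrative assumptions.

```python
# Minimal sketch of a partition-and-combine ensemble of ANN classifiers.
# Illustrative only: the synthetic dataset, network sizes, and majority-vote
# combiner are assumptions, not the paper's experimental setup.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for a large multi-class dataset.
X, y = make_classification(n_samples=5000, n_features=20, n_classes=4,
                           n_informative=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Partition the training data; each partition trains its own ANN member.
n_partitions = 5
ensemble = []
for X_part, y_part in zip(np.array_split(X_train, n_partitions),
                          np.array_split(y_train, n_partitions)):
    net = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
    net.fit(X_part, y_part)          # each member sees only 1/n of the data
    ensemble.append(net)

# Combine member outputs by majority vote over the test set.
votes = np.stack([net.predict(X_test) for net in ensemble])
majority = np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)
print("ensemble accuracy:", (majority == y_test).mean())
```

Training each member on a fraction of the data is what moderates the per-network cost; in this sketch, diversity among members comes only from the differing partitions, whereas the paper additionally relies on a reordering technique and heterogeneous member networks.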
