A Comparative Performance Study on a Hybrid Swarm Model for Microarray Data

Cancer classification based on microarray data is an important problem. Prediction models are used for classification, which improves diagnosis procedures and aids the physician's effort. A hybrid swarm model for microarray data, built on nature-inspired metaheuristic algorithms, is proposed and evaluated. The Firefly Algorithm (FA) is among the most powerful optimization algorithms for multimodal applications. In this paper, a Flexible Neural Tree (FNT) model for microarray data is constructed using nature-inspired algorithms: the FNT structure is developed with Ant Colony Optimization (ACO), and the parameters embedded in the neural tree are optimized with the Firefly Algorithm (FA). FA has been reported to outperform existing metaheuristic algorithms and to handle multimodal optimization problems well. In this research, the proposed model is compared against alternative models to identify the most appropriate one in terms of accuracy and error rate.
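
As a rough illustration of the parameter-tuning step described above, the sketch below shows a minimal Firefly Algorithm loop in Python that could tune the weight vector embedded in an already-constructed neural tree. This is a generic FA implementation under stated assumptions, not the paper's code: the function name firefly_optimize, the objective tree_error, and all hyperparameter values are illustrative assumptions.

```python
import numpy as np

def firefly_optimize(objective, dim, n_fireflies=20, n_iter=100,
                     alpha=0.2, beta0=1.0, gamma=1.0, bounds=(-1.0, 1.0)):
    """Minimise `objective` over a `dim`-dimensional parameter vector
    using a standard Firefly Algorithm (a sketch, not the paper's code)."""
    low, high = bounds
    # Random initial population of candidate parameter vectors.
    pop = np.random.uniform(low, high, size=(n_fireflies, dim))
    fitness = np.array([objective(x) for x in pop])

    for _ in range(n_iter):
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                # Move firefly i toward any brighter (lower-cost) firefly j.
                if fitness[j] < fitness[i]:
                    r = np.linalg.norm(pop[i] - pop[j])
                    # Attractiveness decays with squared distance.
                    beta = beta0 * np.exp(-gamma * r ** 2)
                    step = alpha * (np.random.rand(dim) - 0.5)
                    pop[i] = np.clip(pop[i] + beta * (pop[j] - pop[i]) + step,
                                     low, high)
                    fitness[i] = objective(pop[i])

    best = np.argmin(fitness)
    return pop[best], fitness[best]

# Hypothetical usage: tune the parameters of a fixed neural-tree structure,
# where `tree_error` (assumed) returns the classification error on training
# data for a given parameter vector of length n_parameters.
# best_params, best_err = firefly_optimize(tree_error, dim=n_parameters)
```

In this setting, the ACO-evolved tree structure would stay fixed while the objective evaluates how well a candidate parameter vector classifies the microarray samples; the design choice of separating structure search from parameter tuning mirrors the two-stage scheme described in the abstract.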
