Chaotic fractal walk trainer for sonar data set classification using multi-layer perceptron neural network and its hardware implementation

Abstract: This study proposes the use of the recently developed Stochastic Fractal Search (SFS) algorithm for training multi-layer perceptron (MLP) neural networks to design an evolutionary classifier. Evolutionary classifiers often suffer from slow convergence, entrapment in local minima, and an inability to classify in real time. This paper also uses four chaotic maps to improve the performance of SFS; the modified version is called the Chaotic Fractal Walk Trainer (CFWT). To assess the performance of the proposed classifiers, the networks are evaluated on two benchmark datasets and a high-dimensional practical sonar dataset, and the results are compared with four popular meta-heuristic trainers. The results show that the new classifiers outperform the benchmark algorithms in terms of entrapment in local minima, classification accuracy, and convergence speed. The designed classifier is also implemented on a Field Programmable Gate Array (FPGA) to test the real-time processing capability of the proposed method. The results of this real application show that the designed classifiers are applicable to challenging high-dimensional problems with unknown search spaces.
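The core idea, as described above, is to replace the stochastic draws that drive the fractal-search walk with values generated by a chaotic map, and to treat each candidate solution as the flattened weight vector of an MLP scored by classification error. The following is a minimal sketch of that idea, assuming a logistic map as the chaotic source, a single-hidden-layer sigmoid MLP, and a simplified diffusion-plus-greedy-update loop in place of the full SFS procedure; all function and parameter names (logistic_map, mlp_error, cfwt_train, n_hidden, pop) are illustrative and not taken from the paper.

```python
import numpy as np

def logistic_map(x, mu=4.0):
    """One iteration of the logistic chaotic map on (0, 1)."""
    return mu * x * (1.0 - x)

def mlp_error(weights, X, y, n_hidden):
    """Mean squared error of a one-hidden-layer sigmoid MLP (fitness function)."""
    n_in = X.shape[1]
    w1 = weights[: n_in * n_hidden].reshape(n_in, n_hidden)
    b1 = weights[n_in * n_hidden : n_in * n_hidden + n_hidden]
    w2 = weights[n_in * n_hidden + n_hidden : -1]
    b2 = weights[-1]
    h = 1.0 / (1.0 + np.exp(-(X @ w1 + b1)))
    out = 1.0 / (1.0 + np.exp(-(h @ w2 + b2)))
    return np.mean((out - y) ** 2)

def cfwt_train(X, y, n_hidden=5, pop=20, iters=100, seed=0):
    """Illustrative chaotic fractal-walk training loop (not the authors' code)."""
    rng = np.random.default_rng(seed)
    dim = X.shape[1] * n_hidden + n_hidden + n_hidden + 1
    agents = rng.uniform(-1.0, 1.0, size=(pop, dim))
    fitness = np.array([mlp_error(a, X, y, n_hidden) for a in agents])
    chaos = rng.uniform(0.1, 0.9)            # chaotic state shared by the population
    best = agents[np.argmin(fitness)].copy()

    for t in range(iters):
        sigma = 1.0 - t / iters              # shrinking step size, as in SFS diffusion
        for i in range(pop):
            chaos = logistic_map(chaos)      # chaotic value instead of a uniform draw
            # Gaussian walk around the current best, scaled by the chaotic value
            trial = best + sigma * chaos * rng.standard_normal(dim)
            f_trial = mlp_error(trial, X, y, n_hidden)
            if f_trial < fitness[i]:         # greedy replacement (simplified update step)
                agents[i], fitness[i] = trial, f_trial
        best = agents[np.argmin(fitness)].copy()
    return best, fitness.min()

# Usage on a toy binary problem (a stand-in for the sonar benchmark):
if __name__ == "__main__":
    X = np.random.default_rng(1).normal(size=(100, 8))
    y = (X[:, 0] + X[:, 1] > 0).astype(float)
    weights, err = cfwt_train(X, y)
    print("training MSE:", err)
```

The only change relative to a plain SFS-style trainer is the source of the step-size multiplier: the logistic map produces a deterministic but non-repeating sequence, which is the mechanism the paper credits for reducing entrapment in local minima.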
