Influence of kernel clustering on an RBFN

The classical radial basis function network (RBFN) is widely used to process non-linearly separable data sets through the introduction of activation functions. However, the parameters of these activation functions are usually set randomly, without regard to the distribution of the patterns. To address this issue, some scholars have introduced kernel clustering into the RBFN, so that the clustering results determine the parameters of the activation functions. Building on the original kernel clustering, this study further discusses how an RBFN is influenced when the kernel-clustering settings change. These changes involve different kernel-clustering methods [bubble sort (BS) and escape nearest outlier (ENO)], multiple kernel-clustering criteria (static and dynamic), and so on. Experimental results validate that, by taking the distribution of patterns into account and varying the kernel-clustering settings, the performance of an RBFN improves and becomes better suited to the corresponding data sets. Moreover, although BS always costs more time than ENO, it produces more suitable clustering results. Furthermore, the dynamic criterion always costs much more time than the static one, but it yields fewer kernels.
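
As a rough illustration of the idea, the sketch below shows an RBFN whose Gaussian-kernel centers and widths are derived from a clustering of the training patterns rather than set randomly. Since the BS and ENO clustering methods are not detailed here, k-means stands in for them, and the width heuristic (mean intra-cluster distance) is an assumption, not the paper's rule.

```python
# Minimal sketch: RBFN with clustering-derived kernel parameters.
# k-means is a stand-in for the paper's BS/ENO clustering methods.
import numpy as np
from sklearn.cluster import KMeans

class ClusteredRBFN:
    """RBFN whose kernel parameters come from clustering the patterns."""

    def __init__(self, n_kernels=10):
        self.n_kernels = n_kernels

    def fit(self, X, y):
        # Cluster the training patterns; each cluster center becomes an
        # RBF center, so the activation functions reflect the data
        # distribution instead of random placement.
        km = KMeans(n_clusters=self.n_kernels, n_init=10).fit(X)
        self.centers_ = km.cluster_centers_
        # Width heuristic (assumed): mean distance of each cluster's
        # members to its center; the paper's exact rule may differ.
        d = np.linalg.norm(X[:, None, :] - self.centers_[None, :, :], axis=2)
        self.widths_ = np.array([
            d[km.labels_ == k, k].mean() if np.any(km.labels_ == k) else 1.0
            for k in range(self.n_kernels)
        ]) + 1e-8
        # Solve the linear output weights by least squares on the
        # hidden-layer activations.
        H = self._activations(X)
        self.weights_, *_ = np.linalg.lstsq(H, y, rcond=None)
        return self

    def _activations(self, X):
        # Gaussian kernel responses of every pattern to every center.
        d = np.linalg.norm(X[:, None, :] - self.centers_[None, :, :], axis=2)
        return np.exp(-(d ** 2) / (2.0 * self.widths_ ** 2))

    def predict(self, X):
        return self._activations(X) @ self.weights_

# Usage on a toy non-linearly separable problem:
# model = ClusteredRBFN(n_kernels=8).fit(X_train, y_train)
# preds = model.predict(X_test)
```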
