Multicolumn RBF Network

This paper proposes the multicolumn RBF network (MCRN) as a method to improve the accuracy and speed of a traditional radial basis function network (RBFN). The RBFN, as a fully connected artificial neural network (ANN), suffers from costly kernel inner-product calculations because many training instances are used as the centers of its hidden units. This issue is not critical for small datasets, since the few hidden units required add little to the computation time. For larger datasets, however, the RBFN needs many hidden units, and therefore many kernel computations, to generalize the problem. The MCRN is constructed by dividing a dataset into smaller subsets using the k-d tree algorithm. The $N$ resulting subsets are treated as separate training sets, each used to train one individual RBFN. These small RBFNs are stacked in parallel and combined into the MCRN structure during testing. The MCRN is an easy-to-use parallel structure, because each individual ANN is trained on its own subset and is completely separate from the other ANNs. This parallelized structure reduces the testing time compared with that of a single but larger RBFN, which cannot easily be parallelized because of its fully connected structure. Small, informative subsets give the MCRN a regional experience that specializes on the problem instead of generalizing over it. The MCRN has been tested on many benchmark datasets and shows better accuracy and large improvements in training and testing times compared with a single RBFN. The MCRN also compares favorably with machine learning techniques such as the support vector machine and k-nearest neighbors.
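The pipeline described above (k-d tree partitioning, one small RBFN per subset, parallel columns at test time) can be illustrated with a short sketch. The code below is a minimal illustration of that idea, assuming Gaussian kernels, subset instances used directly as centers, output weights fitted by least squares, and a simple median-split k-d partition; routing a test point to the column with the nearest subset centroid is an assumption for illustration, not necessarily the paper's exact gating rule.

```python
# Minimal MCRN-style sketch (assumptions: Gaussian kernels, subset instances as
# centers, least-squares output weights, centroid-based routing of test points).
import numpy as np

def kd_partition(X, y, depth=0, min_size=50):
    """Recursively split (X, y) with median cuts on alternating axes."""
    if len(X) <= min_size:
        return [(X, y)]
    axis = depth % X.shape[1]
    med = np.median(X[:, axis])
    left = X[:, axis] <= med
    if left.all() or (~left).all():          # degenerate split: stop here
        return [(X, y)]
    return (kd_partition(X[left], y[left], depth + 1, min_size) +
            kd_partition(X[~left], y[~left], depth + 1, min_size))

class RBFNColumn:
    """One small RBFN trained on a single k-d tree subset."""
    def __init__(self, sigma=1.0):
        self.sigma = sigma

    def _phi(self, X):
        # Gaussian kernel activations between inputs and this column's centers
        d2 = ((X[:, None, :] - self.centers[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * self.sigma ** 2))

    def fit(self, X, y):
        self.centers = X                      # subset instances as centers
        self.w, *_ = np.linalg.lstsq(self._phi(X), y, rcond=None)
        return self

    def predict(self, X):
        return self._phi(X) @ self.w

class MCRN:
    """Parallel columns of small RBFNs, one per k-d tree subset."""
    def fit(self, X, y, sigma=1.0, min_size=50):
        subsets = kd_partition(X, y, min_size=min_size)
        self.columns = [RBFNColumn(sigma).fit(Xs, ys) for Xs, ys in subsets]
        self.centroids = np.array([Xs.mean(0) for Xs, _ in subsets])
        return self

    def predict(self, X):
        # Route each test point to the column whose subset centroid is closest
        idx = ((X[:, None, :] - self.centroids[None, :, :]) ** 2).sum(-1).argmin(1)
        return np.array([self.columns[i].predict(x[None])[0]
                         for x, i in zip(X, idx)])
```

In this sketch, each query touches only one small column, so the per-query kernel cost is bounded by the subset size rather than the full training set, and the independent columns can be evaluated in parallel; this mirrors the testing-time savings claimed in the abstract.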
