New Architecture of Correlated Weights Neural Network for Global Image Transformations

The paper describes a new extension of the convolutional neural network concept. Like a CNN, the developed network uses related rather than independent weights for each neuron, which keeps the number of parameters optimized during learning small and makes the network highly resistant to overtraining. Unlike a CNN, however, the network does not share weights; instead, it uses weights correlated with the coordinates of a neuron and its inputs, computed by a dedicated subnet. This allows a neural layer of the network to perform global transformations of patterns, which is unachievable for convolutional layers. The new network concept has been verified by demonstrating its ability to perform typical affine image transformations such as translation, scaling, and rotation.
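The core idea of generating each connection weight from the coordinates of the output neuron and the input it connects to can be sketched as follows. This is a minimal illustration, not the authors' implementation: the subnet architecture, layer sizes, and variable names are assumptions chosen for clarity. The point is that only the subnet's parameters are free; the full weight matrix is a deterministic function of coordinates.

```python
import numpy as np

rng = np.random.default_rng(0)

def weight_subnet(coords, W1, b1, W2, b2):
    # Small MLP mapping each (output, input) coordinate pair to one scalar weight.
    h = np.tanh(coords @ W1 + b1)
    return h @ W2 + b2  # shape: (n_pairs, 1)

# Hypothetical sizes: a 4x4 input pattern fully connected to a 4x4 output layer.
side = 4
n = side * side
# Normalized 2-D coordinates of every input pixel / output neuron.
grid = np.stack(np.meshgrid(np.linspace(0, 1, side),
                            np.linspace(0, 1, side)), axis=-1).reshape(n, 2)

# One coordinate vector per (output neuron, input) pair: (x_out, y_out, x_in, y_in).
pairs = np.concatenate([np.repeat(grid, n, axis=0),
                        np.tile(grid, (n, 1))], axis=1)   # shape (n*n, 4)

# Subnet parameters -- these 49 values, not the 256 connection weights,
# are what the learning process would optimize.
hidden = 8
W1 = rng.normal(scale=0.5, size=(4, hidden)); b1 = np.zeros(hidden)
W2 = rng.normal(scale=0.5, size=(hidden, 1)); b2 = np.zeros(1)

# The full weight matrix is *generated* from coordinates, not stored freely.
W = weight_subnet(pairs, W1, b1, W2, b2).reshape(n, n)

x = rng.normal(size=n)   # flattened input pattern
y = np.tanh(W @ x)       # one forward pass through the correlated-weights layer
print(W.shape, y.shape)
```

Because the weights depend on absolute coordinates rather than only on relative offsets (as a convolution's shared kernel does), such a layer can in principle realize position-dependent mappings like rotation or scaling of the whole pattern.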
