Improving the Accuracy of Binarized Neural Networks and Application on Remote Sensing Data

Deep neural networks are well known to achieve outstanding results in many domains, and many researchers have recently introduced them into remote sensing (RS) data processing. However, typical RS data are enormous in scale, and processing them with deep neural networks demands substantial computing hardware. Most high-performance deep neural networks have highly complex structures with many parameters, which restricts their deployment for real-time processing on satellites. Many researchers have attempted to overcome this obstacle by reducing network complexity. One promising approach that can dramatically reduce both computational complexity and memory usage is network binarization. In this letter, by analyzing the learning behavior of binarized neural networks (BNNs), we propose several novel strategies for improving BNN performance. Empirical experiments show that these strategies are effective for image classification on both small- and large-scale data sets. We also test BNNs on a remote sensing data set with positive results. A detailed discussion and preliminary analysis of the training strategies are provided.
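To make the idea of network binarization concrete, the following is a minimal illustrative sketch (not the letter's actual method) of binarizing a full-precision weight matrix with the sign function and a per-row scaling factor, in the spirit of BinaryConnect/XNOR-Net-style approaches; the variable names and the choice of scaling are assumptions for illustration only.

```python
import numpy as np

# Real-valued weight matrix standing in for one layer of a network.
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 8)).astype(np.float32)

# Binarize: keep only the sign, so each weight needs 1 bit instead of 32.
W_bin = np.where(W >= 0, 1.0, -1.0).astype(np.float32)

# Optional per-row scaling factor (mean absolute value), so that
# W is approximated as alpha * sign(W).
alpha = np.abs(W).mean(axis=1, keepdims=True)
W_approx = alpha * W_bin

# Rough memory comparison: 32-bit floats vs. 1 bit per weight
# (ignoring the small overhead of storing alpha).
full_bits = W.size * 32
bin_bits = W.size * 1
ratio = bin_bits / full_bits  # 1/32 of the storage
```

At inference time, multiplications against {-1, +1} weights can be replaced by sign flips (or XNOR/popcount operations on packed bits), which is the source of the dramatic speed and memory savings the abstract refers to.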
