Convolutional Neural Networks (CNNs) have recently achieved state-of-the-art performance on handwritten Chinese character recognition (HCCR). However, most CNN models employ the SoftMax activation function and minimize the cross entropy loss, which may cause a loss of inter-class information. To cope with this problem, we propose to combine cross entropy with a similarity ranking function and use the combination as the loss function. The experimental results show that this combined loss function produces higher accuracy in HCCR. This report briefly reviews the cross entropy loss function and a typical similarity ranking function, the Euclidean distance, and also proposes a new similarity ranking function, the average variance similarity. Experiments are carried out to compare the performance of a CNN model under the three loss functions. In the end, SoftMax cross entropy combined with the average variance similarity produces the highest accuracy on handwritten Chinese character recognition.
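To make the idea concrete, the sketch below shows one common way to add a similarity ranking term to SoftMax cross entropy, using the Euclidean distance between feature vectors as the ranking criterion. It is a minimal illustration only: the abstract does not give the paper's exact formulation, and the weight `lam`, the margin, the contrastive-style hinge, and the function name `combined_loss` are assumptions made for this sketch (written against PyTorch).

```python
import torch
import torch.nn.functional as F

def combined_loss(features, logits, labels, lam=0.1, margin=1.0):
    """Cross entropy plus a Euclidean-distance similarity ranking term.

    A sketch only: lam, margin, and the exact form of the ranking term
    are assumptions, not the paper's formulation.
    """
    # Standard SoftMax cross entropy on the classifier outputs.
    ce = F.cross_entropy(logits, labels)

    # Pairwise Euclidean distances between feature vectors in the batch.
    dist = torch.cdist(features, features)  # shape (B, B)

    # Masks: pairs sharing a label (excluding self-pairs) and pairs
    # with different labels.
    same = labels.unsqueeze(0) == labels.unsqueeze(1)
    eye = torch.eye(len(labels), dtype=torch.bool, device=labels.device)
    pos = same & ~eye
    neg = ~same

    # Pull same-class features together; push different-class features
    # at least `margin` apart (contrastive-style hinge).
    pull = dist[pos].pow(2).mean() if pos.any() else dist.new_zeros(())
    push = F.relu(margin - dist[neg]).pow(2).mean() if neg.any() else dist.new_zeros(())

    return ce + lam * (pull + push)

# Toy usage: B=8 samples, D=64 feature dims, C=10 classes.
feats = torch.randn(8, 64)
logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = combined_loss(feats, logits, labels)
```

Here `lam` balances classification against feature-space structure: the ranking term draws same-class features together and separates different-class features, which is what recovers the inter-class information that plain SoftMax cross entropy tends to discard.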