A comparative study of different classifiers for handprinted character recognition

In this paper, we present a comparative study of four classifiers for isolated handprinted character recognition: i) a nearest template (NT) classifier, ii) an enhanced nearest template (ENT) classifier, iii) a standard feedforward neural network (FNN) classifier, and iv) a hybrid classifier. The NT classifier is a variant of the nearest-neighbor classifier that stores a small number of templates (or prototypes), together with their statistics, generated by a special clustering algorithm. Motivated by radial basis function networks, the ENT classifier augments the NT classifier with an optimal transform that maps the distances produced by the NT classifier to character categories. The FNN classifier is a three-layer feedforward network (with one hidden layer) trained with the backpropagation algorithm. The hybrid classifier combines the outputs of the FNN and NT classifiers in an efficient way to improve recognition accuracy with only a slight increase in computation. We evaluate the four classifiers in terms of recognition accuracy, top-3 coverage rate, and recognition speed on the NIST isolated lower-case alphabet database. Our experiments show that the FNN classifier outperforms the NT and ENT classifiers on all three evaluation criteria. The hybrid classifier achieves the best recognition accuracy at the cost of only a little extra computation over the FNN classifier, and the ENT classifier can significantly improve the recognition accuracy of the NT classifier when only a small number of templates is used.
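As a rough illustration of the architecture described above, the sketch below implements a minimal nearest-template classifier and a simple score-level combination with a feedforward network's class probabilities. The Euclidean distance metric, the distance-to-score conversion, and the weighted-sum combination rule (parameter alpha) are illustrative assumptions only; the paper's actual clustering algorithm, per-template statistics, and combination scheme are not reproduced here.

    import numpy as np

    class NearestTemplateClassifier:
        """Minimal nearest-template (NT) classifier sketch.

        Stores a small set of templates (e.g. cluster centroids) with their
        class labels and classifies a feature vector by the label of the
        nearest template. Euclidean distance is an assumption; the paper's
        per-template statistics are omitted.
        """

        def __init__(self, templates, labels):
            self.templates = np.asarray(templates)  # shape: (num_templates, dim)
            self.labels = np.asarray(labels)        # shape: (num_templates,)

        def distances(self, x):
            # Euclidean distance from x to every stored template.
            return np.linalg.norm(self.templates - x, axis=1)

        def predict(self, x):
            # The label of the nearest template wins.
            return self.labels[np.argmin(self.distances(x))]


    def hybrid_scores(fnn_probs, nt_distances, nt_labels, num_classes, alpha=0.7):
        """Illustrative score-level combination of FNN and NT outputs.

        Converts NT distances into per-class scores (closer template => higher
        score) and mixes them with the FNN class probabilities by a weighted
        sum. Both the conversion and the weight alpha are assumptions, not the
        paper's combination rule.
        """
        nt_scores = np.zeros(num_classes)
        for c in range(num_classes):
            d = nt_distances[nt_labels == c]
            if d.size:
                nt_scores[c] = 1.0 / (1.0 + d.min())  # monotone in -distance
        if nt_scores.sum() > 0:
            nt_scores /= nt_scores.sum()
        return alpha * fnn_probs + (1.0 - alpha) * nt_scores

Given the combined scores, the top-3 coverage rate mentioned above corresponds to checking whether the true category appears among the three highest-scoring classes (e.g. np.argsort(scores)[-3:]).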
