Classification of reduction invariants with improved backpropagation
Data reduction is a process of feature extraction that transforms the data space into a feature space of much lower dimension than the original, while retaining most of the intrinsic information content of the data. This can be done with a number of methods, such as principal component analysis (PCA), factor analysis, and feature clustering. Principal components are extracted from a collection of multivariate cases so as to account for as much of the variation in that collection as possible with as few variables as possible. The backpropagation network, on the other hand, has been used extensively in classification problems such as the XOR problem, share price prediction, and pattern recognition. This paper proposes an improved error signal for the backpropagation network to classify the reduction invariants obtained by principal component analysis, which extracts the bulk of the useful information present in the moment invariants of handwritten digits and leaves the redundant information behind. Higher-order centralised scale invariants are used to extract features of the handwritten digits before PCA, and the resulting reduction invariants are fed to the improved backpropagation model for classification.

2000 Mathematics Subject Classification: 68T10.

1. Introduction.

The curse of many, if not most, neural network applications is that the number of potentially important variables can be overwhelming. Dealing with a very large number of variables raises two problems: the sheer computational burden can slow even the fastest computers to the point of uselessness, and there can be substantial correlation between the variables. The method of principal components is primarily a data-analytic technique that obtains linear transformations of a group of correlated variables such that certain optimal conditions are achieved; the most important of these conditions is that the transformed variables are uncorrelated [7]. Moment invariants have been proposed as pattern-sensitive features in classification and recognition applications. Hu (1962) was the first to introduce the geometric moment invariants, which are invariant under changes of size, translation, and orientation [2]. Moments and functions of moments can provide characteristics of an object that uniquely represent its shape, and they have been extensively employed as invariant global features of an image in pattern recognition and image classification since the 1960s. This paper discusses the use of principal component analysis to reduce the dimensionality of the invariants for unconstrained handwritten digits, together with an improved error function for the backpropagation model used for classification. The rest of the paper is organised as follows: Section 2 gives a review on moment invariants and
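As a rough illustration of the reduction step described above, the following Python sketch (not taken from the paper; the function and variable names are my own, and the random feature matrix stands in for real moment invariants) applies PCA to a matrix of invariant feature vectors and keeps only the leading components:

```python
import numpy as np

def pca_reduce(features, k):
    """Project feature vectors onto the k leading principal components.

    features: (n_samples, n_features) array of moment invariants.
    Returns the reduced (n_samples, k) representation.
    """
    # Centre the data so the components capture variance, not the mean.
    centred = features - features.mean(axis=0)
    # Covariance matrix of the (possibly correlated) invariant features.
    cov = np.cov(centred, rowvar=False)
    # Eigen-decomposition; the eigenvectors are the principal axes.
    eigvals, eigvecs = np.linalg.eigh(cov)
    # eigh returns eigenvalues in ascending order; take the k largest.
    order = np.argsort(eigvals)[::-1][:k]
    components = eigvecs[:, order]
    # The transformed variables are uncorrelated, as noted in the text.
    return centred @ components

# Hypothetical usage: 100 digit samples, each described by 16 invariants,
# reduced to 4 principal components before classification.
invariants = np.random.rand(100, 16)
reduced = pca_reduce(invariants, k=4)
print(reduced.shape)  # (100, 4)
```

The paper's specific improved error signal is not given in this excerpt, so it is not reproduced here. As a baseline for what it modifies, a conventional backpropagation update for a one-hidden-layer network looks like the following sketch (again with hypothetical names, assuming sigmoid activations and the standard delta rule):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_step(x, target, w1, w2, lr=0.1):
    """One standard backpropagation update for a one-hidden-layer network.

    x: (n_in,) input vector, e.g. reduced invariants from pca_reduce.
    target: (n_out,) desired output. w1: (n_in, n_hid). w2: (n_hid, n_out).
    """
    # Forward pass.
    h = sigmoid(x @ w1)           # hidden activations
    y = sigmoid(h @ w2)           # network output
    # Conventional output error signal: (target - y) * f'(net).
    # The paper proposes an improvement to this term.
    delta_out = (target - y) * y * (1.0 - y)
    # Error signal propagated back to the hidden layer.
    delta_hid = (delta_out @ w2.T) * h * (1.0 - h)
    # Gradient-descent weight updates.
    w2 += lr * np.outer(h, delta_out)
    w1 += lr * np.outer(x, delta_hid)
    return w1, w2
```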
[1] T. Masters, Advanced Algorithms for Neural Networks: A C++ Sourcebook, 1995.
[2] J. E. Jackson, A User's Guide to Principal Components, 1991.
[3] F. Pan et al., A new set of moment invariants for handwritten numeral recognition, Proceedings of the 1st International Conference on Image Processing, 1994.
[4] S. M. Hj. Shamsuddin et al., Improved Scale-Invariant Moments for Deformation Digits, Int. J. Comput. Math., 2000.