Computer Systems That Learn: an Empirical Study of the Effect of Noise on the Performance of Three Classification Methods

Abstract Classification learning systems are useful in many domains. One problem in developing these systems is feature noise. Learning-from-examples classification methods from statistical pattern recognition, machine learning, and connectionist theory are applied to synthetic data sets containing a known percentage of feature noise. Linear discriminant analysis, the C5.0 tree classification algorithm, and a backpropagation neural network tool are used as representative techniques from these three categories. k-Fold cross-validation is used to estimate the sensitivity of the true classification accuracy to the level of feature noise present in the data sets. Results indicate that the backpropagation neural network outperforms both linear discriminant analysis and C5.0 tree classification when appreciable feature noise (10% or more of the cases) is present. These results are confirmed when the same empirical analysis is applied to a real-world data set previously analyzed and reported in the statistical and machine learning literature.
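The sketch below illustrates the kind of experimental design the abstract describes, not the authors' actual setup: synthetic data with a chosen fraction of noise-corrupted cases, three representative classifiers, and k-fold cross-validation to estimate accuracy at each noise level. The Gaussian corruption model, the scikit-learn classifiers (a decision tree standing in for C5.0, an MLP standing in for the backpropagation tool), and all parameter values are assumptions for illustration only.

```python
# Hedged sketch of the study design: compare a linear discriminant, a decision
# tree (stand-in for C5.0), and a backpropagation network (MLP) on synthetic
# data with injected feature noise, using 10-fold cross-validation.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

def add_feature_noise(X, fraction, rng):
    """Corrupt the features of a given fraction of cases with Gaussian noise
    (an assumed noise model; the paper only states a known percentage of cases)."""
    X_noisy = X.copy()
    n_noisy = int(fraction * X.shape[0])
    idx = rng.choice(X.shape[0], size=n_noisy, replace=False)
    X_noisy[idx] += rng.normal(scale=X.std(axis=0), size=(n_noisy, X.shape[1]))
    return X_noisy

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=1000, n_features=10, n_informative=6,
                           random_state=0)

classifiers = {
    "LDA": LinearDiscriminantAnalysis(),
    "Tree (C5.0 stand-in)": DecisionTreeClassifier(random_state=0),
    "Backprop MLP": MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000,
                                  random_state=0),
}

# Estimate cross-validated accuracy at several feature-noise levels.
for noise_level in (0.0, 0.10, 0.20, 0.30):
    X_noisy = add_feature_noise(X, noise_level, rng)
    for name, clf in classifiers.items():
        scores = cross_val_score(clf, X_noisy, y, cv=10)
        print(f"noise={noise_level:.0%}  {name:22s}  acc={scores.mean():.3f}")
```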
