NON-NEGATIVE MATRIX FACTORIZATION AND CLASSIFIERS: EXPERIMENTAL STUDY

Non-negative matrix factorization (NMF) is a recently emerged dimensionality reduction method. Unlike other methods, NMF imposes non-negativity constraints, which allow it to learn parts-based representations of objects. In this paper, we combine NMF with four classifiers (nearest neighbor, kernel nearest neighbor, k-local hyperplane distance nearest neighbor, and support vector machine) in order to investigate how the dimensionality reduction performed by NMF influences classification accuracy and to establish when NMF is not useful. Experiments were conducted on three real-world datasets (Japanese Female Facial Expression, UCI Sonar, and UCI BUPA liver disorders). The first dataset contains face images as patterns, whereas the patterns in the other two are composed of numerical measurements that do not constitute any real physical object when assembled together. The preliminary conclusion is that while NMF proved useful for reducing the dimensionality of face images, it caused a degradation in classification accuracy when applied to the other two datasets. This indicates that NMF may be unsuitable for datasets whose patterns cannot be decomposed into meaningful parts, though this hypothesis requires further, more detailed exploration. As for the classifiers, k-local hyperplane distance nearest neighbor demonstrated very good performance, often outperforming the other classifiers tested.
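To illustrate the dimensionality reduction step described above, the following sketch factorizes a non-negative data matrix X into W (reduced representation, one row per pattern) and H (non-negative basis "parts"), so that X ≈ WH. This uses scikit-learn's NMF implementation as a stand-in; the library, the synthetic data, and all parameter settings here are illustrative assumptions, not the authors' actual experimental setup.

```python
import numpy as np
from sklearn.decomposition import NMF

# Synthetic non-negative data matrix: rows are patterns
# (e.g. vectorized face images), columns are features.
rng = np.random.default_rng(0)
X = rng.random((100, 64))

# Factorize X ~ W @ H with W >= 0 and H >= 0; n_components is the
# reduced dimensionality fed to the downstream classifier.
model = NMF(n_components=10, init="nndsvda", max_iter=400, random_state=0)
W = model.fit_transform(X)   # (100, 10) reduced representation
H = model.components_        # (10, 64) non-negative basis vectors
```

Because both factors are constrained to be non-negative, the rows of H tend to act as additive parts, which is why this kind of decomposition is meaningful for face images but may not be for arbitrary numerical measurements.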

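Since the abstract singles out the k-local hyperplane distance nearest neighbor (HKNN) classifier, a minimal sketch of its core idea may help: for each class, the query point is compared not to individual training points but to the local hyperplane spanned by its k nearest neighbors within that class, and the class with the closest hyperplane wins. The function names and toy data below are illustrative assumptions, not the authors' code.

```python
import numpy as np

def hyperplane_distance(x, neighbors):
    """Distance from x to the affine hull (local hyperplane) of the neighbor points."""
    nbar = neighbors.mean(axis=0)
    V = (neighbors - nbar).T                 # (d, k) spanning directions
    # Least-squares projection of x onto the local hyperplane.
    alpha, *_ = np.linalg.lstsq(V, x - nbar, rcond=None)
    return np.linalg.norm(x - (nbar + V @ alpha))

def hknn_predict(x, X_train, y_train, k=3):
    """Assign x to the class whose k-nearest-neighbor hyperplane is closest."""
    best_class, best_dist = None, np.inf
    for c in np.unique(y_train):
        Xc = X_train[y_train == c]
        # k nearest neighbors of x within class c
        idx = np.argsort(np.linalg.norm(Xc - x, axis=1))[:k]
        d = hyperplane_distance(x, Xc[idx])
        if d < best_dist:
            best_class, best_dist = c, d
    return best_class
```

With k = 1 this reduces to the ordinary nearest neighbor rule; larger k lets each class "fill in" the space between its sparse training samples, which is one intuition for why HKNN can outperform plain nearest neighbor.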