High Dimensionality Reduction Using CUR Matrix Decomposition and Auto-encoder for Web Image Classification

Reducing the dimensionality of images represented by high-dimensional features plays a significant role in image retrieval and classification. Recently, two methods have been proposed to improve the efficiency and accuracy of dimensionality reduction: one uses CUR matrix decompositions to construct low-rank matrix approximations, and the other trains an auto-encoder with a deep architecture to learn low-dimensional codes. In this paper, after applying these two methods to reduce the high-dimensional features of images, we train individual classifiers on both the original and the reduced feature spaces for image classification. We compare these two approaches with other dimensionality reduction approaches on image classification, and we also study how the depth of the auto-encoder's layers affects the performance of dimensionality reduction.
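
To make the two reduction strategies concrete, the first sketch below illustrates a leverage-score CUR decomposition: columns and rows are sampled with probabilities derived from the top-k singular subspaces, and the linking matrix U is chosen to minimize the Frobenius reconstruction error. This is a minimal sketch, not the paper's exact procedure; the function names, the rank k, and the numbers of sampled columns and rows (c, r) are illustrative assumptions.

```python
# Minimal leverage-score CUR sketch (assumed parameters; not the paper's exact procedure).
import numpy as np

def leverage_scores(A, k, axis):
    """Normalized leverage scores of rows (axis=0) or columns (axis=1) w.r.t. the rank-k SVD."""
    U, _, Vt = np.linalg.svd(A, full_matrices=False)
    basis = U[:, :k] if axis == 0 else Vt[:k, :].T
    scores = np.sum(basis ** 2, axis=1)
    return scores / scores.sum()

def cur_decomposition(A, k=10, c=50, r=50, seed=0):
    """Approximate A ~= C @ U @ R by sampling c columns and r rows via leverage scores."""
    rng = np.random.default_rng(seed)
    cols = rng.choice(A.shape[1], size=c, replace=False, p=leverage_scores(A, k, axis=1))
    rows = rng.choice(A.shape[0], size=r, replace=False, p=leverage_scores(A, k, axis=0))
    C, R = A[:, cols], A[rows, :]
    U = np.linalg.pinv(C) @ A @ np.linalg.pinv(R)   # minimizes ||A - C U R||_F given C and R
    return C, U, R, cols

X = np.random.rand(500, 300)              # stand-in for an n x d matrix of image features
C, U, R, kept = cur_decomposition(X, k=20, c=40, r=40)
X_reduced = X[:, kept]                    # reduced features built from actual columns of X
print(X_reduced.shape, np.linalg.norm(X - C @ U @ R) / np.linalg.norm(X))
```

The second sketch is a small deep auto-encoder that learns low-dimensional codes by minimizing reconstruction error; the layer sizes, code dimension, and training settings are again assumptions for illustration, and the learned codes would be fed to a separate classifier as described above.

```python
# Minimal deep auto-encoder sketch in PyTorch (assumed architecture and hyperparameters).
import torch
import torch.nn as nn

d, code_dim = 300, 32
encoder = nn.Sequential(nn.Linear(d, 128), nn.ReLU(), nn.Linear(128, code_dim))
decoder = nn.Sequential(nn.Linear(code_dim, 128), nn.ReLU(), nn.Linear(128, d))
model = nn.Sequential(encoder, decoder)

X = torch.rand(500, d)                    # stand-in for image feature vectors
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(100):                  # minimize reconstruction error ||X - dec(enc(X))||^2
    optimizer.zero_grad()
    loss = loss_fn(model(X), X)
    loss.backward()
    optimizer.step()

codes = encoder(X).detach()               # low-dimensional codes for a downstream classifier
print(codes.shape)
```

CUR keeps a subset of the original feature columns, so the reduced representation stays interpretable, while the auto-encoder learns a nonlinear mapping whose quality depends on the depth of the network, which is the trade-off the paper examines.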
