When performing visualization and classification, one often confronts the problem of dimensionality reduction. Isomap is one of the most promising nonlinear dimensionality reduction techniques; however, when applied to real-world data it exhibits limitations, such as sensitivity to noise. In this paper, an improved version of Isomap, called S-Isomap, is proposed. S-Isomap uses class information to guide the nonlinear dimensionality reduction; this procedure is known as supervised nonlinear dimensionality reduction. In S-Isomap, the neighborhood graph of the input data is constructed according to a dissimilarity between data points that is specially designed to integrate the class information. This dissimilarity has several desirable properties that help to discover the true neighborhood of the data, making S-Isomap a robust technique for both visualization and classification, especially on real-world problems. In the visualization experiments, S-Isomap is compared with Isomap, LLE, and WeightedIso, and performs the best. In the classification experiments, S-Isomap is used as a preprocessing step for classification and compared with Isomap and WeightedIso, as well as several well-established classification methods: the k-nearest-neighbor classifier, BP neural networks, the J4.8 decision tree, and SVM. The results show that S-Isomap outperforms Isomap and WeightedIso in classification and is highly competitive with those well-known classification methods.
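The core idea, a dissimilarity that shrinks distances within a class and inflates them across classes before the neighborhood graph is built, can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact definition: the exponential form and the parameter names `beta` and `alpha` are assumptions for illustration, and the helper `supervised_dissimilarity` is a hypothetical name.

```python
import numpy as np

def supervised_dissimilarity(X, y, beta=None, alpha=0.5):
    """Illustrative class-aware dissimilarity (sketch, not the exact
    S-Isomap definition): within-class dissimilarities are bounded
    below 1, while between-class dissimilarities grow without bound,
    so k-nearest neighbors tend to come from the same class."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    # Pairwise squared Euclidean distances via broadcasting.
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    if beta is None:
        # Assumed scale parameter: mean squared pairwise distance.
        beta = d2.mean()
    same_class = (y[:, None] == y[None, :])
    D = np.where(
        same_class,
        np.sqrt(1.0 - np.exp(-d2 / beta)),   # in [0, 1) within a class
        np.sqrt(np.exp(d2 / beta)) - alpha,  # unbounded across classes
    )
    return D

# Toy usage: two nearby same-class points and one far different-class point.
X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0]])
y = np.array([0, 0, 1])
D = supervised_dissimilarity(X, y)
```

The resulting matrix `D` would replace plain Euclidean distance when selecting each point's k nearest neighbors; the rest of the Isomap pipeline (graph shortest paths, then classical MDS) is unchanged.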