Integrating the Supervised Information into Unsupervised Learning

This paper presents an assembled unsupervised learning framework that incorporates information produced by a supervised learning process, together with a concrete implementation algorithm. The algorithm has two phases: first, data representatives (DRs) are extracted and clustered to obtain labeled training data; second, the non-DRs are classified on the basis of the labeled DRs. The implementation is called SDSN because it employs the tuning-scaled Support vector domain description to extract DRs, a Spectrum-based method to cluster them, and the Nearest neighbor classifier to label the non-DRs. The validity of the first-phase clustering procedure is analyzed theoretically. In the second phase, a new data-dependent metric is defined so that the nearest neighbor classifier can exploit the supervised information. A fast training approach for DR extraction is also provided to improve efficiency. Experimental results on synthetic and real datasets confirm the correctness and performance of the proposed idea and show that SDSN is more practical than a traditional pure clustering procedure.
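The two-phase idea above can be illustrated with a deliberately simplified, self-contained sketch. This is not the paper's SDSN algorithm: a density rule stands in for the tuning-scaled SVDD extraction, connected components of a radius graph stand in for the spectral clustering of DRs, and plain Euclidean distance stands in for the data-dependent metric. All names and thresholds below are illustrative assumptions.

```python
import math


def two_phase_cluster(points, radius=1.0, density_k=3):
    """Toy two-phase pipeline: (1) pick and cluster data representatives
    (DRs), (2) label every non-DR by its nearest labeled DR.
    Stand-ins: a neighbor-count density rule replaces SVDD extraction,
    connected components of the radius graph replace spectral clustering,
    and Euclidean nearest-neighbor replaces the data-dependent metric."""
    dist = math.dist

    # Phase 1a: extract DRs -- points with at least `density_k` neighbors
    # within `radius` (a crude stand-in for the SVDD-based extraction).
    reps = [p for p in points
            if sum(dist(p, q) <= radius for q in points if q is not p)
            >= density_k]

    # Phase 1b: cluster the DRs via connected components of the radius
    # graph (a stand-in for the spectrum-based clustering step).
    labels, next_label = {}, 0
    for r in reps:
        if r in labels:
            continue
        stack, labels[r] = [r], next_label
        while stack:
            cur = stack.pop()
            for s in reps:
                if s not in labels and dist(cur, s) <= radius:
                    labels[s] = next_label
                    stack.append(s)
        next_label += 1

    # Phase 2: each non-DR inherits the label of its nearest DR
    # (the labeled DRs act as the training set for the classifier).
    out = []
    for p in points:
        if p in labels:
            out.append(labels[p])
        else:
            out.append(labels[min(reps, key=lambda r: dist(p, r))])
    return out
```

On two well-separated groups of points, the DRs form two connected components, and each sparse point is assigned to the group of its nearest representative; `radius` and `density_k` would need tuning per dataset, which is exactly the role the paper's tuning-scaled SVDD and data-dependent metric play in the real algorithm.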
