Rapid Clustering with Semi-Supervised Ensemble Density Centers

Clustering algorithms are typically evaluated by robustness, stability, and accuracy. Most recent studies, however, have not investigated resource usage such as CPU, memory, and execution time in detail, and one unanticipated finding is that several consensus functions consume substantial resources even when producing outcomes for datasets with few features. To address these issues, we propose two new measurement aspects, termed resource usage and rapidity. We further propose a new model, Rapid Clustering with Semi-supervised Ensemble Density Centers, characterized by rapid execution as well as improved accuracy with high purity. The strengths of our model lie in requiring fewer iterations and relying on formulas and built-in functions rather than complex coding procedures; the model then picks objects, together with their features, as cluster centers, and finally applies semi-supervised learning, specifically pairwise constraints, while taking advantage of density calculation. Experimental results demonstrate that our model obtains results rapidly with the best accuracy and purity.
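The two main ingredients the abstract names, density-based center selection and pairwise constraints, can be illustrated with a minimal sketch. This is not the authors' implementation: the density-peaks scoring (Rodriguez & Laio style), the Gaussian cutoff `dc`, and the must-link repair step are all illustrative assumptions.

```python
# Illustrative sketch only (not the paper's method): density-peaks-style
# center selection combined with simple must-link pairwise constraints.
import numpy as np

def pick_density_centers(X, k, dc=1.0):
    """Pick k centers as the points maximizing rho * delta, where rho is a
    local density and delta is the distance to the nearest denser point."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    rho = np.exp(-(d / dc) ** 2).sum(axis=1)      # Gaussian local density
    delta = np.zeros(len(X))
    for i in range(len(X)):
        higher = rho > rho[i]
        delta[i] = d[i, higher].min() if higher.any() else d[i].max()
    return np.argsort(rho * delta)[-k:]           # indices of the k centers

def assign_with_constraints(X, centers, must_link=()):
    """Assign each point to its nearest center, then force must-link
    pairs into the same cluster (a minimal constraint repair)."""
    d = np.linalg.norm(X[:, None, :] - X[centers][None, :, :], axis=2)
    labels = d.argmin(axis=1)
    for a, b in must_link:                        # propagate the pair label
        labels[b] = labels[a]
    return labels

# Toy example: two well-separated Gaussian blobs
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(5, 0.3, (20, 2))])
centers = pick_density_centers(X, k=2)
labels = assign_with_constraints(X, centers, must_link=[(0, 1)])
```

Because center candidates are scored directly from pairwise distances, the selection needs no iterative refinement loop, which is in the spirit of the "fewer iterations" claim above.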
