Co-MLM: A SSL Algorithm Based on the Minimal Learning Machine

Semi-supervised learning is a challenging topic in machine learning that has attracted much attention in recent years. The availability of huge volumes of data and the effort required to label all of these data are two reasons for this interest. Among the various methods for semi-supervised learning, the co-training framework has become popular due to its simple formulation and promising results. In this work, we propose Co-MLM, a semi-supervised learning algorithm built upon the co-training framework and based on a recently proposed supervised method named the Minimal Learning Machine (MLM). Experiments on UCI data sets showed that Co-MLM achieves promising performance compared to other co-training-style algorithms.
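
The paper's own procedure is not reproduced here; the sketch below only illustrates the generic co-training loop that Co-MLM builds upon, with scikit-learn k-NN classifiers standing in for the MLM base learners. The function name, the column-index view split, the confidence threshold, and the number of examples pseudo-labeled per round are illustrative assumptions, not the authors' exact settings.

```python
# Minimal co-training sketch (assumed setup, not the authors' exact Co-MLM).
# k-NN classifiers stand in for the Minimal Learning Machine base learners.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def co_training(X_l, y_l, X_u, view1, view2, rounds=10, per_round=5):
    """X_l, y_l: labeled data; X_u: unlabeled pool; view1, view2: column index lists."""
    L1_X, L1_y = X_l[:, view1], y_l.copy()
    L2_X, L2_y = X_l[:, view2], y_l.copy()
    pool = X_u.copy()

    clf1, clf2 = KNeighborsClassifier(5), KNeighborsClassifier(5)
    for _ in range(rounds):
        if len(pool) == 0:
            break
        clf1.fit(L1_X, L1_y)
        clf2.fit(L2_X, L2_y)

        # Each learner labels the unlabeled pool and keeps its most confident examples.
        p1 = clf1.predict_proba(pool[:, view1])
        p2 = clf2.predict_proba(pool[:, view2])
        top1 = np.argsort(p1.max(axis=1))[-per_round:]
        top2 = np.argsort(p2.max(axis=1))[-per_round:]

        # Confident pseudo-labels from one view augment the other view's training set.
        L2_X = np.vstack([L2_X, pool[top1][:, view2]])
        L2_y = np.concatenate([L2_y, clf1.classes_[p1[top1].argmax(axis=1)]])
        L1_X = np.vstack([L1_X, pool[top2][:, view1]])
        L1_y = np.concatenate([L1_y, clf2.classes_[p2[top2].argmax(axis=1)]])

        # Remove the pseudo-labeled examples from the unlabeled pool.
        pool = np.delete(pool, np.union1d(top1, top2), axis=0)

    # Final fit on the fully augmented labeled sets.
    clf1.fit(L1_X, L1_y)
    clf2.fit(L2_X, L2_y)
    return clf1, clf2
```

In Co-MLM the two base learners are MLMs rather than k-NN classifiers; the loop structure of mutually exchanging confidently pseudo-labeled examples is what the co-training framework prescribes.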
