One-Side Probability Machine: Learning Imbalanced Classifiers Locally and Globally

Imbalanced learning is a challenging task in machine learning, where the data associated with one class are far fewer than those associated with the other class. In this paper, we propose a novel model called the One-Side Probability Machine (OSPM) that is able to learn from imbalanced data rigorously and accurately. In particular, OSPM leads to a rigorous treatment of biased or imbalanced classification tasks, which differs significantly from previous approaches. Importantly, the proposed OSPM exploits reliable global information from one side only, i.e., the majority class, while engaging robust local learning [2] on the other side, i.e., the minority class. This setting proves much more effective than other models such as the Biased Minimax Probability Machine (BMPM). To the best of our knowledge, OSPM is the first model capable of learning from imbalanced data both locally and globally. The proposed model also establishes close connections with well-known models such as BMPM and the Support Vector Machine. One appealing feature is that the optimization problem involved can be cast as a convex second-order cone programming (SOCP) problem with a guaranteed global optimum. A series of experiments on three data sets demonstrates the advantages of our proposed method over four competitive approaches.
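To make the local-global combination concrete, the following is a minimal sketch of how a one-side formulation of this kind could be posed as a convex SOCP: a distribution-free, Chebyshev-style worst-case constraint (as used in the MPM family) encodes the global information of the majority class through its mean and covariance, while hinge-style slack constraints handle each minority point locally. The synthetic data, the accuracy target alpha, and the regularized objective below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np
import cvxpy as cp

# Hypothetical imbalanced data: many majority samples, few minority samples.
rng = np.random.default_rng(0)
X_maj = rng.normal(loc=[2.0, 2.0], scale=1.0, size=(500, 2))   # majority class
X_min = rng.normal(loc=[-1.0, -1.0], scale=1.0, size=(20, 2))  # minority class

# Global side: only the first two moments of the majority class are used.
mu = X_maj.mean(axis=0)
L = np.linalg.cholesky(np.cov(X_maj, rowvar=False))  # L @ L.T = Sigma

alpha = 0.9                              # assumed worst-case accuracy target on the majority side
kappa = np.sqrt(alpha / (1.0 - alpha))   # Chebyshev-style multiplier, as in the MPM family

w = cp.Variable(2)
b = cp.Variable()
xi = cp.Variable(len(X_min), nonneg=True)  # per-point slacks for the minority class

constraints = [
    # Global (majority) side: w^T mu - b >= kappa * sqrt(w^T Sigma w),
    # a second-order cone constraint guaranteeing worst-case accuracy >= alpha.
    w @ mu - b >= kappa * cp.norm(L.T @ w, 2),
    # Local (minority) side: hinge-style constraints on each individual point.
    X_min @ w - b <= -1 + xi,
]

# Illustrative objective: minimize minority misclassification slack plus a
# norm regularizer that pins down the scale of (w, b).
prob = cp.Problem(cp.Minimize(cp.sum(xi) + cp.norm(w, 2)), constraints)
prob.solve()
print("w =", w.value, "b =", b.value)
```

In this sketch, the single cone constraint carries the distribution-free worst-case guarantee on the majority side, while the per-point slacks play the role of a hinge loss on the minority side; both pieces are jointly convex, which is what makes a guaranteed global optimum attainable with a standard SOCP solver.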