Infinite Bayesian Max-Margin Discriminant Projection

In this article, considering supervised dimensionality reduction, we first propose a model, called infinite Bayesian max-margin linear discriminant projection (iMMLDP), that assembles a set of local regions, where we make use of Bayesian nonparametric priors to handle the model selection problem, for example, inferring the underlying number of local regions. In each local region, our model jointly learns a discriminative subspace and the corresponding classifier. Under this framework, iMMLDP combines dimensionality reduction, clustering, and classification in a principled way. Moreover, to deal with more complex data, for example, locally nonlinearly separable structures, we extend the linear projection to the nonlinear case via the kernel trick and develop an infinite kernel max-margin discriminant projection (iKMMDP) model. Thanks to conjugacy, the parameters of both models can be inferred efficiently via Gibbs sampling. Finally, we evaluate our models on synthetic and real-world data, including multimodally distributed datasets and measured radar image data, to validate their efficiency and effectiveness.
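To give a concrete feel for how a Bayesian nonparametric prior sidesteps choosing the number of local regions in advance, the sketch below simulates a Chinese restaurant process (CRP), one standard such prior over partitions. This is an illustrative toy, not the paper's inference procedure: the function name, the concentration parameter `alpha`, and the sampling loop are all assumptions made for the example.

```python
import numpy as np

def sample_crp(n, alpha, rng):
    """Draw a partition of n items from a Chinese restaurant process.

    Each item joins an existing region k with probability proportional to
    that region's current size, or opens a new region with probability
    proportional to alpha. The number of regions is thus inferred from the
    data size rather than fixed a priori (toy illustration only).
    """
    assignments = []   # region label for each item
    counts = []        # current size of each region
    for i in range(n):
        # Unnormalized probabilities: existing region sizes, then alpha
        # for a brand-new region.
        probs = np.array(counts + [alpha], dtype=float)
        probs /= probs.sum()
        k = rng.choice(len(probs), p=probs)
        if k == len(counts):
            counts.append(1)   # open a new region
        else:
            counts[k] += 1     # join an existing region
        assignments.append(k)
    return assignments, len(counts)

rng = np.random.default_rng(0)
z, K = sample_crp(100, alpha=2.0, rng=rng)
print(K)  # number of regions, determined by the process, not preset
```

Larger `alpha` tends to produce more regions; in expectation the region count grows only logarithmically with the number of items, which is what makes such priors a practical model-selection device.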