Using Similarity between Paired Instances to Improve Multiple-Instance Learning via Embedded Instance Selection

Multiple-instance learning (MIL) addresses the classification of sets of instances, called bags, in contrast to the traditional setting of learning from single instances. Recently, several instance selection-based MIL algorithms have been proposed to tackle the MIL problem. Among them, Multiple-Instance Learning via Embedded Instance Selection (MILES) has been the most effective in our experiments. However, MILES treats all instances in the training set as initial instance prototypes, which leads to high complexity in both the feature mapping and the classifier learning. In this paper, we address this issue by exploiting the similarity between paired instances within a bag. The main idea is to choose from each bag the pair of instances with the lowest similarity and to use all such pairs as the initial instance prototypes supplied to MILES, in place of the original set of initial instance prototypes. Evaluation on two benchmark datasets demonstrates that our approach significantly improves the efficiency of MILES while maintaining or even strengthening its effectiveness.
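To make the prototype-selection step concrete, the following is a minimal sketch in Python/NumPy. It assumes that instances are represented as feature vectors, that the similarity between two instances is measured with a Gaussian kernel on their Euclidean distance (the abstract does not specify the exact similarity measure), and that the selected prototypes then feed a MILES-style bag-to-feature mapping; the function names and the parameter sigma are illustrative only.

import numpy as np

def select_prototypes(bags, sigma=1.0):
    # For each bag, pick the pair of instances with the lowest pairwise
    # similarity and add both instances to the prototype pool.
    # Similarity is a Gaussian kernel on Euclidean distance (an assumption;
    # the paper's actual measure may differ).
    prototypes = []
    for bag in bags:                      # bag: (n_i, d) array of instances
        if len(bag) < 2:                  # a single-instance bag contributes itself
            prototypes.append(bag[0])
            continue
        best_pair, lowest_sim = None, np.inf
        for i in range(len(bag)):
            for j in range(i + 1, len(bag)):
                dist2 = np.sum((bag[i] - bag[j]) ** 2)
                sim = np.exp(-dist2 / sigma ** 2)
                if sim < lowest_sim:
                    lowest_sim, best_pair = sim, (i, j)
        prototypes.extend([bag[best_pair[0]], bag[best_pair[1]]])
    return np.asarray(prototypes)

def miles_embedding(bags, prototypes, sigma=1.0):
    # MILES-style feature mapping: each bag is represented by its maximum
    # similarity to every prototype, yielding one feature per prototype.
    features = np.zeros((len(bags), len(prototypes)))
    for b, bag in enumerate(bags):
        for k, proto in enumerate(prototypes):
            dist2 = np.sum((bag - proto) ** 2, axis=1)
            features[b, k] = np.exp(-dist2 / sigma ** 2).max()
    return features

The embedding matrix produced this way can be fed to a standard single-instance classifier (MILES itself uses a 1-norm SVM). Because only two prototypes are retained per bag, the embedding dimension drops from the total number of training instances to roughly twice the number of bags, which is where the efficiency gain over standard MILES comes from.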
