Ranking-oriented nearest-neighbor based method for automatic image annotation

Automatic image annotation plays a critical role in keyword-based image retrieval systems. Recently, nearest-neighbor based schemes have been proposed and have achieved good performance for image annotation. Given a new image, such a scheme first finds its most similar neighbors among labeled images and then propagates the keywords associated with those neighbors to the new image. Many studies have focused on designing a suitable distance metric between images so that all labeled images can be ranked by their distance to the given image. However, higher accuracy in distance prediction does not necessarily lead to a better ordering of labeled images. In this paper, we propose a ranking-oriented neighbor search mechanism that ranks labeled images directly, without going through the intermediate step of distance prediction. In particular, a new learning-to-rank algorithm is developed, which exploits the implicit preference information of labeled images and emphasizes the accuracy of the top-ranked results. Experiments on two benchmark datasets demonstrate the effectiveness of our approach for image annotation.
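The baseline scheme the abstract describes, ranking labeled images by a distance metric and propagating keywords by neighbor voting, can be sketched as follows. This is a generic illustration with toy data, not the paper's ranking-oriented method (which would replace the distance-based ordering with a learned ranking); the feature vectors, keyword labels, and parameter choices are hypothetical.

```python
import numpy as np

def annotate(query, features, labels, k=3, n_keywords=2):
    """Distance-based neighbor voting (the baseline, not the paper's method):
    rank labeled images by Euclidean distance to the query image, then let
    the top-k neighbors vote for the keywords to propagate."""
    # Rank all labeled images by their distance to the query image.
    dists = np.linalg.norm(features - query, axis=1)
    top_k = np.argsort(dists)[:k]  # indices of the k nearest labeled images

    # Each neighbor votes for its associated keywords.
    votes = {}
    for idx in top_k:
        for kw in labels[idx]:
            votes[kw] = votes.get(kw, 0) + 1

    # Propagate the most frequently voted keywords to the query image.
    ranked = sorted(votes.items(), key=lambda item: -item[1])
    return [kw for kw, _ in ranked[:n_keywords]]

# Toy labeled collection: 2-D feature vectors with keyword annotations.
features = np.array([[0.0, 0.0], [0.1, 0.0], [1.0, 1.0], [0.9, 1.1]])
labels = [["sky"], ["sky", "sea"], ["grass"], ["grass", "tree"]]

print(annotate(np.array([0.05, 0.0]), features, labels, k=2))
# → ['sky', 'sea']
```

The paper's observation applies directly here: any monotone error in `dists` that preserves the ordering leaves the annotation unchanged, which is why learning the ranking itself, rather than the distances, is the more natural objective.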
