Keyword propagation for image retrieval

In this paper, a keyword propagation framework is proposed to seamlessly combine keyword and visual representations in image retrieval. In this framework, a set of statistical models is built from the visual features of user-labeled images to represent semantic concepts, and these models are used to propagate keywords to unlabeled images. The models are updated periodically as more images are implicitly labeled by users through relevance feedback. In this sense, the keyword models accumulate and memorize the knowledge learnt from user-provided relevance feedback. To perform relevance feedback, the keyword models are combined with a visual feature-based learning scheme using support vector machines. Experimental results on a large-scale database demonstrate the effectiveness of the proposed framework.
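The propagation step described above can be sketched as follows. This is a minimal illustration only, not the paper's actual method: it assumes hypothetical centroid-based keyword models (one mean feature vector per keyword) in place of the paper's unspecified statistical models, and a simple distance threshold in place of a learned decision rule. The SVM-based relevance feedback component is omitted.

```python
import math

def train_keyword_models(labeled):
    """Build one centroid per keyword from (feature_vector, keywords)
    pairs of user-labeled images.  A hypothetical stand-in for the
    paper's statistical concept models."""
    sums, counts = {}, {}
    for features, keywords in labeled:
        for kw in keywords:
            acc = sums.setdefault(kw, [0.0] * len(features))
            for i, v in enumerate(features):
                acc[i] += v
            counts[kw] = counts.get(kw, 0) + 1
    return {kw: [s / counts[kw] for s in acc] for kw, acc in sums.items()}

def propagate_keywords(models, features, threshold):
    """Propagate to an unlabeled image every keyword whose model
    centroid lies within `threshold` Euclidean distance of the
    image's visual features."""
    return sorted(
        kw for kw, centroid in models.items()
        if math.dist(centroid, features) <= threshold
    )

# Example: two labeled concepts, then propagation to a new image.
labeled = [
    ([0.0, 0.0], ["beach"]),
    ([0.2, 0.0], ["beach"]),
    ([5.0, 5.0], ["city"]),
]
models = train_keyword_models(labeled)
print(propagate_keywords(models, [0.1, 0.1], threshold=0.5))  # ['beach']
```

As new relevance-feedback labels arrive, `train_keyword_models` would simply be re-run on the enlarged labeled set, which is the periodic model update the abstract refers to.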