Knowledge Transfer for Image Classification

Traditional image classification approaches focus on learning an effective classification model from a large amount of target data alone, without considering auxiliary data. If knowledge from auxiliary data could be successfully transferred to the target data, the performance of the model would improve. In recent years, transfer learning has emerged to address this problem. Building on transfer learning, we present a knowledge transfer method to enhance image classification performance. Since the target data are limited to images only, we employ an auxiliary dataset to construct a pseudo text description for each target image. By exploiting the semantic structure of this pseudo text data, the visual features are mapped into a semantic space that respects the text structure. Experiments show that the proposed approach is feasible.
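As a rough illustration of this pipeline, the sketch below follows one plausible reading of the two steps described above: (1) build a pseudo text embedding for each target image by retrieving the most visually similar auxiliary image and borrowing its paired text embedding, and (2) fit a linear mapping from visual features into the text (semantic) space. All variable names, dimensions, the nearest-neighbour retrieval, and the ridge-regression mapping are assumptions for the sketch, not the paper's exact procedure.

```python
import numpy as np

# --- Hypothetical setup: placeholder shapes and random data, not real features ---
rng = np.random.default_rng(0)
d_vis, d_txt = 512, 300                        # visual / text embedding sizes (assumed)
aux_visual = rng.normal(size=(1000, d_vis))    # auxiliary images: visual features
aux_text   = rng.normal(size=(1000, d_txt))    # auxiliary images: paired text embeddings
tgt_visual = rng.normal(size=(200, d_vis))     # target images: visual features only

def cosine_sim(a, b):
    """Row-wise cosine similarity between two feature matrices."""
    a = a / np.linalg.norm(a, axis=1, keepdims=True)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    return a @ b.T

# Step 1: construct pseudo text for each target image by nearest-neighbour
# retrieval in visual space (one plausible choice, not necessarily the paper's).
nearest_aux = cosine_sim(tgt_visual, aux_visual).argmax(axis=1)
pseudo_text = aux_text[nearest_aux]            # pseudo text embedding per target image

# Step 2: learn a linear map W from visual space to the semantic (text) space.
# Closed-form ridge regression is used here as a simple stand-in.
lam = 1e-2                                     # regularisation strength (assumed)
W = np.linalg.solve(
    tgt_visual.T @ tgt_visual + lam * np.eye(d_vis),
    tgt_visual.T @ pseudo_text,
)                                              # shape: (d_vis, d_txt)

# Step 3: project target images into the semantic space; a downstream
# classifier (e.g. nearest class prototype) can then operate on these projections.
tgt_semantic = tgt_visual @ W
print(tgt_semantic.shape)                      # (200, 300)
```

The linear ridge mapping is only the simplest option; a CCA-style projection or a small neural network could replace Step 2 without changing the overall idea of aligning visual features with the pseudo text structure.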