Heterogeneous Transfer Learning for Image Classification


  • Yin Zhu Hong Kong University of Science and Technology
  • Yuqiang Chen Shanghai Jiao Tong University
  • Zhongqi Lu Hong Kong University of Science and Technology
  • Sinno Pan Institute for Infocomm Research
  • Gui-Rong Xue Shanghai Jiao Tong University
  • Yong Yu Shanghai Jiao Tong University
  • Qiang Yang Hong Kong University of Science and Technology

Transfer learning, as a new machine learning paradigm, has gained increasing attention recently. When the training data in a target domain are insufficient to learn predictive models effectively, transfer learning leverages auxiliary data from other, related source domains. While most existing work in this area focuses on source data with the same structure as the target data, in this paper we push this boundary further by proposing a heterogeneous transfer learning framework for knowledge transfer between text and images. We observe that, for a target-domain image classification problem, annotated images can be found on many social Web sites, and these can serve as a bridge for transferring knowledge from the abundant text documents available on the Web. A key question is how to transfer knowledge from the source data effectively even though the auxiliary text may be arbitrarily collected. Our solution is to enrich the representation of the target images with semantic concepts extracted from the auxiliary source data through a novel matrix factorization method. Using the latent semantic features generated from the auxiliary data, we are able to build a better, integrated image classifier. We empirically demonstrate the effectiveness of our algorithm on the Caltech-256 image dataset.
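To make the idea concrete, the following is a minimal illustrative sketch (not the paper's exact algorithm) of how annotated images can bridge text and images: an image-tag matrix G and a document-tag matrix F are jointly factorized with a shared tag factor V, so the learned image factor U provides latent semantic features that can enrich the target image representation. All variable names (G, F, U, V, W), the plain gradient-descent updates, and the dimensions are assumptions for illustration only.

```python
import numpy as np

def joint_factorize(G, F, k=3, iters=300, lr=0.01, seed=0):
    """Jointly factorize an image-tag matrix G (n_img x n_tag) and a
    document-tag matrix F (n_doc x n_tag) with a shared tag factor V,
    so that G ~ U @ V.T and F ~ W @ V.T.

    Returns the latent image features U and the per-iteration losses.
    This is an illustrative sketch, not the method proposed in the paper.
    """
    rng = np.random.default_rng(seed)
    n_img, n_tag = G.shape
    n_doc = F.shape[0]
    # small random initialization of all three factor matrices
    U = 0.1 * rng.standard_normal((n_img, k))
    W = 0.1 * rng.standard_normal((n_doc, k))
    V = 0.1 * rng.standard_normal((n_tag, k))
    losses = []
    for _ in range(iters):
        Rg = U @ V.T - G   # residual on the image-tag matrix
        Rf = W @ V.T - F   # residual on the document-tag matrix
        # gradient steps on the shared squared-error objective
        # ||G - U V^T||^2 + ||F - W V^T||^2
        U -= lr * (Rg @ V)
        W -= lr * (Rf @ V)
        V -= lr * (Rg.T @ U + Rf.T @ W)
        losses.append(float(np.sum(Rg**2) + np.sum(Rf**2)))
    return U, losses

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    G = rng.random((6, 8))    # 6 annotated images, 8 tags
    F = rng.random((10, 8))   # 10 text documents, same 8 tags
    U, losses = joint_factorize(G, F, k=3)
    # U can now be concatenated with the original low-level image
    # features to form an enriched representation for the classifier.
    print(U.shape, losses[0] > losses[-1])
```

Because the tag factor V is shared across both factorizations, the semantics carried by the text documents influence the latent image features U, which is the intuition behind enriching the image representation with auxiliary text.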

How to Cite

Zhu, Y., Chen, Y., Lu, Z., Pan, S., Xue, G.-R., Yu, Y., & Yang, Q. (2011). Heterogeneous Transfer Learning for Image Classification. Proceedings of the AAAI Conference on Artificial Intelligence, 25(1), 1304-1309. https://doi.org/10.1609/aaai.v25i1.8090