Cross-Domain Metric Learning Based on Information Theory

Authors

  • Hao Wang, Chinese Academy of Sciences
  • Wei Wang, Chinese Academy of Sciences
  • Chen Zhang, Chinese Academy of Sciences
  • Fanjiang Xu, Chinese Academy of Sciences

DOI:

https://doi.org/10.1609/aaai.v28i1.8982

Keywords:

metric learning, transfer learning, relative entropy

Abstract

Supervised metric learning plays a substantial role in statistical classification. Conventional metric learning algorithms have limited utility when the training data and testing data are drawn from related but different domains (i.e., source domain and target domain). Although feature-based transfer learning has made some progress on this issue, most work in the area suffers from non-trivial optimization and pays little attention to preserving discriminative information. In this paper, we propose a novel metric learning algorithm to transfer knowledge from the source domain to the target domain in an information-theoretic setting, where a shared Mahalanobis distance across the two domains is learnt by combining three goals: 1) reducing the distribution difference between the domains; 2) preserving the geometry of the target domain data; 3) aligning the geometry of the source domain data with its label information. Through this combination, the learnt Mahalanobis distance effectively transfers discriminative power and propagates standard classifiers across the two domains. More importantly, our proposed method has a closed-form solution and can be efficiently optimized. Experiments on two real-world applications demonstrate the effectiveness of our proposed method.
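The abstract centers on a shared Mahalanobis distance. As a minimal illustration only (not the paper's objective or optimization), the sketch below shows the standard parameterization d_M(x, y) = (x − y)ᵀ M (x − y) with a symmetric positive semidefinite matrix M, and how M = AᵀA makes the metric equivalent to Euclidean distance in a linearly transformed space; the function names are hypothetical:

```python
import numpy as np

def mahalanobis_sq(x, y, M):
    """Squared Mahalanobis distance d_M(x, y) = (x - y)^T M (x - y).

    M must be symmetric positive semidefinite so the distance is
    non-negative for all inputs. With M = I this reduces to the
    squared Euclidean distance.
    """
    d = np.asarray(x) - np.asarray(y)
    return float(d @ M @ d)

# M = I: the metric is just squared Euclidean distance.
x = np.array([1.0, 2.0])
y = np.array([3.0, 1.0])
print(mahalanobis_sq(x, y, np.eye(2)))  # 5.0

# Factorizing M = A^T A shows the metric is Euclidean distance
# after the linear map A, which is how a learnt M can be applied
# uniformly to both source- and target-domain samples.
A = np.array([[2.0, 0.0],
              [1.0, 1.0]])
M = A.T @ A
lhs = mahalanobis_sq(x, y, M)
rhs = float(np.sum((A @ x - A @ y) ** 2))
print(np.isclose(lhs, rhs))  # True
```

Once such an M is learnt from both domains, any off-the-shelf distance-based classifier (e.g., k-nearest neighbours) can use it unchanged on target-domain data, which is the sense in which a shared metric "propagates standard classifiers" across domains.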

Published

2014-06-21

How to Cite

Wang, H., Wang, W., Zhang, C., & Xu, F. (2014). Cross-Domain Metric Learning Based on Information Theory. Proceedings of the AAAI Conference on Artificial Intelligence, 28(1). https://doi.org/10.1609/aaai.v28i1.8982

Section

Main Track: Novel Machine Learning Algorithms