Fast Generalized Distillation for Semi-Supervised Domain Adaptation

Authors

  • Shuang Ao, Western University
  • Xiang Li, Western University
  • Charles Ling, Western University

DOI:

https://doi.org/10.1609/aaai.v31i1.10848

Keywords:

Domain Adaptation, Generalized Distillation

Abstract

Semi-supervised domain adaptation (SDA) is a common setting when facing domain adaptation problems in real applications. How to effectively utilize the unlabeled data is an important issue in SDA. Previous work requires access to the source data to measure the data distribution mismatch, which is impractical when the source data is relatively large. In this paper, we propose a new paradigm, called Generalized Distillation Semi-supervised Domain Adaptation (GDSDA). We show that, without accessing the source data, GDSDA can effectively utilize the unlabeled data to transfer knowledge from the source models. We then propose GDSDA-SVM, which uses SVM as the base classifier and can efficiently solve the SDA problem. Experimental results show that GDSDA-SVM can effectively utilize the unlabeled data to transfer knowledge between different domains under the SDA setting.
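The core idea described above, transferring knowledge from a source model via distillation without touching the source data, can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' method: it uses a logistic-regression student instead of the paper's SVM, a random linear "teacher" standing in for a pretrained source model, and hypothetical names throughout (`T` for the softmax temperature, `lam` for the imitation weight blending hard labels on the few labeled target points with the teacher's soft labels on unlabeled target points).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy target-domain data (hypothetical): a few labeled points, many unlabeled.
X_lab = rng.normal(size=(10, 5))
y_lab = rng.integers(0, 2, size=10)
X_unl = rng.normal(size=(200, 5))

# Black-box "source model": only its outputs are needed, never the source data.
W_src = rng.normal(size=(5, 2))

def softmax(z, T=1.0):
    z = z / T
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Soft labels from the source model at temperature T > 1 (softer targets).
T = 2.0
soft_unl = softmax(X_unl @ W_src, T)

# Student training: cross-entropy on labeled target data, blended with
# imitation of the teacher's soft labels on unlabeled target data.
lam = 0.5  # imitation weight (hyperparameter, assumed here)
W = np.zeros((5, 2))
Y_lab = np.eye(2)[y_lab]
for _ in range(500):
    p_lab = softmax(X_lab @ W)
    p_unl = softmax(X_unl @ W)
    grad = ((1 - lam) * X_lab.T @ (p_lab - Y_lab) / len(X_lab)
            + lam * X_unl.T @ (p_unl - soft_unl) / len(X_unl))
    W -= 0.5 * grad

preds = softmax(X_unl @ W).argmax(axis=1)
```

Because the teacher contributes only its predicted distributions, the source data itself never enters the loop, which is what makes this style of transfer fast when the source set is large.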

Published

2017-02-13

How to Cite

Ao, S., Li, X., & Ling, C. (2017). Fast Generalized Distillation for Semi-Supervised Domain Adaptation. Proceedings of the AAAI Conference on Artificial Intelligence, 31(1). https://doi.org/10.1609/aaai.v31i1.10848