High-Level Semantic Feature Matters Few-Shot Unsupervised Domain Adaptation
DOI: https://doi.org/10.1609/aaai.v37i9.26306
Keywords: ML: Transfer, Domain Adaptation, Multi-Task Learning; ML: Meta Learning
Abstract
In few-shot unsupervised domain adaptation (FS-UDA), most existing methods follow few-shot learning (FSL) approaches and leverage low-level local features (learned by conventional convolutional models, e.g., ResNet) for classification. However, the goals of FS-UDA and FSL are related yet distinct, since FS-UDA aims to classify samples in the target domain rather than the source domain. We find that local features are insufficient for FS-UDA: they can introduce noise or bias into classification and cannot be used to effectively align the two domains. To address these issues, we aim to refine the local features to be more discriminative and relevant to classification. Specifically, we propose a novel task-specific semantic feature learning method (TSECS) for FS-UDA. TSECS learns high-level semantic features for image-to-class similarity measurement. Based on these high-level features, we design a cross-domain self-training strategy that leverages the few labeled samples in the source domain to build a classifier for the target domain. In addition, we minimize the KL divergence between the high-level feature distributions of the source and target domains to shorten the distance between samples from the two domains. Extensive experiments on DomainNet show that the proposed method outperforms SOTA FS-UDA methods by a large margin (i.e., ~10%).
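The abstract's domain-alignment idea can be illustrated with a minimal NumPy sketch: compute image-to-class similarity scores for a source batch and a target batch, turn them into class distributions with a softmax, and penalize the KL divergence between the two. All names, shapes, and the random scores below are hypothetical illustrations, not the paper's actual TSECS implementation.

```python
import numpy as np

def softmax(scores, axis=-1):
    """Convert raw image-to-class similarity scores into distributions."""
    e = np.exp(scores - scores.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def kl_divergence(p, q, eps=1e-8):
    """Mean KL(p || q) over a batch of class distributions."""
    p = np.clip(p, eps, 1.0)
    q = np.clip(q, eps, 1.0)
    return float(np.mean(np.sum(p * np.log(p / q), axis=-1)))

# Hypothetical batches: 4 samples per domain, 5 classes.
rng = np.random.default_rng(0)
src_scores = rng.normal(size=(4, 5))  # source-domain similarity scores
tgt_scores = rng.normal(size=(4, 5))  # target-domain similarity scores

# Alignment loss: small when the two domains' class distributions agree.
align_loss = kl_divergence(softmax(src_scores), softmax(tgt_scores))
```

In a training loop this loss term would be added to the classification objective, so that minimizing it pulls the target-domain feature distribution toward the source-domain one.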
Published
2023-06-26
How to Cite
Yu, L., Yang, W., Huang, S., Wang, L., & Yang, M. (2023). High-Level Semantic Feature Matters Few-Shot Unsupervised Domain Adaptation. Proceedings of the AAAI Conference on Artificial Intelligence, 37(9), 11025-11033. https://doi.org/10.1609/aaai.v37i9.26306
Section
AAAI Technical Track on Machine Learning IV