Mutual Nearest Neighbor Contrast and Hybrid Prototype Self-Training for Universal Domain Adaptation


  • Liang Chen Peking University
  • Qianjin Du Huawei
  • Yihang Lou Huawei
  • Jianzhong He Huawei
  • Tao Bai Huawei
  • Minghua Deng Peking University



Machine Learning (ML), Computer Vision (CV)


Universal domain adaptation (UniDA) aims to transfer knowledge learned from a labeled source domain to an unlabeled target domain under both domain shift and category shift. Without prior knowledge of the category overlap, it is challenging to simultaneously align the common categories between the two domains and separate their respective private categories. Additionally, previous studies use the source classifier's predictions to assign each target sample either a known label or a single generic "unknown" label. However, over-reliance on the learned classifier is inevitably biased toward the source data and ignores the intrinsic structure of the target domain. Therefore, in this paper, we propose a novel two-stage UniDA framework called MATHS based on the principle of mutual nearest neighbor contrast and hybrid prototype discrimination. In the first stage, we design an efficient mutual nearest neighbor contrastive learning scheme to achieve feature alignment, which exploits instance-level affinity relationships to uncover the intrinsic structure of the two domains. We introduce a bimodality hypothesis on the distribution of maximum discriminative probabilities to detect possible target private samples, and present a data-driven statistical approach to separate the common and private categories. In the second stage, to obtain more reliable label predictions, we propose an incremental pseudo-classifier for target data only, driven by hybrid representative prototypes. A confidence-guided prototype contrastive loss is designed to reduce category-allocation uncertainty via a self-training mechanism. Extensive experiments on three benchmarks demonstrate that MATHS outperforms previous state-of-the-art methods on most UniDA settings.
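The mutual nearest neighbor relation at the heart of the first stage can be illustrated with a minimal sketch: two samples from different domains form a pair only if each is the other's nearest neighbor in feature space. The function below is an assumed illustration using cosine similarity on NumPy arrays, not the paper's actual training pipeline; the array names `src` and `tgt` are hypothetical.

```python
import numpy as np

def mutual_nearest_pairs(src, tgt):
    """Return (i, j) pairs where src[i] and tgt[j] are each other's
    nearest neighbor under cosine similarity (a common affinity choice;
    the paper's exact similarity measure is not specified here)."""
    # L2-normalize rows so the dot product equals cosine similarity.
    src = src / np.linalg.norm(src, axis=1, keepdims=True)
    tgt = tgt / np.linalg.norm(tgt, axis=1, keepdims=True)
    sim = src @ tgt.T                 # (n_src, n_tgt) similarity matrix
    nn_of_src = sim.argmax(axis=1)    # nearest target for each source sample
    nn_of_tgt = sim.argmax(axis=0)    # nearest source for each target sample
    # Keep only mutual matches: i -> j and j -> i.
    return [(i, j) for i, j in enumerate(nn_of_src) if nn_of_tgt[j] == i]
```

Samples that fail this mutuality test would be excluded from the positive-pair set, which is one way such an instance-level affinity criterion filters out unreliable cross-domain matches.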




How to Cite

Chen, L., Du, Q., Lou, Y., He, J., Bai, T., & Deng, M. (2022). Mutual Nearest Neighbor Contrast and Hybrid Prototype Self-Training for Universal Domain Adaptation. Proceedings of the AAAI Conference on Artificial Intelligence, 36(6), 6248-6257.



AAAI Technical Track on Machine Learning I