Learning to Transfer with von Neumann Conditional Divergence

Authors

  • Ammar Shaker, NEC Laboratories Europe
  • Shujian Yu, UiT - The Arctic University of Norway; Xi'an Jiaotong University
  • Daniel Oñoro-Rubio, NEC Laboratories Europe

DOI:

https://doi.org/10.1609/aaai.v36i8.20797

Keywords:

Machine Learning (ML)

Abstract

The similarity of feature representations plays a pivotal role in the success of domain adaptation. Feature similarity includes both the invariance of marginal distributions and the closeness of conditional distributions given the desired response y (e.g., class labels). Unfortunately, traditional methods often learn such features without fully taking the information in y into account, which in turn may lead to a mismatch of the conditional distributions or a mixing of the discriminative structures underlying the data distributions. In this work, we introduce the recently proposed von Neumann conditional divergence to improve transferability across multiple domains. We show that this divergence is differentiable and well suited to quantifying the functional dependence between features and y. Given multiple source tasks, we integrate this divergence to capture the discriminative information in y and design novel learning objectives for the cases in which the source tasks are observed either simultaneously or sequentially. In both scenarios, we achieve favorable performance against state-of-the-art methods, with smaller generalization error on new tasks and less catastrophic forgetting on the source tasks (in the sequential setup).
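As a rough illustration of the kind of quantity the abstract refers to, the sketch below computes the (unconditional) von Neumann matrix divergence D(A‖B) = tr(A log A − A log B − A + B) between two symmetric positive-definite matrices built from feature batches, showing that it is differentiable and hence usable as a training loss term. This is only a minimal sketch under stated assumptions: all tensor names are hypothetical placeholders, and the paper's conditional construction involving y and its full learning objectives are not reproduced here.

```python
import torch

def spd_logm(M: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Matrix logarithm of a symmetric positive-definite matrix via eigendecomposition."""
    evals, evecs = torch.linalg.eigh(M)
    return evecs @ torch.diag(torch.log(evals.clamp_min(eps))) @ evecs.T

def von_neumann_divergence(A: torch.Tensor, B: torch.Tensor) -> torch.Tensor:
    """D_vN(A || B) = tr(A log A - A log B - A + B) for SPD matrices A and B."""
    return torch.trace(A @ spd_logm(A) - A @ spd_logm(B) - A + B)

# Hypothetical usage: covariance-style matrices from two feature batches
# (placeholder data; not the paper's conditional variant).
X_s = torch.randn(128, 16)  # source-domain feature batch
X_t = torch.randn(128, 16)  # target-domain feature batch
A = X_s.T @ X_s / X_s.shape[0] + 1e-3 * torch.eye(16)  # regularized to stay SPD
B = X_t.T @ X_t / X_t.shape[0] + 1e-3 * torch.eye(16)
loss = von_neumann_divergence(A, B)  # differentiable, so it can enter a learning objective
```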

Published

2022-06-28

How to Cite

Shaker, A., Yu, S., & Oñoro-Rubio, D. (2022). Learning to Transfer with von Neumann Conditional Divergence. Proceedings of the AAAI Conference on Artificial Intelligence, 36(8), 8231-8239. https://doi.org/10.1609/aaai.v36i8.20797

Section

AAAI Technical Track on Machine Learning III