Enhancing Evolving Domain Generalization through Dynamic Latent Representations

Authors

  • Binghui Xie, The Chinese University of Hong Kong
  • Yongqiang Chen, The Chinese University of Hong Kong
  • Jiaqi Wang, The Chinese University of Hong Kong
  • Kaiwen Zhou, The Chinese University of Hong Kong
  • Bo Han, Hong Kong Baptist University
  • Wei Meng, The Chinese University of Hong Kong
  • James Cheng, The Chinese University of Hong Kong

DOI:

https://doi.org/10.1609/aaai.v38i14.29536

Keywords:

ML: Transfer, Domain Adaptation, Multi-Task Learning, CV: Representation Learning for Vision, DMKM: Mining of Spatial, Temporal or Spatio-Temporal Data, KRR: Applications, ML: Classification and Regression, ML: Evolutionary Learning, ML: Representation Learning

Abstract

Domain generalization is a critical challenge for machine learning systems. Prior domain generalization methods focus on extracting domain-invariant features across several stationary domains so that a model can generalize to new domains. However, in non-stationary tasks where new domains evolve along an underlying continuous structure, such as time, extracting invariant features alone is insufficient for generalizing to the evolving new domains. Moreover, it is non-trivial to learn both evolving and invariant features within a single model because the two objectives conflict. To bridge this gap, we build causal models to characterize the distribution shifts concerning the two patterns, and propose to learn both dynamic and invariant features via a new framework called Mutual Information-Based Sequential Autoencoders (MISTS). MISTS imposes information-theoretic constraints on sequential autoencoders to disentangle the dynamic and invariant features, and leverages an adaptive classifier that makes predictions based on both evolving and invariant information. Our experimental results on both synthetic and real-world datasets demonstrate that MISTS succeeds in capturing both evolving and invariant information, and achieves promising results on evolving domain generalization tasks.
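
To make the architecture described in the abstract more concrete, the following PyTorch sketch illustrates the general idea only; it is not the authors' released implementation. Under simplifying assumptions, the latent code of a sequential autoencoder is split into an invariant part z_s and a dynamic part z_d that a recurrent module propagates across the sequence of evolving domains, a cross-covariance penalty stands in for the paper's mutual-information constraints, and a plain linear head stands in for the adaptive classifier. All class names, dimensions, and the toy data below are illustrative assumptions.

# Minimal sketch (not the authors' code): a sequential autoencoder whose latent
# space is split into an invariant code z_s and a dynamic code z_d, with a
# covariance-based independence penalty as a stand-in for the paper's
# mutual-information constraints, and a classifier conditioned on both codes.
import torch
import torch.nn as nn

class SequentialAutoencoder(nn.Module):
    def __init__(self, x_dim=32, z_inv=8, z_dyn=8, n_classes=2):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(x_dim, 64), nn.ReLU(),
                                     nn.Linear(64, z_inv + z_dyn))
        # A GRU propagates the dynamic code across the sequence of domains.
        self.dynamics = nn.GRU(z_dyn, z_dyn, batch_first=True)
        self.decoder = nn.Sequential(nn.Linear(z_inv + z_dyn, 64), nn.ReLU(),
                                     nn.Linear(64, x_dim))
        self.classifier = nn.Linear(z_inv + z_dyn, n_classes)
        self.z_inv, self.z_dyn = z_inv, z_dyn

    def forward(self, x):                      # x: (batch, n_domains, x_dim)
        z = self.encoder(x)
        z_s, z_d = z.split([self.z_inv, self.z_dyn], dim=-1)
        z_d, _ = self.dynamics(z_d)            # evolve the dynamic code over domains
        x_hat = self.decoder(torch.cat([z_s, z_d], dim=-1))
        logits = self.classifier(torch.cat([z_s, z_d], dim=-1))
        return x_hat, logits, z_s, z_d

def independence_penalty(z_s, z_d):
    # Cross-covariance penalty: a cheap, differentiable proxy for pushing the
    # invariant and dynamic codes towards statistical independence.
    z_s = z_s.reshape(-1, z_s.size(-1))
    z_d = z_d.reshape(-1, z_d.size(-1))
    z_s = z_s - z_s.mean(dim=0)
    z_d = z_d - z_d.mean(dim=0)
    cov = z_s.t() @ z_d / z_s.size(0)
    return cov.pow(2).sum()

# Toy usage: 16 sequences, each observed across 5 evolving domains.
model = SequentialAutoencoder()
x = torch.randn(16, 5, 32)
y = torch.randint(0, 2, (16, 5))
x_hat, logits, z_s, z_d = model(x)
loss = (nn.functional.mse_loss(x_hat, x)
        + nn.functional.cross_entropy(logits.reshape(-1, 2), y.reshape(-1))
        + 0.1 * independence_penalty(z_s, z_d))
loss.backward()

In the actual MISTS framework the disentanglement is enforced with information-theoretic objectives and the classifier adapts to the evolving dynamic code; the covariance penalty and linear head above are only simplifications chosen to keep the sketch short and self-contained.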

Published

2024-03-24

How to Cite

Xie, B., Chen, Y., Wang, J., Zhou, K., Han, B., Meng, W., & Cheng, J. (2024). Enhancing Evolving Domain Generalization through Dynamic Latent Representations. Proceedings of the AAAI Conference on Artificial Intelligence, 38(14), 16040-16048. https://doi.org/10.1609/aaai.v38i14.29536

Issue

Vol. 38 No. 14 (2024)

Section

AAAI Technical Track on Machine Learning V