DecAug: Out-of-Distribution Generalization via Decomposed Feature Representation and Semantic Augmentation

Authors

  • Haoyue Bai, The Hong Kong University of Science and Technology
  • Rui Sun, Huawei Noah's Ark Lab
  • Lanqing Hong, Huawei Noah's Ark Lab
  • Fengwei Zhou, Huawei Noah's Ark Lab
  • Nanyang Ye, Shanghai Jiao Tong University
  • Han-Jia Ye, Nanjing University
  • S.-H. Gary Chan, The Hong Kong University of Science and Technology
  • Zhenguo Li, Huawei Noah's Ark Lab

DOI:

https://doi.org/10.1609/aaai.v35i8.16829

Keywords:

(Deep) Neural Network Algorithms

Abstract

While deep learning demonstrates a strong ability to handle independent and identically distributed (IID) data, it often suffers from poor out-of-distribution (OoD) generalization, where the test data come from a distribution different from the training one. Designing a general OoD generalization framework for a wide range of applications is challenging, mainly because of the different kinds of distribution shifts that occur in the real world, such as shifts across domains or the extrapolation of correlation. Most previous approaches address only one specific type of distribution shift, leading to unsatisfactory performance when applied to diverse OoD benchmarks. In this work, we propose DecAug, a novel decomposed feature representation and semantic augmentation approach for OoD generalization. Specifically, DecAug disentangles category-related and context-related features by orthogonalizing the two gradients (with respect to intermediate features) of the losses for predicting category and context labels, where category-related features contain causal information about the target object, while context-related features cause distribution shifts between training and test data. Furthermore, we perform gradient-based augmentation on the context-related features to improve the robustness of the learned representations. Experimental results show that DecAug outperforms state-of-the-art methods on various OoD datasets, making it one of the very few methods that can handle different types of OoD generalization challenges.
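
To make the abstract's two key ideas concrete, below is a minimal, hedged sketch of (i) the gradient-orthogonality penalty between the category and context losses and (ii) gradient-based semantic augmentation on context-related directions. It is not the authors' implementation: the backbone, the two linear heads (`cat_head`, `ctx_head`), the class/context counts, and the loss weights are illustrative assumptions chosen only to show the mechanism described in the abstract.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Assumed toy architecture: a shared backbone produces an intermediate feature z,
# and two heads predict the category label and the context label, respectively.
backbone = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 128), nn.ReLU())
cat_head = nn.Linear(128, 10)   # category classifier (10 classes, assumed)
ctx_head = nn.Linear(128, 5)    # context classifier (5 contexts, assumed)

def decaug_style_losses(x, y_cat, y_ctx, eps=1.0):
    z = backbone(x)  # shared intermediate feature

    loss_cat = F.cross_entropy(cat_head(z), y_cat)
    loss_ctx = F.cross_entropy(ctx_head(z), y_ctx)

    # Gradients of the two losses w.r.t. the intermediate feature z.
    g_cat = torch.autograd.grad(loss_cat, z, create_graph=True)[0]
    g_ctx = torch.autograd.grad(loss_ctx, z, create_graph=True)[0]

    # Orthogonality penalty: push the per-example category and context
    # gradients toward being orthogonal, decomposing category-related from
    # context-related directions in feature space.
    cos = F.cosine_similarity(g_cat.flatten(1), g_ctx.flatten(1), dim=1)
    loss_orth = (cos ** 2).mean()

    # Gradient-based semantic augmentation (sketch): perturb the feature along
    # the context-gradient direction and require the category prediction to
    # stay correct on the perturbed feature.
    z_aug = z + eps * g_ctx.detach()
    loss_aug = F.cross_entropy(cat_head(z_aug), y_cat)

    return loss_cat, loss_ctx, loss_orth, loss_aug

# Example usage with random data (shapes and loss weights are assumptions):
x = torch.randn(8, 3, 32, 32)
y_cat = torch.randint(0, 10, (8,))
y_ctx = torch.randint(0, 5, (8,))
l_cat, l_ctx, l_orth, l_aug = decaug_style_losses(x, y_cat, y_ctx)
total = l_cat + l_ctx + 0.1 * l_orth + 0.1 * l_aug
total.backward()
```

The sketch keeps only the abstract-level idea: an orthogonality term decorrelates the two gradient directions, and the augmented loss exposes the category head to context-style perturbations; the paper's full objective and branch design should be taken from the publication itself.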

Published

2021-05-18

How to Cite

Bai, H., Sun, R., Hong, L., Zhou, F., Ye, N., Ye, H.-J., Chan, S.-H. G., & Li, Z. (2021). DecAug: Out-of-Distribution Generalization via Decomposed Feature Representation and Semantic Augmentation. Proceedings of the AAAI Conference on Artificial Intelligence, 35(8), 6705-6713. https://doi.org/10.1609/aaai.v35i8.16829

Issue

Vol. 35 No. 8 (2021)

Section

AAAI Technical Track on Machine Learning I