Adaptive Shortcut Debiasing for Online Continual Learning

Authors

  • Doyoung Kim, KAIST, Daejeon, Republic of Korea
  • Dongmin Park, KAIST, Daejeon, Republic of Korea
  • Yooju Shin, KAIST, Daejeon, Republic of Korea
  • Jihwan Bang, KAIST, Daejeon, Republic of Korea
  • Hwanjun Song, KAIST, Daejeon, Republic of Korea
  • Jae-Gil Lee, KAIST, Daejeon, Republic of Korea

DOI:

https://doi.org/10.1609/aaai.v38i12.29211

Keywords:

ML: Life-Long and Continual Learning, ML: Time-Series/Data Streams, ML: Transfer, Domain Adaptation, Multi-Task Learning

Abstract

We propose DropTop, a novel framework that suppresses the shortcut bias in online continual learning (OCL) while adapting to the varying degree of shortcut bias incurred by a continuously changing environment. Motivated by the observed high-attention property of the shortcut bias, highly activated features are treated as candidates for debiasing. More importantly, to overcome the limitation of the online environment, where prior knowledge and auxiliary data are not available, two novel techniques, feature map fusion and adaptive intensity shifting, automatically determine the appropriate level and proportion of the candidate shortcut features to be dropped. Extensive experiments on five benchmark datasets demonstrate that, when combined with various OCL algorithms, DropTop increases average accuracy by up to 10.4% and decreases forgetting by up to 63.2%.
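The core idea sketched in the abstract, suppressing the most highly activated features as candidate shortcut features, can be illustrated with a minimal NumPy example. This is only an illustrative sketch: the function name, the per-channel mean-activation criterion, and the fixed `drop_ratio` are assumptions for exposition, not the paper's exact mechanism (DropTop determines the level and proportion adaptively via feature map fusion and adaptive intensity shifting).

```python
import numpy as np

def drop_top_activated(features, drop_ratio=0.1):
    """Zero out the most highly activated feature channels.

    Illustrative sketch of dropping candidate shortcut features:
    channels with the largest mean activation are masked out.
    `features` is an array of shape (channels, height, width).
    """
    scores = features.mean(axis=(1, 2))        # per-channel activation strength
    k = max(1, int(len(scores) * drop_ratio))  # how many channels to drop
    top = np.argsort(scores)[-k:]              # indices of the top-k channels
    masked = features.copy()
    masked[top] = 0.0                          # suppress candidate shortcut features
    return masked
```

In an OCL setting, the dropped proportion would need to track the changing degree of shortcut bias rather than stay fixed, which is precisely the gap the paper's adaptive techniques address.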

Published

2024-03-24

How to Cite

Kim, D., Park, D., Shin, Y., Bang, J., Song, H., & Lee, J.-G. (2024). Adaptive Shortcut Debiasing for Online Continual Learning. Proceedings of the AAAI Conference on Artificial Intelligence, 38(12), 13122-13131. https://doi.org/10.1609/aaai.v38i12.29211

Section

AAAI Technical Track on Machine Learning III