Learning to Prompt Knowledge Transfer for Open-World Continual Learning
DOI:
https://doi.org/10.1609/aaai.v38i12.29275

Keywords:
ML: Life-Long and Continual Learning, DMKM: Graph Mining, Social Network Analysis & Community, KRR: Knowledge Representation Languages, ML: Multi-instance/Multi-view Learning, ML: Transfer, Domain Adaptation, Multi-Task Learning

Abstract
This paper studies continual learning in an open-world scenario, referred to as Open-world Continual Learning (OwCL). OwCL is of growing importance yet highly challenging in two respects: i) learning a sequence of tasks without forgetting knowns from the past, and ii) identifying unknowns (novel objects/classes) in the future. Existing OwCL methods struggle to adapt task-aware boundaries between knowns and unknowns and lack a mechanism for knowledge transfer. In this work, we propose Pro-KT, a novel prompt-enhanced knowledge transfer model for OwCL. Pro-KT includes two key components: (1) a prompt bank to encode and transfer both task-generic and task-specific knowledge, and (2) a task-aware open-set boundary to identify unknowns in new tasks. Experimental results on two real-world datasets demonstrate that the proposed Pro-KT markedly outperforms state-of-the-art counterparts in both detecting unknowns and classifying knowns. Code is released at https://github.com/YujieLi42/Pro-KT.
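To make the two components concrete, below is a minimal, hypothetical Python sketch of a prompt bank with one task-generic prompt plus per-task prompts, and a threshold-based open-set boundary. All names, shapes, the initialization scheme, and the confidence-threshold scoring rule are illustrative assumptions for exposition, not the authors' actual implementation (see the released code for that).

```python
# Hypothetical sketch: prompt bank + task-aware open-set boundary.
import torch
import torch.nn as nn


class PromptBank(nn.Module):
    """Holds one shared (task-generic) prompt and one prompt per task."""

    def __init__(self, prompt_len: int = 8, embed_dim: int = 768):
        super().__init__()
        self.generic = nn.Parameter(torch.randn(prompt_len, embed_dim) * 0.02)
        self.task_prompts = nn.ParameterList()  # grows as tasks arrive

    def add_task(self):
        # New task-specific prompt, initialized from the generic prompt so
        # knowledge learned so far is carried forward (an assumed transfer rule).
        self.task_prompts.append(nn.Parameter(self.generic.detach().clone()))

    def forward(self, tokens: torch.Tensor, task_id: int) -> torch.Tensor:
        # Prepend the generic and task-specific prompts to the token sequence.
        b = tokens.size(0)
        prompts = torch.cat([self.generic, self.task_prompts[task_id]], dim=0)
        return torch.cat([prompts.unsqueeze(0).expand(b, -1, -1), tokens], dim=1)


def open_set_boundary(logits: torch.Tensor, tau: float = 0.5) -> torch.Tensor:
    """Flag samples whose maximum softmax probability falls below a
    (task-aware) threshold tau as unknowns; returns a boolean mask."""
    return logits.softmax(dim=-1).max(dim=-1).values < tau


if __name__ == "__main__":
    bank = PromptBank()
    bank.add_task()
    tokens = torch.randn(4, 16, 768)       # batch of token embeddings
    prompted = bank(tokens, task_id=0)     # shape (4, 16 + 2*8, 768)
    print(prompted.shape)
    logits = torch.randn(4, 10)            # dummy classifier outputs
    print(open_set_boundary(logits))       # True where the sample is "unknown"
```

In this sketch the threshold tau stands in for the paper's task-aware boundary; in practice such a boundary would be calibrated per task rather than fixed.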
Published
2024-03-24
How to Cite
Li, Y., Yang, X., Wang, H., Wang, X., & Li, T. (2024). Learning to Prompt Knowledge Transfer for Open-World Continual Learning. Proceedings of the AAAI Conference on Artificial Intelligence, 38(12), 13700-13708. https://doi.org/10.1609/aaai.v38i12.29275
Section
AAAI Technical Track on Machine Learning III