Unified Representation Causal Prompt Distillation for Re-Inference-Free Lifelong Person Re-Identification
DOI: https://doi.org/10.1609/aaai.v40i15.38315

Abstract
Lifelong person re-identification (LReID) aims to retrieve a target person from sequentially collected data. Owing to significant domain gaps between datasets and the continual influx of training data from different scenarios, weak inter-domain generalization and catastrophic forgetting remain the major challenges for LReID. To tackle these issues, a novel LReID method called Unified Representation Causal Prompt Distillation (URCPD) is proposed. Specifically, to reduce domain gaps among different scene datasets and improve the model's inter-domain generalization capability, a Feature Decoupling Style Transfer (FDST) module is proposed to map new features into a unified feature space. Furthermore, to reduce the accumulated forgetting of old knowledge during training, a Causal Prompt Distillation (CPD) module is introduced; it eliminates the re-inference process for distillation and embeds memory prompts to combat catastrophic forgetting. Extensive experiments on five classic seen LReID datasets and seven unseen datasets demonstrate that our method significantly outperforms state-of-the-art methods.
Published
2026-03-14
How to Cite
Zhao, J., Luo, J., Zhou, Y., Du, W.-L., Li, X., & Yao, R. (2026). Unified Representation Causal Prompt Distillation for Re-Inference-Free Lifelong Person Re-Identification. Proceedings of the AAAI Conference on Artificial Intelligence, 40(15), 13144–13152. https://doi.org/10.1609/aaai.v40i15.38315
Section
AAAI Technical Track on Computer Vision XII