Dynamic Contrastive Knowledge Distillation for Efficient Image Restoration

Authors

  • Yunshuai Zhou, East China Normal University
  • Junbo Qiao, East China Normal University
  • Jincheng Liao, East China Normal University
  • Wei Li, Huawei Noah's Ark Lab
  • Simiao Li, Huawei Noah's Ark Lab
  • Jiao Xie, East China Normal University
  • Yunhang Shen, Xiamen University
  • Jie Hu, Huawei Noah's Ark Lab
  • Shaohui Lin, East China Normal University; Key Laboratory of Advanced Theory and Application in Statistics and Data Science, MOE

DOI:

https://doi.org/10.1609/aaai.v39i10.33180

Abstract

Knowledge distillation (KD) is a valuable yet challenging approach that enhances a compact student network by learning from a high-performance but cumbersome teacher model. However, previous KD methods for image restoration overlook the state of the student during distillation, adopting a fixed solution space that limits the capability of KD. Moreover, relying solely on an L1-type loss makes it difficult to exploit the distribution information of images. In this work, we propose a novel dynamic contrastive knowledge distillation (DCKD) framework for image restoration. Specifically, we introduce dynamic contrastive regularization, which perceives the student's learning state and dynamically adjusts the distilled solution space via contrastive learning. We also propose a distribution mapping module to extract and align the pixel-level category distributions of the teacher and student models. Note that the proposed DCKD is a structure-agnostic distillation framework: it adapts to different backbones and can be combined with methods that optimize upper-bound constraints to further enhance model performance. Extensive experiments demonstrate that DCKD significantly outperforms state-of-the-art KD methods across various image restoration tasks and backbones.
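The contrastive idea in the abstract (pull the student toward the teacher's output while pushing it away from low-quality references, with the set of negatives adjusted to the student's current state) can be illustrated with a minimal sketch. This is not the paper's implementation; the ratio-form loss, the choice of L1 distances, and the hardest-negative selection rule below are all assumptions for illustration.

```python
import numpy as np

def contrastive_kd_loss(student_out, teacher_out, negatives, eps=1e-8):
    """Illustrative contrastive distillation loss (assumed form).

    Positive term: L1 distance pulling the student's restoration toward
    the teacher's. Negative terms: distances to low-quality references
    (e.g. bicubic upsamples) that the student should move away from.
    Minimizing pos / neg shrinks the former and grows the latter.
    """
    pos = np.mean(np.abs(student_out - teacher_out))
    neg = np.mean([np.mean(np.abs(student_out - n)) for n in negatives])
    return pos / (neg + eps)

def select_negatives(student_out, candidates, keep_ratio=0.5):
    """Illustrative 'dynamic' negative selection (assumption).

    Keeps only the hardest negatives -- those the student is still
    closest to -- so the contrastive solution space tightens as the
    student improves, rather than staying fixed throughout training.
    """
    dists = [np.mean(np.abs(student_out - c)) for c in candidates]
    order = np.argsort(dists)
    k = max(1, int(len(candidates) * keep_ratio))
    return [candidates[i] for i in order[:k]]
```

A student output identical to the teacher's yields a zero loss, while one stuck at a negative sample yields a large loss, so gradient descent on this objective moves the student in the intended direction.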

Published

2025-04-11

How to Cite

Zhou, Y., Qiao, J., Liao, J., Li, W., Li, S., Xie, J., Shen, Y., Hu, J., & Lin, S. (2025). Dynamic Contrastive Knowledge Distillation for Efficient Image Restoration. Proceedings of the AAAI Conference on Artificial Intelligence, 39(10), 10861-10869. https://doi.org/10.1609/aaai.v39i10.33180

Section

AAAI Technical Track on Computer Vision IX