Co-Progression Knowledge Distillation with Knowledge Prototype for Industrial Anomaly Detection

Authors

  • Bokang Yang, Huazhong University of Science and Technology
  • Zhe Zhang, Huazhong University of Science and Technology
  • Jie Ma, Huazhong University of Science and Technology

DOI:

https://doi.org/10.1609/aaai.v39i12.33419

Abstract

Unsupervised anomaly detection has emerged as a powerful technique for identifying abnormal patterns in images without relying on pre-labeled defective samples. Many unsupervised methods build on feature extractors pre-trained on large datasets, and knowledge distillation between a teacher and a student model is among the leading approaches. However, because the teacher and student share similar structures, these methods suffer from excessive specialization and inadequate generalization, which degrade detection performance. In this paper, we introduce a Co-Progression Knowledge Distillation (CPKD) framework that enables bidirectional learning between the teacher and student models. The two models evolve concurrently, fostering mutual improvement and greater adaptability. To maintain stability and prevent over-specialization, we introduce a knowledge prototype that regulates the teacher's learning process. By balancing the acquisition of new knowledge against the preservation of core competencies, our method addresses key challenges in anomaly detection, including insufficient learning and over-adaptation. Experiments demonstrate significant improvements in detection accuracy, achieving state-of-the-art performance on the MVTec dataset.
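The abstract's core dynamic can be illustrated with a toy sketch: the student moves toward the teacher (standard distillation), the teacher also adapts (the "co-progression" direction), and a prototype, maintained here as an exponential moving average, pulls the teacher back to prevent over-specialization. This is a hedged illustration only; the function name, learning rate, regularization weight, and momentum below are illustrative assumptions, not the paper's actual formulation.

```python
import math

def co_progression_step(teacher, student, target, prototype,
                        lr=0.1, lam=0.5, m=0.9):
    """One toy update of bidirectional distillation with a prototype anchor.

    All arguments are feature vectors (lists of floats). Hyperparameters
    (lr, lam, m) are illustrative, not taken from the paper.
    """
    # Student moves toward the teacher's features (usual distillation direction).
    student = [s + lr * (t - s) for s, t in zip(student, teacher)]
    # Teacher also adapts toward a target, but is regularized toward the
    # knowledge prototype so it does not drift into over-specialization.
    teacher = [t - lr * ((t - g) + lam * (t - p))
               for t, g, p in zip(teacher, target, prototype)]
    # Prototype: exponential moving average of the teacher's features.
    prototype = [m * p + (1 - m) * t for p, t in zip(prototype, teacher)]
    return teacher, student, prototype

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Usage: over iterations the teacher-student gap shrinks (mutual progression)
# while the prototype keeps the teacher near its original competence.
t, s = [1.0, 0.0], [0.0, 1.0]
p, tgt = list(t), [0.8, 0.2]
d0 = dist(t, s)
for _ in range(50):
    t, s, p = co_progression_step(t, s, tgt, p)
d1 = dist(t, s)
```

The prototype term is what distinguishes this from plain mutual learning: without it, an adapting teacher could collapse onto whatever the student currently represents.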

Published

2025-04-11

How to Cite

Yang, B., Zhang, Z., & Ma, J. (2025). Co-Progression Knowledge Distillation with Knowledge Prototype for Industrial Anomaly Detection. Proceedings of the AAAI Conference on Artificial Intelligence, 39(12), 13008–13016. https://doi.org/10.1609/aaai.v39i12.33419

Section

AAAI Technical Track on Data Mining & Knowledge Management II