Generating and Reweighting Dense Contrastive Patterns for Unsupervised Anomaly Detection
DOI:
https://doi.org/10.1609/aaai.v38i2.27910
Keywords:
CV: Object Detection & Categorization, CV: Applications, ML: Unsupervised & Self-Supervised Learning
Abstract
Recent unsupervised anomaly detection methods often rely on feature extractors pretrained on auxiliary datasets or on carefully crafted anomaly-simulated samples. However, this reliance can limit their adaptability to a growing range of anomaly detection tasks, owing to the priors embedded in the choice of auxiliary dataset or the anomaly-simulation strategy. To address this challenge, we first introduce a prior-less anomaly generation paradigm and then develop an unsupervised anomaly detection framework, named GRAD, grounded in this paradigm. GRAD comprises three essential components: (1) a diffusion model (PatchDiff) that generates contrastive patterns by preserving the local structures of normal images while disregarding their global structures; (2) a self-supervised reweighting mechanism that handles the long-tailed, unlabeled contrastive patterns generated by PatchDiff; and (3) a lightweight patch-level detector that efficiently distinguishes normal patterns from the reweighted contrastive patterns. The generation results of PatchDiff effectively expose diverse anomaly patterns, e.g., both structural and logical anomalies. Extensive experiments on the MVTec AD and MVTec LOCO datasets support this observation and demonstrate that GRAD achieves competitive anomaly detection accuracy with superior inference speed.
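To make the reweighting idea concrete: when generated contrastive patterns follow a long-tailed distribution, rare patterns can be upweighted in inverse proportion to an estimated density, so the patch-level detector is not dominated by the most common pattern types. The sketch below is a minimal illustration of this principle using a simple Gaussian-kernel density estimate; the function name `reweight_by_density`, the `bandwidth` parameter, and the kernel choice are assumptions for illustration, not the paper's actual self-supervised mechanism.

```python
import numpy as np

def reweight_by_density(features, bandwidth=1.0):
    """Assign larger weights to rare (low-density) generated patterns.

    Density is estimated with a Gaussian kernel over pairwise distances
    (an illustrative stand-in for the paper's self-supervised reweighting).
    Weights are normalized to have mean 1 so the overall loss scale is kept.
    """
    # Pairwise squared Euclidean distances, shape (N, N).
    sq = ((features[:, None, :] - features[None, :, :]) ** 2).sum(-1)
    # Kernel density estimate per sample: dense clusters get high values.
    density = np.exp(-sq / (2.0 * bandwidth ** 2)).mean(axis=1)
    # Inverse-density weights emphasize tail (rare) patterns.
    weights = 1.0 / (density + 1e-8)
    return weights / weights.mean()
```

In a training loop, such weights would multiply the per-patch loss of the generated contrastive samples, so tail patterns contribute as much signal as head patterns.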
Published
2024-03-24
How to Cite
Dai, S., Wu, Y., Li, X., & Xue, X. (2024). Generating and Reweighting Dense Contrastive Patterns for Unsupervised Anomaly Detection. Proceedings of the AAAI Conference on Artificial Intelligence, 38(2), 1454-1462. https://doi.org/10.1609/aaai.v38i2.27910
Section
AAAI Technical Track on Computer Vision I