Informative knowledge distillation for image anomaly segmentation

Unsupervised anomaly segmentation methods based on knowledge distillation have recently been developed and show superior segmentation performance. However, little attention has been paid to the overfitting problem caused by the mismatch between the capacity of the neural network and the amount of knowledge in this scheme. This paper proposes a novel method named Informative Knowledge Distillation (IKD) to address the overfitting problem by increasing the distilled knowledge and offering a stronger supervisory signal. Technically, a novel Context Similarity Loss (CSL) is proposed to capture context information from normal data manifolds. In addition, a novel Adaptive Hard Sample Mining (AHSM) scheme is proposed to focus training on hard samples that carry valuable information. With IKD, informative knowledge is distilled, mitigating the overfitting problem and further improving performance. The proposed method outperforms state-of-the-art methods on several categories of the well-known MVTec AD dataset in terms of AUROC, achieving 97.81% overall across 15 categories. Extensive ablation studies are also conducted to demonstrate the effectiveness of IKD in alleviating the overfitting problem.
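
To make the distillation scheme concrete, below is a minimal PyTorch sketch of student-teacher feature matching for anomaly segmentation, assuming the common setup of a frozen pretrained teacher and a trainable student compared on intermediate feature maps. The context-similarity and hard-sample-mining terms are hypothetical illustrations of the ideas behind CSL and AHSM, not the paper's exact formulations; all function names and the `top_ratio` parameter are assumptions.

```python
import torch
import torch.nn.functional as F


def feature_loss(t_feat, s_feat):
    """Per-pixel cosine distance between teacher and student features."""
    t = F.normalize(t_feat, dim=1)  # (B, C, H, W), unit-norm channels
    s = F.normalize(s_feat, dim=1)
    return (1.0 - (t * s).sum(dim=1)).mean()


def context_similarity_loss(t_feat, s_feat):
    """Hypothetical CSL-style term: match the pairwise similarity
    structure between spatial locations of teacher and student."""
    t = F.normalize(t_feat.flatten(2), dim=1)  # (B, C, HW)
    s = F.normalize(s_feat.flatten(2), dim=1)
    t_sim = torch.bmm(t.transpose(1, 2), t)  # (B, HW, HW) cosine matrix
    s_sim = torch.bmm(s.transpose(1, 2), s)
    return F.mse_loss(s_sim, t_sim)


def hard_mined_loss(t_feat, s_feat, top_ratio=0.1):
    """Hypothetical AHSM-style term: average the per-pixel discrepancy
    over only the hardest (largest-error) fraction of pixels."""
    t = F.normalize(t_feat, dim=1)
    s = F.normalize(s_feat, dim=1)
    err = (1.0 - (t * s).sum(dim=1)).flatten(1)  # (B, HW)
    k = max(1, int(top_ratio * err.shape[1]))
    hard, _ = err.topk(k, dim=1)  # hardest pixels per image
    return hard.mean()


def anomaly_map(t_feat, s_feat, out_size):
    """At test time, the teacher-student discrepancy serves as the
    per-pixel anomaly score, upsampled to the input resolution."""
    t = F.normalize(t_feat, dim=1)
    s = F.normalize(s_feat, dim=1)
    score = 1.0 - (t * s).sum(dim=1, keepdim=True)  # (B, 1, H, W)
    return F.interpolate(score, size=out_size, mode="bilinear",
                         align_corners=False)


if __name__ == "__main__":
    t = torch.randn(2, 64, 16, 16)  # stand-in teacher features
    s = torch.randn(2, 64, 16, 16)  # stand-in student features
    loss = feature_loss(t, s) + context_similarity_loss(t, s) + hard_mined_loss(t, s)
    print(float(loss), anomaly_map(t, s, (256, 256)).shape)
```

Note that the pairwise similarity matrix in `context_similarity_loss` is HW x HW per image, so in practice such a term would typically be computed on downsampled feature maps.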

Results from the Paper


Task               Dataset    Model   Metric Name           Metric Value   Global Rank
Anomaly Detection  MVTec AD   IKD     Segmentation AUROC    97.81          #41
Anomaly Detection  MVTec AD   IKD     Segmentation AUPRO    92.55          #27
