# DistillationAD

A project that gathers state-of-the-art knowledge distillation approaches for unsupervised anomaly detection.
## Description

This project gathers state-of-the-art approaches that use knowledge distillation for unsupervised anomaly detection. The code is designed to be simple and understandable, to make custom modifications easy.
## Getting Started
You will need Python 3.10+ and the packages specified in requirements.txt.
Install the packages with:

```shell
pip install -r requirements.txt
```
## Base usage
### Configuration
To use the project, you must configure the `config.yaml` file. This file lets you set the main elements of the project:
- `data_path` (STR): The path to the dataset
- `distillType` (STR): The type of distillation: `st` for STPM, `rd` for reverse distillation, `ead` for EfficientAD, `dbfad` for distillation-based fabric anomaly detection, `mixed` for MixedTeacher, `rnst`/`rnrd` for remembering normality (forward/backward), `sn` for SingleNet
- `backbone` (STR): The name of the model backbone (any CNN for `st`, only ResNets and wide ResNets for `rd`, `small` or `medium` for `ead`)
- `out_indice` (LIST OF INT): The indices of the layers used for distillation (only for `st`)
- `obj` (STR): The object category
- `phase` (STR): Either `train` or `test`
- `save_path` (STR): The path where model weights are saved
- `training_data` (YAML LIST): Hyperparameters (`epochs`, `batch_size`, `img_size`, `crop_size`, `norm` and other parameters)
An example config for each `distillType` is available in `configs/`.
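For illustration, a minimal `config.yaml` for STPM might look like the following. The keys are those listed above; the values (dataset path, backbone, layer indices, hyperparameters) are illustrative assumptions, not the exact examples shipped in `configs/`:

```yaml
data_path: ./datasets/MVTEC   # illustrative path
distillType: st               # STPM
backbone: resnet18            # any CNN works for st
out_indice: [1, 2, 3]         # layers used for distillation (st only)
obj: carpet
phase: train
save_path: ./weights
training_data:                # hyperparameters (values are examples)
  epochs: 100
  batch_size: 8
  img_size: 256
  crop_size: 256
  norm: true
```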
### Training and testing
Once configured, run the following command to train or test (depending on the `phase` set in the configuration file):

```shell
python3 train.py
```
## Implemented methods
### STPM: Student-Teacher Feature Pyramid Matching for Unsupervised Anomaly Detection
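To give a feel for the student-teacher matching idea behind STPM, here is a minimal NumPy sketch (not the repository's implementation, which uses a PyTorch backbone): at each pyramid level, teacher and student feature maps are L2-normalized along the channel axis, and the loss is the mean per-pixel squared distance between them. At test time the same distance, kept as a map, serves as the anomaly map.

```python
import numpy as np

def stpm_layer_loss(teacher_feat: np.ndarray, student_feat: np.ndarray) -> float:
    """Distance between L2-normalized teacher/student feature maps.

    Both arrays are (C, H, W). Features are normalized along the channel
    axis before computing the per-pixel squared distance, as in the
    STPM feature-matching loss.
    """
    t = teacher_feat / (np.linalg.norm(teacher_feat, axis=0, keepdims=True) + 1e-8)
    s = student_feat / (np.linalg.norm(student_feat, axis=0, keepdims=True) + 1e-8)
    return 0.5 * float(np.mean(np.sum((t - s) ** 2, axis=0)))

def anomaly_map(teacher_feat: np.ndarray, student_feat: np.ndarray) -> np.ndarray:
    """Per-pixel anomaly score at test time: same distance, kept as an (H, W) map."""
    t = teacher_feat / (np.linalg.norm(teacher_feat, axis=0, keepdims=True) + 1e-8)
    s = student_feat / (np.linalg.norm(student_feat, axis=0, keepdims=True) + 1e-8)
    return 0.5 * np.sum((t - s) ** 2, axis=0)
```

In the full method, this loss is summed over several pyramid levels, and the per-level anomaly maps are upsampled to image size and combined.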
<p align="left"> <img width="700" height="350" src="images/STPM.png"> </p>

### Reverse distillation: Anomaly Detection via Reverse Distillation from One-Class Embedding
Article1 and Article2
Code inspiration
### EfficientAD: Accurate Visual Anomaly Detection at Millisecond-Level Latencies

<p align="left"> <img width="700" height="200" src="images/EAD.png"> </p>

### DBFAD: Distillation-based fabric anomaly detection

<p align="left"> <img width="700" height="350" src="images/DistillBased.png"> </p>

### MixedTeacher: Knowledge Distillation for fast inference textural anomaly detection

<p align="left"> <img width="700" height="400" src="images/DualModel.png"> </p>

### Remembering Normality: Memory-guided Knowledge Distillation for Unsupervised Anomaly Detection

<p align="left"> <img width="700" height="350" src="images/RememberingNormality.png"> </p>

### SingleNet: Single-Layer Distillation with Fourier Convolutions for Texture Anomaly Detection
WACV 2025, article not published yet
<p align="left"> <img width="700" height="350" src="images/SingleNet.png"> </p>

## Implemented tools
### SSPCAB: Self-Supervised Predictive Convolutional Attentive Block for Anomaly Detection
<p align="left"> <img width="700" height="350" src="images/SSPCAB.png"> </p>

## License
This project is licensed under the MIT License.
