DistillationAD

A project that brings together state-of-the-art knowledge distillation approaches for unsupervised anomaly detection (GitHub: SimonThomine/DistillationAD).

<p align="center"> <h1><center> &#127981;&#9879; Distillation Industrial Anomaly Detection &#9879;&#127981; </center></h1> </p>

Description

This project brings together state-of-the-art approaches that use knowledge distillation for unsupervised anomaly detection. The code is designed to be simple and readable, making custom modifications easy.

Getting Started

You will need Python 3.10+ and the packages specified in requirements.txt.

Install packages with:

pip install -r requirements.txt

Base usage

Configuration

To use the project, you must first set up the config.yaml file. This file configures the main elements of the project:

  • data_path (STR): The path to the dataset
  • distillType (STR): The type of distillation: st for STPM, rd for reverse distillation, ead for EfficientAD, dbfad for distillation-based fabric anomaly detection, mixed for MixedTeacher, rnst/rnrd for remembering normality (forward/backward), sn for SingleNet
  • backbone (STR): The name of the model backbone (any CNN for st; only ResNets and Wide ResNets for rd; small or medium for ead)
  • out_indice (LIST OF INT): The indices of the layers used for distillation (only for st)
  • obj (STR): The object category
  • phase (STR): Either train or test
  • save_path (STR): The path where the model weights are saved
  • training_data (YAML LIST): Hyperparameters (epochs, batch_size, img_size, crop_size, norm, and other parameters)

An example config for each distillType is available in configs/.
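As an illustration, here is a minimal config.yaml sketch for the st (STPM) setting, using the fields described above. The specific values (paths, backbone name, hyperparameters) are placeholders; check the examples in configs/ for the exact keys and values the project expects.

```yaml
data_path: ./data/mvtec        # path to the dataset (placeholder)
distillType: st                # STPM distillation
backbone: resnet18             # any CNN backbone works for st
out_indice: [1, 2, 3]          # layers used for distillation (st only)
obj: carpet                    # object category (placeholder)
phase: train                   # train or test
save_path: ./weights           # where model weights are saved
training_data:
  epochs: 100
  batch_size: 16
  img_size: 256
  crop_size: 256
```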

Training and testing

Once configured, run the following command to train or test (depending on the configuration file):

python3 train.py

Implemented methods

STPM : Student-Teacher Feature Pyramid Matching for Unsupervised Anomaly Detection

Article
Code inspiration

<p align="left"> <img width="700" height="350" src="images/STPM.png"> </p>
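For intuition, the core of STPM's scoring can be sketched as follows: teacher and student feature maps are L2-normalized along the channel axis, and the anomaly score at each spatial position is half the squared distance between them (equivalently, one minus their cosine similarity). The NumPy sketch below is purely illustrative and is not the repository's implementation, which operates on multi-layer CNN feature pyramids in PyTorch.

```python
import numpy as np

def anomaly_map(teacher_feat, student_feat):
    """Per-position anomaly map from one teacher/student feature pair.

    teacher_feat, student_feat: arrays of shape (C, H, W).
    Features are L2-normalized along the channel axis; the score at
    each position is 0.5 * ||t - s||^2 = 1 - cos(t, s), in [0, 2].
    """
    t = teacher_feat / (np.linalg.norm(teacher_feat, axis=0, keepdims=True) + 1e-8)
    s = student_feat / (np.linalg.norm(student_feat, axis=0, keepdims=True) + 1e-8)
    return 0.5 * np.sum((t - s) ** 2, axis=0)  # shape (H, W)

# Identical features -> zero anomaly everywhere.
rng = np.random.default_rng(0)
f = rng.standard_normal((8, 4, 4))
print(anomaly_map(f, f.copy()).max())  # prints 0.0
```

In STPM proper, such maps are computed at several pyramid levels, upsampled to the input resolution, and combined into a single anomaly map.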

Reverse distillation : Anomaly Detection via Reverse Distillation from One-Class Embedding

Article1 and Article2
Code inspiration

<p align="left"> <img width="700" height="350" src="images/RD.png"> </p>

EfficientAD : Accurate Visual Anomaly Detection at Millisecond-Level Latencies

Article
Code inspiration

<p align="left"> <img width="700" height="200" src="images/EAD.png"> </p>

DBFAD : Distillation-based fabric anomaly detection

Article
Code inspiration

<p align="left"> <img width="700" height="350" src="images/DistillBased.png"> </p>

MixedTeacher : Knowledge Distillation for fast inference textural anomaly detection

Article
Code inspiration

<p align="left"> <img width="700" height="400" src="images/DualModel.png"> </p>

Remembering Normality: Memory-guided Knowledge Distillation for Unsupervised Anomaly Detection

Article
Code inspiration

<p align="left"> <img width="700" height="350" src="images/RememberingNormality.png"> </p>

SingleNet : Single-Layer Distillation with Fourier Convolutions for Texture Anomaly Detection

WACV 2025, article not published yet

<p align="left"> <img width="700" height="350" src="images/SingleNet.png"> </p>

Implemented tools

SSPCAB : Self-Supervised Predictive Convolutional Attentive Block for Anomaly Detection

Article
Code inspiration

<p align="left"> <img width="700" height="350" src="images/SSPCAB.png"> </p>

License

This project is licensed under the MIT License.
