DLB

Code for the paper "Self-Distillation from the Last Mini-Batch for Consistency Regularization"

Install / Use

/learn @Meta-knowledge-Lab/DLB
About this skill

Quality Score

0/100

Supported Platforms

Universal

README

Self-Distillation from the Last Mini-Batch (DLB)

This is a PyTorch implementation of "Self-Distillation from the Last Mini-Batch for Consistency Regularization", accepted at CVPR 2022.

The paper is available at https://arxiv.org/abs/2203.16172.

Run dlb.py to train with the proposed self-distillation method.
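The README only points at dlb.py, so here is a minimal NumPy sketch of the loss the paper describes: standard cross-entropy on the current mini-batch plus a temperature-scaled KL term that distills the student toward soft targets saved from the previous iteration (the "last mini-batch"). This is a simplified illustration, not the repo's code: the paper applies distillation only to the samples shared between consecutive mini-batches, while this sketch distills the whole batch, and the names `dlb_loss`, `softmax`, `tau`, and `alpha` are illustrative assumptions.

```python
import numpy as np

def softmax(z, tau=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = z / tau
    z = z - z.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def dlb_loss(logits, labels, last_soft_targets, tau=3.0, alpha=1.0):
    """Cross-entropy on the current batch plus KL distillation toward
    soft targets cached from the previous iteration (simplified DLB-style loss)."""
    n = logits.shape[0]
    p = softmax(logits)
    ce = -np.log(p[np.arange(n), labels] + 1e-12).mean()
    if last_soft_targets is None:  # first iteration: no cached targets yet
        return ce
    q = softmax(logits, tau)        # student distribution at temperature tau
    t = last_soft_targets           # teacher soft targets from iteration t-1
    kl = (t * (np.log(t + 1e-12) - np.log(q + 1e-12))).sum(axis=1).mean()
    return ce + alpha * tau**2 * kl  # tau^2 rescales the distillation gradient
```

In a training loop, after each step one would cache `softmax(logits, tau)` for the samples that reappear in the next mini-batch and pass it in as `last_soft_targets` on the following iteration.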

View on GitHub
GitHub Stars: 44
Category: Development
Updated: 1 month ago
Forks: 4

Languages

Python

Security Score

90/100

Audited on Mar 1, 2026

No findings