Self-Distillation from the Last Mini-Batch (DLB)
This is a PyTorch implementation of "Self-Distillation from the Last Mini-Batch for Consistency Regularization". The paper was accepted by CVPR 2022.
The paper is available at https://arxiv.org/abs/2203.16172.
Run dlb.py to train with the proposed self-distillation method.
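The core idea of DLB is to regularize the current mini-batch with the model's own smoothed predictions on the same samples from the previous iteration. A minimal sketch of such a consistency loss is shown below; the function name `dlb_loss` and the hyperparameters `temperature` and `alpha` are illustrative assumptions, not the exact interface of dlb.py.

```python
import torch
import torch.nn.functional as F

def dlb_loss(logits, targets, last_logits, temperature=3.0, alpha=1.0):
    """Illustrative DLB-style objective (not the repo's exact code):
    cross-entropy on hard labels plus a temperature-scaled KL term that
    distills from predictions made on the same samples last iteration."""
    ce = F.cross_entropy(logits, targets)
    # Soft targets come from the last mini-batch; detach so no gradient
    # flows back through the previous iteration's predictions.
    soft_targets = F.softmax(last_logits.detach() / temperature, dim=1)
    log_probs = F.log_softmax(logits / temperature, dim=1)
    # temperature**2 rescales gradients, as is standard in distillation.
    kd = F.kl_div(log_probs, soft_targets, reduction="batchmean") * temperature ** 2
    return ce + alpha * kd
```

In training, the samples shared between consecutive mini-batches would supply `last_logits`, so the model acts as its own teacher with a one-iteration delay.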