TinyBERT4MIND
TinyBERT + MIcrosoft News Dataset (MIND)
Install / Use
/learn @Jyonn/TinyBERT4MIND

README
Pretrained Language Model
This repository provides the latest pretrained language models and their related optimization techniques developed by Huawei Noah's Ark Lab.
Directory structure
- NEZHA-TensorFlow is a pretrained Chinese language model implemented in TensorFlow that achieves state-of-the-art performance on several Chinese NLP tasks.
- NEZHA-PyTorch is the PyTorch version of NEZHA.
- NEZHA-Gen-TensorFlow is a Chinese GPT-like pretrained language model.
- TinyBERT is a compressed BERT model that is 7.5x smaller and 9.4x faster on inference (see the loading sketch after this list).
- DynaBERT is a dynamic BERT model with adaptive width and depth.
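As a minimal sketch of how a released TinyBERT checkpoint can be used, the snippet below loads it through the Hugging Face `transformers` library; the checkpoint name `huawei-noah/TinyBERT_General_4L_312D` (the 4-layer, 312-dimensional general-distillation model) is an assumption about which published checkpoint you want, not something prescribed by this repository.

```python
# Minimal sketch: load a published TinyBERT checkpoint with Hugging Face transformers.
# The model name below is assumed to be the 4-layer / 312-dim general checkpoint on the Hub.
from transformers import AutoTokenizer, AutoModel

model_name = "huawei-noah/TinyBERT_General_4L_312D"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

# Encode a short news title (as used in MIND) and inspect the contextual embeddings.
inputs = tokenizer("Microsoft releases the MIND news dataset", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, seq_len, 312) for the 312-dim variant
```

The small hidden size (312) and 4 transformer layers are what make TinyBERT attractive as a news-title encoder for MIND-scale recommendation pipelines, where millions of titles must be embedded.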
