
TinyBERT4MIND

TinyBERT + MIND (Microsoft News Dataset)

Install / Use

/learn @Jyonn/TinyBERT4MIND
About this skill

Quality Score: 0/100
Supported Platforms: Universal

README

Pretrained Language Model

This repository provides the latest pretrained language models and related optimization techniques developed by Huawei Noah's Ark Lab.

Directory structure

  • NEZHA-TensorFlow is a pretrained Chinese language model, implemented in TensorFlow, that achieves state-of-the-art performance on several Chinese NLP tasks.
  • NEZHA-PyTorch is the PyTorch version of NEZHA.
  • NEZHA-Gen-TensorFlow is a Chinese GPT-like pretrained language model.
  • TinyBERT is a compressed BERT model that is 7.5x smaller and 9.4x faster at inference.
  • DynaBERT is a dynamic BERT model with adaptive width and depth.
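
TinyBERT is compressed via knowledge distillation: a small student model is trained to mimic the output distribution of a large BERT teacher (alongside layer-wise objectives not shown here). A minimal sketch of the soft-label distillation loss in plain Python; the logits, temperature, and function names are illustrative assumptions, not TinyBERT's actual training code:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """Cross-entropy between the teacher's soft targets and the
    student's predictions; minimised when the distributions match."""
    teacher_probs = softmax(teacher_logits, temperature)
    student_probs = softmax(student_logits, temperature)
    return -sum(t * math.log(s) for t, s in zip(teacher_probs, student_probs))

# Illustrative logits (assumption): the student is trained to
# minimise this loss, pulling its distribution toward the teacher's.
teacher = [3.0, 1.0, 0.2]
student = [2.5, 1.2, 0.3]
loss = distillation_loss(teacher, student)
```

A higher temperature softens both distributions, exposing the teacher's relative preferences among non-top classes, which is where much of the distillation signal comes from.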

GitHub Stars: 6
Category: Development
Updated: 1y ago
Forks: 1

Languages: Python

Security Score: 55/100
Audited on Aug 8, 2024
No findings