ThesoTM thesaurus

Concept information

Preferred term

TinyBERT  

Definition

  • "TinyBERT is 7.5x smaller and 9.4x faster on inference than BERT-base and achieves competitive performances in the tasks of natural language understanding. It performs a novel transformer distillation at both the pre-training and task-specific learning stages." (source: https://github.com/huawei-noah/Pretrained-Language-Model/tree/master/TinyBERT).

Bibliographic citation(s)

  • Jiao, X., Yin, Y., Shang, L., Jiang, X., Chen, X., Li, L., Wang, F., & Liu, Q. (2020). TinyBERT: Distilling BERT for natural language understanding. arXiv:1909.10351 [cs]. http://arxiv.org/abs/1909.10351

has design country

  • China

URI

http://data.loterre.fr/ark:/67375/LTK-LKG67B9D-D
