
ThesoTM (thesaurus)


Concept information

Preferred term

TinyBERT  

Definition

  • "TinyBERT is 7.5 times smaller and 9.4 times faster than BERT-base for inference." (source: https://github.com/huawei-noah/Pretrained-Language-Model/tree/master/TinyBERT).
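The repository cited above publishes TinyBERT checkpoints that can also be loaded through the Hugging Face transformers library. The indented sketch below is a minimal, illustrative way to run inference with such a checkpoint; the model identifier huawei-noah/TinyBERT_General_4L_312D and the example sentence are assumptions for illustration, not part of this thesaurus entry.

    # Minimal sketch: inference with a TinyBERT checkpoint via Hugging Face transformers.
    # The model id below is an assumption (a 4-layer, 312-dimensional general-distillation
    # checkpoint published under the huawei-noah organization); adjust as needed.
    import torch
    from transformers import AutoModel, AutoTokenizer

    model_id = "huawei-noah/TinyBERT_General_4L_312D"  # assumed checkpoint name
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModel.from_pretrained(model_id)
    model.eval()

    # Encode one sentence and run a forward pass without gradient tracking.
    inputs = tokenizer("TinyBERT is a distilled version of BERT.", return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)

    # last_hidden_state has shape (batch, sequence_length, hidden_size).
    print(outputs.last_hidden_state.shape)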

Broader concept

Bibliographic citation(s)

  • Jiao, X., Yin, Y., Shang, L., Jiang, X., Chen, X., Li, L., Wang, F., & Liu, Q. (2020). TinyBERT: Distilling BERT for natural language understanding. arXiv:1909.10351 [cs]. http://arxiv.org/abs/1909.10351

based on

has application field

has design country

  • China

has for input language

implements

is encoded in

is executed in

In other languages

URI

http://data.loterre.fr/ark:/67375/LTK-LKG67B9D-D
