Vocabulary of natural language processing (POC)

Concept information

Preferred term

BERT

Definition

  • BERT is designed to pretrain deep bidirectional representations from unlabeled text by jointly conditioning on both left and right context in all layers (Devlin et al., 2019).
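The key idea in the definition above is joint conditioning on both left and right context when filling in a masked token. The following is a toy, stdlib-only sketch of that idea, not BERT itself: instead of a Transformer, it uses simple co-occurrence counts over a tiny hypothetical corpus to predict a masked word from both of its neighbours.

```python
from collections import Counter, defaultdict

# Toy illustration of bidirectional conditioning (NOT BERT itself):
# predict a masked token from BOTH its left and right neighbours,
# using co-occurrence counts from a tiny made-up corpus.
corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "a cat slept on the mat",
]

# Map (left neighbour, right neighbour) -> counts of the middle token.
context_counts = defaultdict(Counter)
for sentence in corpus:
    tokens = sentence.split()
    for i in range(1, len(tokens) - 1):
        left, middle, right = tokens[i - 1], tokens[i], tokens[i + 1]
        context_counts[(left, right)][middle] += 1

def fill_mask(tokens, mask_index):
    """Return the most frequent token seen between tokens[mask_index - 1]
    and tokens[mask_index + 1], i.e. conditioning jointly on left and
    right context, or None if that context was never observed."""
    key = (tokens[mask_index - 1], tokens[mask_index + 1])
    candidates = context_counts.get(key)
    return candidates.most_common(1)[0][0] if candidates else None

print(fill_mask("a [MASK] slept on the mat".split(), 1))  # -> cat
```

A left-to-right language model could only use "a" here; seeing "slept" to the right as well is what makes the prediction bidirectional. BERT achieves the same effect at scale with masked-token pretraining over deep Transformer layers.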

Synonym(s)

  • Bidirectional Encoder Representations from Transformers

URI

http://data.loterre.fr/ark:/67375/8LP-NBH731S9-G

Last modified 4/26/24