
Vocabulary of natural language processing


Concept information

Preferred term

BioBERT  

Definition

  • "A pre-trained language representation model for the biomedical domain." (Lee et al., 2020, p. 1235)

Broader concept

Examples

  • As a result, the pretraining data for BioBERT also covers the biomedical domain. (Salhofer, Liu & Kern, 2022)
  • From zero-shot to using all the training data, EBM-Net improves only by 26.6% relative F1 (from 47.52% to 60.15%), while BioBERT improves largely by 60.0% relative F1 (from 32.77% to 54.33%). (Jin, Tan, Chen, Liu & Huang, 2020)
  • On investigating the error categories of BioBERT (v1.1) models on the clinical language understanding task, we find that despite having a strong performance, the models still make several mistakes on examples that require medical domain knowledge. (Sushil, Suster & Daelemans, 2021)
  • We see that BioBERT does not take age into account when predicting mortality risk, except for patients over 90. (Van Aken, Herrmann & Löser, 2022)

In other languages

URI

http://data.loterre.fr/ark:/67375/8LP-PJTL80HD-P

Download this concept:

RDF/XML TURTLE JSON-LD

Last modified: 26/4/24