
Vocabulary of natural language processing


Concept information

Preferred term

BioBERT  

Definition

  • "A pre-trained language representation model for the biomedical domain." (Lee et al., 2020, p. 1235)

Broader concept

Examples

  • As a result, the pretraining data for BioBERT also covers the biomedical domain. (Salhofer, Liu & Kern, 2022)
  • From zero-shot to using all the training data, EBM-Net improves by only 26.6% relative F1 (from 47.52% to 60.15%), while BioBERT improves largely, by 60.0% relative F1 (from 32.77% to 54.33%). (Jin, Tan, Chen, Liu & Huang, 2020)
  • On investigating the error categories of BioBERT (v1.1) models on the clinical language understanding task, we find that, despite having a strong performance, the models still make several mistakes on examples that require medical domain knowledge. (Sushil, Suster & Daelemans, 2021)
  • We see that BioBERT does not take age into account when predicting mortality risk, except for patients over 90. (Van Aken, Herrmann & Löser, 2022)

Translations

URI

http://data.loterre.fr/ark:/67375/8LP-PJTL80HD-P


Last modified on 26/04/2024