Concept information
Preferred term
BioBERT
Definition
- "A pre-trained language representation model for the biomedical domain." (Lee et al., 2020, p. 1235)
Broader concept
Example
- As a result, the pretraining data for BioBERT also covers the biomedical domain. (Salhofer, Liu & Kern, 2022)
- From zero-shot to using all the training data, EBM-Net improves only by 26.6% relative F1 (from 47.52% to 60.15%), while BioBERT improves largely by 60.0% relative F1 (from 32.77% to 54.33%). (Jin, Tan, Chen, Liu & Huang, 2020)
- On investigating the error categories of BioBERT (v1.1) models on the clinical language understanding task, we find that, despite having a strong performance, the models still make several mistakes on examples that require medical domain knowledge. (Sushil, Suster & Daelemans, 2021)
- We see that BioBERT does not take age into account when predicting mortality risk except for patients over 90. (Van Aken, Herrmann & Löser, 2022)
In other languages
- French
URI
http://data.loterre.fr/ark:/67375/8LP-PJTL80HD-P
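For illustration, a minimal sketch of loading a BioBERT checkpoint with the Hugging Face transformers library; the model identifier dmis-lab/biobert-v1.1 and the example sentence are assumptions for this sketch, not part of this record.

from transformers import AutoTokenizer, AutoModel

# Load a publicly released BioBERT checkpoint (assumed identifier).
tokenizer = AutoTokenizer.from_pretrained("dmis-lab/biobert-v1.1")
model = AutoModel.from_pretrained("dmis-lab/biobert-v1.1")

# Encode a biomedical sentence and inspect the contextual token embeddings.
inputs = tokenizer("Aspirin inhibits platelet aggregation.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, sequence length, hidden size)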