Concept information
Preferred term
BlueBERT
Definition
- A BERT language model pre-trained on PubMed abstracts and clinical notes (MIMIC-III). (Loterre)
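Because BlueBERT checkpoints are distributed as standard BERT weights, they load with the usual Hugging Face transformers API. Below is a minimal sketch; the hub model ID is an assumption (a community mirror of the uncased base PubMed + MIMIC-III checkpoint) and may differ from the copy you use.

```python
# Minimal sketch: load BlueBERT and embed a clinical sentence.
# NOTE: the model ID is an assumed Hugging Face hub mirror of the
# uncased base PubMed+MIMIC-III checkpoint, not an official reference.
from transformers import AutoTokenizer, AutoModel

model_id = "bionlp/bluebert_pubmed_mimic_uncased_L-12_H-768_A-12"  # assumed hub ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

# Encode a clinical sentence and take the [CLS] token embedding.
inputs = tokenizer("Patient denies chest pain or dyspnea.", return_tensors="pt")
outputs = model(**inputs)
cls_embedding = outputs.last_hidden_state[:, 0, :]  # shape: (1, 768)
print(cls_embedding.shape)
```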
Example
- For comparison, BlueBERT achieved 84-85 for all 3 entities, while ClinicalBERT achieved 87. (Abadeer, 2020)
- This is also required for a direct comparison since ClinicalBERT used a cased corpus while BlueBERT used an uncased one. (Abadeer, 2020)
- We evaluate BlueBERT on the N2C2 STS task and train it with 5-fold cross-validation. (Xiong, Yang, Liu, Wong, Chen, Xiang & Tang, 2023)
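The last example refers to a standard 5-fold cross-validation protocol. The sketch below illustrates that protocol only; the dataset and the train_and_score helper are hypothetical placeholders, not code from the cited paper.

```python
# Sketch of 5-fold cross-validation over a sentence-pair similarity
# dataset, as in the N2C2 STS example above. The data and the
# train_and_score helper are hypothetical stand-ins.
import numpy as np
from sklearn.model_selection import KFold

rng = np.random.default_rng(0)
pairs = np.arange(100)           # stand-in for 100 sentence pairs
gold = rng.uniform(0, 5, 100)    # stand-in for gold similarity scores (0-5)

def train_and_score(train_idx, test_idx):
    # Placeholder: fine-tune BlueBERT on pairs[train_idx], then return
    # the Pearson correlation of its predictions on pairs[test_idx].
    return rng.uniform(0.8, 0.9)

kf = KFold(n_splits=5, shuffle=True, random_state=0)
fold_scores = [train_and_score(tr, te) for tr, te in kf.split(pairs)]
print(f"mean Pearson r across folds: {np.mean(fold_scores):.3f}")
```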
In other languages
- BlueBERT (French)
URI
http://data.loterre.fr/ark:/67375/8LP-LHVS1Q6T-1