Concept information
Preferred term
SapBERT
Definition
- "pretrained model that self-aligns the representation space of biomedical entities." (Liu et al., 2021, p. 4228). (A brief usage sketch follows this record.)
Broader concept
Bibliographic citation(s)
- Liu, F., Shareghi, E., Meng, Z., Basaldella, M., & Collier, N. (2021). Self-alignment pretraining for biomedical entity representations. In K. Toutanova, A. Rumshisky, L. Zettlemoyer, D. Hakkani-Tur, I. Beltagy, S. Bethard, R. Cotterell, T. Chakraborty, & Y. Zhou (Eds.), Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (pp. 4228–4238). Association for Computational Linguistics. doi:10.18653/v1/2021.naacl-main.334
based on
has design country
- United States
- United Kingdom
has for input language
has repository
is an application of
implements
is encoded in
has for license
In other languages
English
- Self-aligning pretrained BERT
URI
http://data.loterre.fr/ark:/67375/LTK-W63LC46L-T
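
Usage sketch (not part of the Loterre record): the snippet below illustrates how a publicly released SapBERT checkpoint could be used to embed biomedical entity names with the Hugging Face transformers library. The checkpoint id cambridgeltl/SapBERT-from-PubMedBERT-fulltext and the use of the [CLS] vector as the entity embedding are assumptions based on the authors' released models, not details stated in this record.

```python
# Minimal sketch: encode biomedical entity names with a SapBERT checkpoint.
# Assumed checkpoint id; substitute the model you actually use.
import torch
from transformers import AutoTokenizer, AutoModel

model_name = "cambridgeltl/SapBERT-from-PubMedBERT-fulltext"  # assumption
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

names = ["covid-19", "coronavirus infection", "high fever"]
inputs = tokenizer(names, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Take the [CLS] token representation as the entity embedding.
embeddings = outputs.last_hidden_state[:, 0, :]
print(embeddings.shape)  # e.g. torch.Size([3, 768])
```

Nearest-neighbour search over such embeddings (e.g. cosine similarity against embedded UMLS concept names) is the typical way these representations are used for biomedical entity linking.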