Concept information
Preferred term
context-dependent word representation
Example
- Context-dependent word representations assign a semantic vector to each word-use within the context of its sentence rather than to each unique word. (Pömsl & Lyapin, 2020)
- For an OOV word, the ELMo layer of ELMoLex computes the context-dependent word representation based on the other vocabulary words present in the focal sentence. (Jawahar, Muller, Fethi, Martin, Villemonte de la Clergerie, Sagot & Seddah, 2018)
- One way to get context-dependent word representations is using a pretrained neural language model as a feature extractor (Peters et al. 2018). (Pömsl & Lyapin, 2020)
- Specifically, AMTN first uses BioBERT as an embedding layer to generate context-dependent word representations. (Zhou, Li, Yao, Lang & Ning, 2019)
- This problem has been further attenuated by methods based on language model pre-training that produced context-dependent word representations. (Boros, Hamdi, Linhares Pontes, Cabrera-Diego, Moreno, Sidere & Doucet, 2020)
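The examples above contrast context-dependent representations with static lookups, where every occurrence of a word type shares one vector. The toy sketch below illustrates that distinction only; it is not the method of any cited work (ELMo and BioBERT use deep bidirectional encoders), and the mixing of each token's static vector with the sentence mean is a stand-in assumption chosen for brevity.

```python
import zlib

import numpy as np


def static_embedding(word, dim=8):
    """One fixed vector per word type (context-independent lookup)."""
    # Deterministic seed so the same word always maps to the same vector.
    rng = np.random.default_rng(zlib.crc32(word.encode("utf-8")))
    return rng.standard_normal(dim)


def contextual_embeddings(sentence, dim=8):
    """Toy contextualizer: each token's vector is its static vector
    blended with the mean of all vectors in its sentence, so the same
    word gets different vectors in different sentences."""
    tokens = sentence.lower().split()
    static = np.stack([static_embedding(t, dim) for t in tokens])
    sentence_context = static.mean(axis=0)
    return tokens, 0.5 * static + 0.5 * sentence_context


s1 = "he sat by the river bank"
s2 = "she deposited cash at the bank"
t1, v1 = contextual_embeddings(s1)
t2, v2 = contextual_embeddings(s2)
bank_in_s1 = v1[t1.index("bank")]
bank_in_s2 = v2[t2.index("bank")]

# A static lookup assigns "bank" one vector everywhere;
# the contextual vectors differ between the two sentences.
print(np.allclose(static_embedding("bank"), static_embedding("bank")))  # True
print(np.allclose(bank_in_s1, bank_in_s2))  # False
```

The two print statements make the concept concrete: the static lookup is identical across sentences, while the context-dependent vectors for "bank" diverge because each is computed within its own sentence.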
URI
http://data.loterre.fr/ark:/67375/8LP-C19J8C5F-M