Concept information
Preferred term
attention weight
Definition
- The degree of relevance or importance that a model assigns to each word or token in a sequence when processing a task. Attention weights help models focus on pertinent information while performing tasks such as machine translation, text summarization, and question answering, allowing them to process inputs efficiently and generate meaningful outputs.
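In the transformer architecture (Vaswani et al., 2017), attention weights are typically computed as a softmax over scaled query-key dot products. The sketch below (not part of the original entry) illustrates this for a single query token; the function name `attention_weights` and the toy dimensions are illustrative assumptions.

```python
import numpy as np

def attention_weights(query, keys):
    """Scaled dot-product attention weights for one query token.

    query: (d,) vector for the current token
    keys:  (n, d) matrix, one row per token in the sequence
    Returns an (n,) vector of non-negative weights summing to 1.
    """
    d = keys.shape[-1]
    scores = keys @ query / np.sqrt(d)  # relevance score per token
    scores -= scores.max()              # shift for numerical stability
    weights = np.exp(scores)
    return weights / weights.sum()      # softmax -> attention weights

# Toy example: 3 tokens with 4-dimensional key vectors
rng = np.random.default_rng(0)
q = rng.normal(size=4)
K = rng.normal(size=(3, 4))
print(attention_weights(q, K))  # e.g. three weights that sum to 1
```

Each weight indicates how much the model attends to the corresponding token when producing its output for the query token; tokens with higher query-key similarity receive larger weights.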
Broader concept
Synonym(s)
- attention matrix
Example
- Recently NLP practitioners have focused on using attention weights as explanatory tools. (Vafa, Deng, Blei & Rush, 2021)
- Subsequently in the second stage the attention weights are updated to minimize the model's validation loss. (Somayajula, Liang, Zhang, Singh & Xie, 2024)
In other languages
- French: poids d'attention
URI
http://data.loterre.fr/ark:/67375/8LP-HK0MKFRQ-T