Concept information
Preferred term
cross attention
Definition
- An attention mechanism used in the decoder of a transformer that allows the model to attend to the encoder's representation of the input sequence while generating the output sequence. (Based on Bharti, Unraveling Transformers: A Deep Dive into Self-Attention and Cross-Attention Mechanisms, on medium.com, 2024)
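The sketch below illustrates the idea in the definition: queries are projected from decoder states, while keys and values are projected from the encoder's representation of the input sequence, so each output position can draw on the whole input. It is a minimal single-head, NumPy-only illustration; the function and variable names, shapes, and projection matrices are assumptions for the example, not part of the source entry.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(decoder_states, encoder_states, W_q, W_k, W_v):
    """Single-head scaled dot-product cross-attention (illustrative sketch).

    Queries come from the decoder; keys and values come from the encoder,
    so every decoder position can attend over all input positions.
    """
    Q = decoder_states @ W_q                    # (T_dec, d_k)
    K = encoder_states @ W_k                    # (T_enc, d_k)
    V = encoder_states @ W_v                    # (T_enc, d_v)
    scores = Q @ K.T / np.sqrt(K.shape[-1])     # (T_dec, T_enc)
    weights = softmax(scores, axis=-1)          # attention over input positions
    return weights @ V                          # (T_dec, d_v)

# Toy usage: 3 encoder (input) positions, 2 decoder (output) positions.
rng = np.random.default_rng(0)
d_model, d_k, d_v = 8, 4, 4
enc = rng.normal(size=(3, d_model))
dec = rng.normal(size=(2, d_model))
W_q, W_k, W_v = (rng.normal(size=(d_model, d)) for d in (d_k, d_k, d_v))
out = cross_attention(dec, enc, W_q, W_k, W_v)
print(out.shape)  # (2, 4)
```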
Broader concept
Synonym(s)
- cross-attention mechanism
In other languages
- French
URI
http://data.loterre.fr/ark:/67375/8LP-F3BF19DC-2