Concept information
Preferred term
multi-head attention
Definition
- An attention mechanism used in NLP tasks such as machine translation and summarization. It splits the attention computation into multiple heads and combines their outputs into a single output, allowing the mechanism to focus on different aspects of and relationships within the input sequence. (Multi Head Attention, on medium.com, 2023)
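The definition above can be illustrated with a minimal NumPy sketch of multi-head self-attention. This is an assumed, simplified formulation (single input sequence, no masking, no biases), not the exact implementation from any cited paper; all function and variable names here are illustrative.

```python
import numpy as np

def multi_head_attention(x, w_q, w_k, w_v, w_o, num_heads):
    """Minimal multi-head self-attention: project x into per-head
    queries/keys/values, attend within each head, then combine."""
    seq_len, d_model = x.shape
    d_head = d_model // num_heads

    # Linear projections, then split the model dimension into heads:
    # (seq, d_model) -> (heads, seq, d_head)
    def split(m):
        return m.reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)

    q, k, v = split(x @ w_q), split(x @ w_k), split(x @ w_v)

    # Scaled dot-product attention within each head.
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    heads = weights @ v  # (heads, seq, d_head)

    # Concatenate the heads and apply the output projection.
    concat = heads.transpose(1, 0, 2).reshape(seq_len, d_model)
    return concat @ w_o

rng = np.random.default_rng(0)
d_model, seq_len, num_heads = 8, 4, 2
x = rng.standard_normal((seq_len, d_model))
w_q, w_k, w_v, w_o = (rng.standard_normal((d_model, d_model)) * 0.1
                      for _ in range(4))
out = multi_head_attention(x, w_q, w_k, w_v, w_o, num_heads)
print(out.shape)  # (4, 8): one d_model-sized output per input position
```

Each head attends over the full sequence but in its own low-dimensional subspace, which is what lets the mechanism capture several different relationships at once.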
Broader concept
Examples
- Further analyses show that our multi-head attention is able to attend to information from various aspects and boost classification or generation in diverse scenarios. (Wang, Li, Lyu & King, 2020)
- Moreover, we propose a novel Multi-Modality Multi-Head Attention to capture the dense interactions between texts and images, where image wordings, explicit in optical characters and implicit in image attributes, are further exploited to bridge their semantic gap. (Wang, Li, Lyu & King, 2020)
- Then we apply multi-head attention between the outputs of those attention layers, expecting that the source context helps the decoder to recognize the common words which should remain in the post-edited sentence. (Shin & Lee, 2018)
Translations
- French
URI
http://data.loterre.fr/ark:/67375/8LP-G68PZTML-J