
Vocabulary of natural language processing


Concept information

Preferred term

multi-head attention  

Definition

  • An attention mechanism, used in NLP tasks such as machine translation and summarization, that projects the input representation into multiple attention heads computed in parallel and combines their outputs into a single output, allowing the model to attend to different aspects of and relationships within the input sequence. (Multi Head Attention, on medium.com, 2023)
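The mechanism in the definition above can be sketched in plain NumPy. This is a minimal illustration, not a reference implementation: the function name, the random placeholder weights (standing in for learned projection matrices), and the chosen dimensions are all assumptions for the example.

```python
import numpy as np

def multi_head_attention(x, num_heads, rng=None):
    """Minimal multi-head self-attention sketch (illustrative only).

    x: array of shape (seq_len, d_model); d_model must be divisible
    by num_heads. Random weights stand in for learned parameters.
    """
    seq_len, d_model = x.shape
    assert d_model % num_heads == 0, "d_model must divide evenly across heads"
    d_head = d_model // num_heads
    if rng is None:
        rng = np.random.default_rng(0)

    # Learned projections in a real model; random placeholders here.
    w_q, w_k, w_v, w_o = (rng.standard_normal((d_model, d_model)) * 0.1
                          for _ in range(4))

    # Split the projected representation into heads:
    # (seq_len, d_model) -> (num_heads, seq_len, d_head)
    def split(t):
        return t.reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)

    q, k, v = split(x @ w_q), split(x @ w_k), split(x @ w_v)

    # Scaled dot-product attention per head: softmax(QK^T / sqrt(d_head)) V
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    heads = weights @ v                             # (num_heads, seq_len, d_head)

    # Concatenate the heads and apply the output projection,
    # producing a single combined output.
    concat = heads.transpose(1, 0, 2).reshape(seq_len, d_model)
    return concat @ w_o

out = multi_head_attention(np.ones((5, 8)), num_heads=2)
print(out.shape)  # (5, 8): same shape as the input sequence
```

Each head attends over the full sequence in its own subspace; only the combined, projected result is returned, which is what lets different heads capture different relationships.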

Broader concept

Examples

  • Further analyses show that our multi-head attention is able to attend information from various aspects and boost classification or generation in diverse scenarios. (Wang, Li, Lyu & King, 2020)
  • Moreover we propose a novel Multi-Modality Multi-Head Attention to capture the dense interactions between texts and images where image wordings explicit in optical characters and implicit in image attributes are further exploited to bridge their semantic gap. (Wang, Li, Lyu & King, 2020)
  • Then we apply multi-head attention between the output of those attention layers expecting that the source context helps the decoder to recognize those common words which should be remained in post-edited sentence. (Shin & Lee, 2018)

Translations

URI

http://data.loterre.fr/ark:/67375/8LP-G68PZTML-J

Download this concept:

RDF/XML TURTLE JSON-LD

Last modified on 21/05/2024