Vocabulary of natural language processing

Concept information

Preferred term

multi-head attention  

Definition

  • An attention mechanism used in NLP tasks such as machine translation and summarization. It splits the input representation across multiple heads and combines their outputs into a single output, allowing the mechanism to attend to different aspects of and relationships within the input sequence. (Multi Head Attention, on medium.com, 2023)
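The split-attend-combine pattern in the definition above can be sketched as follows. This is a minimal NumPy illustration, not a reference implementation: the weight matrices `Wq`, `Wk`, `Wv`, `Wo`, the scaled dot-product scoring, and the head count are standard assumptions from the Transformer literature, not part of the cited definition.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax over the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(x, Wq, Wk, Wv, Wo, num_heads):
    # x: (seq_len, d_model); Wq, Wk, Wv, Wo: (d_model, d_model)
    seq_len, d_model = x.shape
    d_head = d_model // num_heads

    # project input to queries, keys, and values
    q, k, v = x @ Wq, x @ Wk, x @ Wv

    # split each projection into heads: (num_heads, seq_len, d_head)
    def split(t):
        return t.reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    q, k, v = split(q), split(k), split(v)

    # each head attends independently via scaled dot-product attention
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)   # (h, s, s)
    weights = softmax(scores, axis=-1)
    heads = weights @ v                                    # (h, s, d_head)

    # concatenate the heads and apply the output projection
    concat = heads.transpose(1, 0, 2).reshape(seq_len, d_model)
    return concat @ Wo
```

Each head operates on a lower-dimensional slice (`d_head = d_model / num_heads`), which is what lets different heads specialize in different relationships while keeping the total cost comparable to single-head attention.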

Broader concept

Examples

  • Further analyses show that our multi-head attention is able to attend information from various aspects and boost classification or generation in diverse scenarios. (Wang, Li, Lyu & King, 2020)
  • Moreover we propose a novel Multi-Modality Multi-Head Attention to capture the dense interactions between texts and images where image wordings explicit in optical characters and implicit in image attributes are further exploited to bridge their semantic gap. (Wang, Li, Lyu & King, 2020)
  • Then we apply multi-head attention between the output of those attention layers expecting that the source context helps the decoder to recognize those common words which should be remained in post-edited sentence. (Shin & Lee, 2018)

In other languages

URI

http://data.loterre.fr/ark:/67375/8LP-G68PZTML-J

Download this concept:

RDF/XML TURTLE JSON-LD — last modified 21/5/24