Vocabulary of natural language processing (POC)

Concept information

Preferred term

multi-head attention  

Definition

  • An attention mechanism used in NLP tasks such as machine translation and summarization that projects the input into multiple parallel attention heads, each attending to the sequence independently, and combines their outputs into a single output, allowing the model to focus on different aspects of and relationships within the input sequence. (Multi Head Attention, on medium.com, 2023)
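
To make the definition concrete, below is a minimal NumPy sketch of multi-head self-attention. It is not drawn from the cited source; the head count, dimensions, and random weight matrices are illustrative assumptions. Queries, keys, and values are projected and split per head, scaled dot-product attention is computed independently in each head, and the heads are concatenated and projected to a single output.

# Minimal multi-head self-attention sketch (NumPy); all names and
# shapes here are illustrative assumptions, not from the cited source.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(x, w_q, w_k, w_v, w_o, num_heads):
    """Self-attention over an input x of shape (seq_len, d_model)."""
    seq_len, d_model = x.shape
    d_head = d_model // num_heads

    # Project the input into queries, keys, and values, then split the
    # feature dimension into num_heads independent heads.
    q = (x @ w_q).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    k = (x @ w_k).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    v = (x @ w_v).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)

    # Scaled dot-product attention, computed independently per head.
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)
    weights = softmax(scores, axis=-1)   # (num_heads, seq_len, seq_len)
    heads = weights @ v                  # (num_heads, seq_len, d_head)

    # Concatenate the heads and apply the output projection to produce
    # a single combined output.
    concat = heads.transpose(1, 0, 2).reshape(seq_len, d_model)
    return concat @ w_o

# Usage with random weights (illustrative only).
rng = np.random.default_rng(0)
seq_len, d_model, num_heads = 5, 8, 2
x = rng.normal(size=(seq_len, d_model))
w_q, w_k, w_v, w_o = (rng.normal(size=(d_model, d_model)) for _ in range(4))
out = multi_head_attention(x, w_q, w_k, w_v, w_o, num_heads)
print(out.shape)  # (5, 8): same shape as the input

Because each head attends in a lower-dimensional subspace (d_head = d_model / num_heads), the overall cost is comparable to single-head attention while letting different heads capture different relationships.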

URI

http://data.loterre.fr/ark:/67375/8LP-G68PZTML-J

Last modified: 5/21/24