Vocabulary of natural language processing (POC)

Concept information

Preferred term

cross attention

Definition

  • An attention mechanism used in the decoder of encoder-decoder transformers that allows each decoder position to attend to the encoder's representations of the input sequence while generating the output sequence. (Based on Bharti, Unraveling Transformers: A Deep Dive into Self-Attention and Cross-Attention Mechanisms, on medium.com, 2024)
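As an illustration only (not part of the cited source), below is a minimal single-head cross-attention sketch in Python/NumPy: queries are projected from decoder states, while keys and values are projected from encoder states, so each decoder position attends over the full input sequence. All names, shapes, and weight matrices are illustrative assumptions; real transformer decoders use multi-head attention with learned projections.

    import numpy as np

    def cross_attention(decoder_states, encoder_states, W_q, W_k, W_v):
        """Single-head cross-attention: queries come from the decoder,
        keys and values from the encoder's representation of the input."""
        Q = decoder_states @ W_q                    # (T_dec, d_k)
        K = encoder_states @ W_k                    # (T_enc, d_k)
        V = encoder_states @ W_v                    # (T_enc, d_v)
        scores = Q @ K.T / np.sqrt(K.shape[-1])     # (T_dec, T_enc)
        # Softmax over encoder positions: each decoder step distributes
        # its attention across the whole input sequence.
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)
        return weights @ V                          # (T_dec, d_v)

    # Hypothetical shapes: 4 encoder positions, 3 decoder positions, d_model = 8
    rng = np.random.default_rng(0)
    enc = rng.normal(size=(4, 8))
    dec = rng.normal(size=(3, 8))
    W_q, W_k, W_v = (rng.normal(size=(8, 8)) for _ in range(3))
    out = cross_attention(dec, enc, W_q, W_k, W_v)
    print(out.shape)  # (3, 8): one input-conditioned context vector per decoder position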

Broader concept

Synonym(s)

  • cross-attention mechanism

In other languages

URI

http://data.loterre.fr/ark:/67375/8LP-F3BF19DC-2

Last modified 5/13/24