Vocabulary of natural language processing (POC)

Concept information

Preferred term

self-attention layer  

Definition

  • A layer in the architecture of transformer-based models that lets the model attend to different parts of the input sequence when processing each token, thereby capturing contextual relationships and dependencies within the sequence.
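
The behavior described in the definition can be sketched numerically. The following is a minimal single-head scaled dot-product self-attention example; the input size, dimensions, and random projection matrices `W_q`, `W_k`, `W_v` are illustrative stand-ins for learned weights, not part of the vocabulary entry.

```python
import numpy as np

rng = np.random.default_rng(0)
seq_len, d_model = 4, 8

# Toy token embeddings and stand-in projection matrices
X = rng.standard_normal((seq_len, d_model))
W_q = rng.standard_normal((d_model, d_model))
W_k = rng.standard_normal((d_model, d_model))
W_v = rng.standard_normal((d_model, d_model))

# Project the input into queries, keys, and values
Q, K, V = X @ W_q, X @ W_k, X @ W_v

# Score each token's query against every token's key
scores = Q @ K.T / np.sqrt(d_model)

# Softmax turns scores into attention weights (each row sums to 1)
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)

# Each token's output is a weighted mix of all value vectors,
# which is how the layer captures dependencies across the sequence
output = weights @ V
print(output.shape)  # (4, 8)
```

Because every token attends to every other token, the output representation of each position depends on the whole input sequence, which is the contextual behavior the definition describes.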

URI

http://data.loterre.fr/ark:/67375/8LP-DMDVS16W-4

Last modified 5/13/24