
Vocabulary of natural language processing (POC)


Concept information

Preferred term

self-attention layer  

Definition

  • A layer in the architecture of transformer-based models that lets the model attend to different parts of the input sequence when processing each token, capturing contextual relationships and dependencies within the sequence.
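The mechanism described in the definition can be sketched as scaled dot-product self-attention: each token is projected to a query, key, and value, and the softmax of query-key scores weights how much every other token contributes to that token's output. This is a minimal illustrative sketch (the weight matrices and dimensions are arbitrary assumptions, not part of the vocabulary entry):

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Single-head scaled dot-product self-attention over a token sequence x."""
    # Project the input tokens to queries, keys, and values
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    d_k = q.shape[-1]
    # Score every token against every other token, scaled by sqrt(d_k)
    scores = q @ k.T / np.sqrt(d_k)
    # Softmax over the key axis: each row is one token's attention weights
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output token is a weighted mix of all value vectors
    return weights @ v

# Hypothetical toy dimensions for demonstration
rng = np.random.default_rng(0)
seq_len, d_model = 4, 8
x = rng.normal(size=(seq_len, d_model))
w_q = rng.normal(size=(d_model, d_model))
w_k = rng.normal(size=(d_model, d_model))
w_v = rng.normal(size=(d_model, d_model))
out = self_attention(x, w_q, w_k, w_v)
print(out.shape)  # one contextualized vector per input token: (4, 8)
```

Because every token's output mixes information from the whole sequence, the layer captures the dependencies the definition refers to in a single matrix operation.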

Broader concept

In other languages

URI

http://data.loterre.fr/ark:/67375/8LP-DMDVS16W-4


Last modified: 13/5/24