Vocabulary of natural language processing

Concept information

Preferred term

subspace  

Broader concept

Example

  • For self-attention and encoder-decoder attention, a multi-head attention block is used to obtain information from different representation subspaces at different positions. (Zhu, Wang, Wang, Zhou, Zhang, Wang & Zong, 2019)
  • Further, due to the poor representation of non-binary pronouns, the subspace is likely representing the difference in frequency of terms rather than the concept of gender as a whole. (Dev, Monajatipoor, Ovalle, Subramonian, Phillips & Chang, 2021)
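The first example above refers to multi-head attention projecting a token representation into several lower-dimensional subspaces, one per head. A minimal sketch of that split (all dimensions, variable names, and the projection matrix here are illustrative assumptions, not taken from the cited paper):

```python
import numpy as np

# Illustrative dimensions (assumptions, not from the cited paper):
d_model, n_heads = 512, 8
d_head = d_model // n_heads  # each head operates in a 64-dim subspace

rng = np.random.default_rng(0)
x = rng.standard_normal(d_model)                     # one token's representation
W = rng.standard_normal((d_model, d_model)) * 0.02   # a combined projection matrix

# Project the representation, then split it into one subspace per head.
projected = W @ x
subspaces = projected.reshape(n_heads, d_head)
print(subspaces.shape)  # (8, 64): 8 heads, each a 64-dimensional subspace
```

Each row of `subspaces` is the slice of the projected representation that a single attention head attends over, which is what "different representation subspaces" refers to.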

URI

http://data.loterre.fr/ark:/67375/8LP-SRM3B6WR-X

Last modified 6/28/24