Vocabulary of natural language processing

Concept information

NLP methods and tools > model > representation model > context representation > context-dependent word representation

Preferred term

context-dependent word representation  

Broader concept

context representation
Example

  • Context-dependent word representations assign a semantic vector to each word-use within the context of its sentence rather than to each unique word. (Pömsl & Lyapin, 2020)
  • For an OOV word the ELMo layer of ELMoLex computes the context-dependent word representation based on the other vocabulary words present in the focal sentence. (Jawahar, Muller, Fethi, Martin, Villemonte de la Clergerie, Sagot & Seddah, 2018)
  • One way to get context-dependent word representations is using a pretrained neural language model as a feature extractor (Peters et al. 2018). (Pömsl & Lyapin, 2020)
  • Specifically AMTN first utilizes BioBERT as an embedding layer to generate context-dependent word representations. (Zhou, Li, Yao, Lang & Ning, 2019)
  • This problem has been further attenuated by methods based on language model pre-training that produced context-dependent word representations. (Boros, Hamdi, Linhares Pontes, Cabrera-Diego, Moreno, Sidere & Doucet, 2020)
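The examples above share one idea: a word receives a different vector for each use, depending on its surrounding sentence, rather than one fixed vector per vocabulary entry (as in static embeddings). The toy sketch below illustrates only this distinction; it is not ELMo, BioBERT, or any real pretrained model. The static vectors and the neighbour-mixing rule are invented for illustration.

```python
import hashlib

def static_vector(word, dim=4):
    """Deterministic toy 'static' embedding: one fixed vector per word."""
    h = hashlib.sha256(word.lower().encode()).digest()
    return [b / 255.0 for b in h[:dim]]

def contextual_vectors(sentence, alpha=0.5):
    """Toy context-dependent representation: each word-use vector mixes
    the word's static vector with the mean of its neighbours' vectors,
    so the same word gets different vectors in different sentences."""
    words = sentence.lower().split()
    statics = [static_vector(w) for w in words]
    out = []
    for i, vec in enumerate(statics):
        neighbours = [v for j, v in enumerate(statics) if j != i]
        if not neighbours:              # single-word sentence: nothing to mix
            out.append(vec)
            continue
        mean = [sum(c) / len(neighbours) for c in zip(*neighbours)]
        out.append([(1 - alpha) * a + alpha * b for a, b in zip(vec, mean)])
    return out

# "bank" has one static vector, but two different word-use vectors:
bank_in_river = contextual_vectors("river bank erosion")[1]
bank_in_finance = contextual_vectors("bank account fees")[0]
```

Real systems obtain the word-use vectors from a pretrained neural language model (e.g. an ELMo or BioBERT layer, as in the quotations above) instead of this hand-written mixing rule, but the interface is the same: sentence in, one vector per word occurrence out.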

URI

http://data.loterre.fr/ark:/67375/8LP-C19J8C5F-M

Last modified 6/5/24