Vocabulary of natural language processing

Concept information

Preferred term

semantic embedding  

Example

  • Our method repeats this process from the bottom up to get semantic embeddings at all levels. (Zhao, He, Xiao & Xu, 2023)
  • Sem-BERT is intended to handle multiple sequence inputs: the words in the input sequence are passed to semantic role labeling to obtain multiple predicate-derived structures, which form a semantic embedding. (Galitsky, Ilvovsky & Goncharova, 2021)
  • Two utterances could be labelled as having the same content if their semantic embeddings are close to each other (e.g. when cosine similarity is above a certain threshold). (Wegmann, Schraagen & Nguyen, 2022)
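The last example describes comparing utterances by the cosine similarity of their semantic embeddings against a threshold. A minimal sketch of that comparison follows; the vectors and the 0.9 cutoff are illustrative assumptions, not values from the cited work.

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine similarity: dot product normalized by the vectors' magnitudes.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical semantic embeddings of two utterances (real embeddings
# would come from a model such as Sem-BERT and have many more dimensions).
u1 = np.array([0.2, 0.7, 0.1])
u2 = np.array([0.25, 0.65, 0.15])

THRESHOLD = 0.9  # illustrative cutoff; real systems tune this value
same_content = cosine_similarity(u1, u2) >= THRESHOLD
```

Two utterances would be labelled as having the same content when `same_content` is true.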

URI

http://data.loterre.fr/ark:/67375/8LP-NZRD3L6P-L

Last modified 5/29/24