Concept information
Preferred term
semantic embedding
Broader concept
Examples
- Our method repeats this process from the bottom up to get semantic embeddings at all levels. (Zhao, He, Xiao & Xu, 2023)
- Sem-BERT is intended to handle multiple sequence inputs: the words in the input sequence are passed to semantic role labeling to obtain multiple predicate-derived structures, which form a semantic embedding. (Galitsky, Ilvovsky & Goncharova, 2021)
- Two utterances could be labelled as having the same content if their semantic embeddings are close to each other (e.g. when cosine similarity is above a certain threshold). (Wegmann, Schraagen & Nguyen, 2022)
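The last example describes a common comparison scheme: embed each utterance, then test whether the cosine similarity of the two vectors exceeds a threshold. A minimal Python sketch of that check, using hypothetical embedding vectors and an assumed threshold of 0.9 (in practice the vectors would come from an embedding model and the threshold is task-dependent):

import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Cosine similarity: dot product of the vectors divided by
    # the product of their Euclidean norms.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical semantic embeddings for two utterances.
emb_a = np.array([0.12, 0.85, -0.33, 0.41])
emb_b = np.array([0.10, 0.80, -0.30, 0.45])

THRESHOLD = 0.9  # assumed value; chosen per task
sim = cosine_similarity(emb_a, emb_b)
print(f"similarity={sim:.3f}, same content: {sim >= THRESHOLD}")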
In other languages
URI
http://data.loterre.fr/ark:/67375/8LP-NZRD3L6P-L