Concept information
Preferred term
NLU task
Broader concept
Synonym(s)
- natural language understanding task
Examples
- Finally, the experimental results show that even though we utilize much less external resources, our model achieves better adaptation performance for natural language understanding task (i.e., the intent detection and slot filling) compared to the current state-of-the-art model in the zero-shot scenario. (Liu, Shin, Xu, Winata, Xu, Madotto & Fung, 2019)
- In this paper we experimented with consistency training in a major NLU task: Domain Classification (DC). (Leung & Tan, 2022)
- Most recent benchmarks propose a representative set of standard NLU tasks for evaluation. (Elmadany, Nagoudi & Abdul-Mageed, 2023)
- We also see improvements of downstream NLU tasks by applying paraphrases to data augmentation. (Yu, Arkoudas & Hamza, 2020)
- We present the first study of catastrophic forgetting in a massively multilingual setting involving up to 51 languages on named entity recognition and natural language understanding tasks. (Winata, Xie, Radhakrishnan, Wu, Jin, Cheng, Kulkarni & Preotiuc-Pietro, 2023)
Translations
- (French)
URI
http://data.loterre.fr/ark:/67375/8LP-C71CCN8S-H