Concept information
Preferred term
knowledge distillation
Definition
- The process of transferring knowledge from a large model to a smaller one. (Wikipedia)
Broader concept
Synonym(s)
- model distillation
Defining context(s)
- Knowledge distillation is a fine-tuning strategy aiming to transfer knowledge from larger and more complex models into smaller and more practical models. (Maslaris & Arampatzis, 2024)
Example(s)
- Knowledge distillation is trained so that a student model outputs the same output as a teacher model's for one input. (Ahn, Lee, Kim & Oh, 2022)
- One could similarly to DistilScore perform knowledge distillation but since our initial experiments showed that this doesn't work well with pseudo-parallel data we chose another approach. (Belouadi & Eger, 2023)
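The mechanism quoted above, a student model trained to reproduce a teacher model's outputs for the same input, is commonly implemented with a softened-softmax loss in the style of the classic distillation recipe. The following is a minimal PyTorch sketch under that assumption; the network shapes, temperature, and mixing weight `alpha` are illustrative choices, not part of this record.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical teacher (large) and student (small) classifiers.
teacher = nn.Sequential(nn.Linear(784, 1024), nn.ReLU(), nn.Linear(1024, 10))
student = nn.Sequential(nn.Linear(784, 64), nn.ReLU(), nn.Linear(64, 10))

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Blend a soft-target KL term (teacher -> student) with the usual
    hard-label cross-entropy."""
    soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
    log_student = F.log_softmax(student_logits / temperature, dim=-1)
    # KL divergence between softened distributions; the T^2 factor keeps
    # gradient magnitudes comparable across temperatures.
    kd = F.kl_div(log_student, soft_targets,
                  reduction="batchmean") * temperature ** 2
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1 - alpha) * ce

# One training step on random stand-in data.
x = torch.randn(32, 784)
y = torch.randint(0, 10, (32,))
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)

with torch.no_grad():            # the teacher is frozen during distillation
    teacher_logits = teacher(x)
optimizer.zero_grad()
loss = distillation_loss(student(x), teacher_logits, y)
loss.backward()                  # gradients flow only into the student
optimizer.step()
```

In this sketch only the student's parameters are updated; the temperature softens both distributions so the student can learn from the relative probabilities the teacher assigns to incorrect classes, not just its top prediction.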
Translations
- French
URI
http://data.loterre.fr/ark:/67375/8LP-BBBC55RH-N