Vocabulary of natural language processing

Concept information

Preferred term

knowledge distillation  

Definition

  • The process of transferring knowledge from a large model to a smaller one. (Wikipedia)

Broader concept

Synonym(s)

  • model distillation

Defining context(s)

  • Knowledge distillation is a fine-tuning strategy aiming to transfer knowledge from larger and more complex models into smaller and more practical models. (Maslaris & Arampatzis, 2024)

Example(s)

  • Knowledge distillation trains a student model to produce the same output as a teacher model for a given input. (Ahn, Lee, Kim & Oh, 2022)
  • One could, similarly to DistilScore, perform knowledge distillation, but since our initial experiments showed that this doesn't work well with pseudo-parallel data, we chose another approach. (Belouadi & Eger, 2023)
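
As a minimal sketch of the mechanism described in the definition and examples above, the following hypothetical Python code implements the classic soft-label distillation loss of Hinton et al. (2015) in PyTorch: the student is trained to match the teacher's temperature-softened output distribution alongside the usual hard-label loss. The models, temperature T, and weight alpha are illustrative assumptions, not part of this vocabulary entry.

    # Minimal, hypothetical sketch of knowledge distillation in PyTorch:
    # a small student model is trained to mimic a frozen teacher's
    # temperature-softened outputs (all names and values are illustrative).
    import torch
    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
        # Soft term: KL divergence between the student's and teacher's
        # temperature-softened distributions, rescaled by T^2 so gradient
        # magnitudes stay comparable across temperatures.
        soft = F.kl_div(
            F.log_softmax(student_logits / T, dim=-1),
            F.softmax(teacher_logits / T, dim=-1),
            reduction="batchmean",
        ) * (T * T)
        # Hard term: ordinary cross-entropy against the true labels.
        hard = F.cross_entropy(student_logits, labels)
        return alpha * soft + (1.0 - alpha) * hard

    # Usage: the teacher is frozen; only the student is updated.
    teacher = torch.nn.Linear(16, 4).eval()   # stand-in "large" model
    student = torch.nn.Linear(16, 4)          # stand-in "small" model
    x = torch.randn(8, 16)
    labels = torch.randint(0, 4, (8,))
    with torch.no_grad():
        teacher_logits = teacher(x)
    loss = distillation_loss(student(x), teacher_logits, labels)
    loss.backward()

In practice the student has far fewer parameters than the teacher; the identical stand-in modules above only keep the sketch short.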

Translations

URI

http://data.loterre.fr/ark:/67375/8LP-BBBC55RH-N

Last modified on 26/04/2024