Vocabulary of natural language processing

Concept information

Preferred term

knowledge distillation  

Definition

  • The process of transferring knowledge from a large model to a smaller one. (Wikipedia)

Synonym(s)

  • model distillation

Definitional context(s)

  • Knowledge distillation is a fine-tuning strategy aiming to transfer knowledge from larger and more complex models into smaller and more practical models. (Maslaris & Arampatzis, 2024)

Example

  • Knowledge distillation trains a student model so that it produces the same output as a teacher model for a given input. (Ahn, Lee, Kim & Oh, 2022)
  • One could, similarly to DistilScore, perform knowledge distillation, but since our initial experiments showed that this does not work well with pseudo-parallel data, we chose another approach. (Belouadi & Eger, 2023)
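Illustrative sketch (not part of the vocabulary entry): a common form of the distillation objective in Python/PyTorch, in which the student is trained against the teacher's temperature-softened output distribution in addition to the ground-truth labels. The names distillation_loss, temperature, and alpha are assumptions made for this example only.

    import torch
    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, labels,
                          temperature=2.0, alpha=0.5):
        """Weighted sum of a soft-target (teacher) loss and a hard-label loss.

        student_logits, teacher_logits: (batch, num_classes) tensors.
        labels: (batch,) ground-truth class indices.
        temperature: softens both distributions; higher values give softer targets.
        alpha: weight of the distillation term relative to the hard-label term.
        """
        # Soft targets: KL divergence between temperature-scaled distributions,
        # rescaled by temperature**2 so gradients keep a comparable magnitude.
        soft_loss = F.kl_div(
            F.log_softmax(student_logits / temperature, dim=-1),
            F.softmax(teacher_logits / temperature, dim=-1),
            reduction="batchmean",
        ) * (temperature ** 2)

        # Hard targets: ordinary cross-entropy against the true labels.
        hard_loss = F.cross_entropy(student_logits, labels)

        return alpha * soft_loss + (1.0 - alpha) * hard_loss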

URI

http://data.loterre.fr/ark:/67375/8LP-BBBC55RH-N
