
Vocabulary of natural language processing


Concept information

Preferred term

knowledge distillation  

Definition

  • The process of transferring knowledge from a large model to a smaller one. (Wikipedia)

Broader concept

Alternative labels

  • model distillation

Defining context(s)

  • Knowledge distillation is a fine-tuning strategy aiming to transfer knowledge from larger and more complex models into smaller and more practical models. (Maslaris & Arampatzis, 2024)
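A minimal sketch of the usual objective (a standard formulation from the literature, not drawn from this entry): the student is trained on a weighted mix of the ordinary task loss and a term matching the teacher's temperature-softened predictions,

\[
\mathcal{L} = (1-\alpha)\,\mathcal{L}_{\mathrm{CE}}\bigl(y,\ \mathrm{softmax}(z_s)\bigr) + \alpha\, T^{2}\,\mathrm{KL}\bigl(\mathrm{softmax}(z_t/T)\ \big\|\ \mathrm{softmax}(z_s/T)\bigr)
\]

where \(z_s\) and \(z_t\) are the student's and teacher's logits, \(T\) is a temperature, and \(\alpha\) balances the two terms.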

Example

  • Knowledge distillation is trained so that a student model outputs the same output as a teacher model's for one input. (Ahn, Lee, Kim & Oh, 2022)
  • One could similarly to DistilScore perform knowledge distillation but since our initial experiments showed that this doesn't work well with pseudo-parallel data we chose another approach. (Belouadi & Eger, 2023)
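The first example above describes training a student to reproduce a teacher's outputs. Below is a minimal sketch of what that can look like in practice, assuming PyTorch and a plain classification setting; the models, sizes, and hyperparameters are illustrative placeholders, not taken from the entry or the cited papers.

    import torch
    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, temperature=2.0):
        # KL divergence between the teacher's and student's softened distributions.
        log_p_student = F.log_softmax(student_logits / temperature, dim=-1)
        p_teacher = F.softmax(teacher_logits / temperature, dim=-1)
        # Scaling by T^2 keeps gradient magnitudes comparable across temperatures.
        return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * temperature ** 2

    # The teacher is frozen; only the smaller student receives gradients.
    teacher = torch.nn.Sequential(
        torch.nn.Linear(16, 64), torch.nn.ReLU(), torch.nn.Linear(64, 4)
    ).eval()                                   # stand-in for a larger pretrained model
    student = torch.nn.Linear(16, 4)           # stand-in for the smaller student model
    optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)

    x = torch.randn(8, 16)                     # one batch of inputs
    with torch.no_grad():
        teacher_logits = teacher(x)
    loss = distillation_loss(student(x), teacher_logits)
    loss.backward()
    optimizer.step()

In practice this distillation term is typically combined with the ordinary task loss on labelled data, as in the formulation given above.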

In other languages

URI

http://data.loterre.fr/ark:/67375/8LP-BBBC55RH-N
