
Vocabulary of natural language processing


Concept information

Preferred term

overfitting  

Definition

  • Overfitting occurs when a machine learning model learns to perform exceedingly well on the training data but poorly on new, unseen data. This usually happens when the model memorizes the training data rather than learning the underlying patterns, making it less generalizable to new data. (Ubiquity, Glossary of terms related to Generative artificial intelligence)
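The memorization-versus-generalization gap described in the definition can be illustrated with a minimal sketch (not part of the vocabulary entry): fitting polynomials of low and high degree to a small noisy sample of y = x², then comparing training error against error on held-out data. All data and parameter choices below are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch of overfitting: a small noisy training sample
# drawn from y = x^2, plus a clean held-out test grid.
rng = np.random.default_rng(0)

x_train = np.linspace(-1, 1, 8)
y_train = x_train**2 + rng.normal(scale=0.05, size=x_train.shape)

x_test = np.linspace(-1, 1, 100)
y_test = x_test**2

def fit_errors(degree):
    """Fit a polynomial of the given degree to the training points
    and return (training MSE, test MSE)."""
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train_err, test_err

simple_train, simple_test = fit_errors(2)    # matches the true model
complex_train, complex_test = fit_errors(7)  # enough parameters to memorize

# With 8 points, the degree-7 polynomial interpolates the training data
# (training error near zero) -- it has memorized the noise -- while its
# error on the unseen test grid stays well above its training error.
```

The degree-7 fit drives training error to essentially zero by passing through every noisy point, which is exactly the memorization behaviour the definition warns about; the gap between its training and test error is the hallmark of overfitting.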

Examples

  • However, NMT models suffer from well-known limitations such as overfitting and moderate generalization, particularly when the training data are limited (Koehn and Knowles, 2017). (Jauregi Unanue, Parnell & Piccardi, 2021)
  • In addition, the word embeddings and regression residuals are regularized by Gaussian priors, reducing their chance of overfitting. (Li, Chua, Zhu & Miao, 2016)
  • The reason we have separated the development sets (flipdev1 and flipdev2) is to better avoid potential overfitting. (Uzdilli, Jaggi, Egger, Julmy, Derczynski & Cieliebak, 2015)

URI

http://data.loterre.fr/ark:/67375/8LP-JJ5RBH9D-4

Last modified: 12/2/25