Vocabulary of natural language processing

Concept information

Preferred term

overfitting  

Definition

  • Overfitting occurs when a machine learning model learns to perform exceedingly well on the training data but poorly on new, unseen data. This usually happens when the model memorizes the training data rather than learning the underlying patterns, making it less generalizable to new data. (Ubiquity, Glossary of terms related to Generative artificial intelligence)
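The memorization-versus-generalization contrast in the definition can be illustrated with a minimal, self-contained sketch (not part of the vocabulary entry): fitting a high-capacity polynomial to a small noisy sample drawn from a simple linear relationship. The degree-9 model passes through every training point but generalizes poorly, while a degree-1 model captures the underlying pattern. All data and model choices here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy setup: the true relationship is linear, y = 2x, with Gaussian noise.
def make_data(x):
    return 2 * x + rng.normal(scale=0.2, size=x.shape)

x_train = np.linspace(0.0, 1.0, 10)
y_train = make_data(x_train)
x_test = rng.uniform(0.0, 1.0, size=50)
y_test = make_data(x_test)

def mse(coefs, x, y):
    # Mean squared error of a polynomial (given by its coefficients) on (x, y).
    return float(np.mean((np.polyval(coefs, x) - y) ** 2))

# Degree 9 has enough capacity to memorize all 10 training points exactly.
overfit = np.polyfit(x_train, y_train, deg=9)
# Degree 1 matches the true underlying pattern.
simple = np.polyfit(x_train, y_train, deg=1)

print(mse(overfit, x_train, y_train))  # near zero: training data memorized
print(mse(overfit, x_test, y_test))    # much larger: poor generalization
print(mse(simple, x_test, y_test))     # small: the simpler model generalizes
```

The gap between the overfit model's training and test error is exactly the symptom the definition describes: excellent performance on the training data, poor performance on new, unseen data.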

Example

  • However, NMT models suffer from well-known limitations such as overfitting and moderate generalization, particularly when the training data are limited (Koehn and Knowles 2017). (Jauregi Unanue, Parnell & Piccardi, 2021)
  • In addition, the word embeddings and regression residuals are regularized by Gaussian priors, reducing their chance of overfitting. (Li, Chua, Zhu & Miao, 2016)
  • The reason we have separated the development sets (flipdev1 and flipdev2) is to better avoid potential overfitting. (Uzdilli, Jaggi, Egger, Julmy, Derczynski & Cieliebak, 2015)

URI

http://data.loterre.fr/ark:/67375/8LP-JJ5RBH9D-4

Last modified 2/12/25