Vocabulary of natural language processing

Concept information

Preferred term

overfitting  

Definition

  • Overfitting occurs when a machine learning model learns to perform exceedingly well on the training data but poorly on new, unseen data. This usually happens when the model memorizes the training data rather than learning the underlying patterns, making it less generalizable to new data. (Ubiquity, Glossary of terms related to Generative artificial intelligence)
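The memorization-versus-generalization gap described in the definition can be illustrated with a minimal sketch (not from the cited glossary): fitting polynomials of increasing degree to a handful of noisy training points. The function, noise level, and degrees below are illustrative assumptions; only NumPy is used.

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy training samples of a simple underlying function (y = sin x).
x_train = np.linspace(0.0, 3.0, 10)
y_train = np.sin(x_train) + rng.normal(scale=0.3, size=x_train.size)

# Clean held-out points drawn from the same function.
x_test = np.linspace(0.1, 2.9, 50)
y_test = np.sin(x_test)

def mse(degree):
    """Fit a polynomial of the given degree and return (train MSE, test MSE)."""
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train_err, test_err

for d in (2, 9):
    tr, te = mse(d)
    print(f"degree {d}: train MSE {tr:.4f}, test MSE {te:.4f}")
```

A degree-9 polynomial can pass almost exactly through the 10 noisy training points, driving the training error toward zero while its test error stays higher: it has memorized the noise rather than learned the underlying pattern, which is exactly the behavior the definition names.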

Examples

  • However, NMT models suffer from well-known limitations such as overfitting and moderate generalization, particularly when the training data are limited (Koehn and Knowles 2017). (Jauregi Unanue, Parnell & Piccardi, 2021)
  • In addition, the word embeddings and regression residuals are regularized by Gaussian priors, reducing their chance of overfitting. (Li, Chua, Zhu & Miao, 2016)
  • The reason we have separated the development sets (flipdev1 and flipdev2) is to better avoid potential overfitting. (Uzdilli, Jaggi, Egger, Julmy, Derczynski & Cieliebak, 2015)

URI

http://data.loterre.fr/ark:/67375/8LP-JJ5RBH9D-4

Last modified on 12/02/2025