Concept information
Preferred term
overfitting
Definition
- Overfitting occurs when a machine learning model performs exceedingly well on the training data but poorly on new, unseen data. This usually happens because the model memorizes the training data rather than learning the underlying patterns, making it less generalizable to new data. (Ubiquity, Glossary of terms related to Generative artificial intelligence)
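The definition above can be illustrated with a minimal sketch (not part of the glossary source): fitting polynomials of different degrees to a few noisy samples of a smooth function. The high-degree model interpolates the training points almost exactly (near-zero training error) yet generalizes worse, which is the memorization-versus-pattern-learning distinction the definition describes. The data and degrees here are illustrative choices, not taken from the original entry.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 10 noisy training samples of a smooth underlying function,
# and a clean held-out test grid from the same function.
x_train = np.linspace(0, 1, 10)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0, 0.2, x_train.shape)
x_test = np.linspace(0.02, 0.98, 50)
y_test = np.sin(2 * np.pi * x_test)

def mse(degree):
    # Fit a polynomial of the given degree to the training points and
    # return mean squared error on the training and test sets.
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train_err, test_err

for d in (3, 9):
    tr, te = mse(d)
    print(f"degree {d}: train MSE {tr:.4f}, test MSE {te:.4f}")
```

The degree-9 polynomial has exactly as many parameters as training points, so it drives training error to essentially zero by memorizing the noise; its test error reveals the lost generalization.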
Example
- However, NMT models suffer from well-known limitations such as overfitting and moderate generalization, particularly when the training data are limited (Koehn and Knowles, 2017). (Jauregi Unanue, Parnell & Piccardi, 2021)
- In addition, the word embeddings and regression residuals are regularized by Gaussian priors, reducing their chance of overfitting. (Li, Chua, Zhu & Miao, 2016)
- The reason we have separated the development sets (flipdev1 and flipdev2) is to better avoid potential overfitting. (Uzdilli, Jaggi, Egger, Julmy, Derczynski & Cieliebak, 2015)
In other languages
- French: sur-apprentissage
URI
http://data.loterre.fr/ark:/67375/8LP-JJ5RBH9D-4