Concept information
Preferred term
reward function
Definition
- In reinforcement learning, a scoring mechanism that evaluates a model's performance on a given task based on its outputs.
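As an illustration (not part of the Loterre entry), a reward function can be sketched as a mapping from a model's output to a scalar score. The toy Python sketch below uses hypothetical names and a simple token-overlap rule purely to show the shape of such a function; real reward functions are task-specific.

```python
def reward(output: str, reference: str) -> float:
    """Toy reward function: scores a model's output against a reference answer.

    Returns 1.0 for an exact match, otherwise partial credit proportional to
    the number of shared tokens. Purely illustrative.
    """
    if output.strip() == reference.strip():
        return 1.0
    shared = set(output.split()) & set(reference.split())
    return len(shared) / max(len(reference.split()), 1)


# A higher score indicates better performance on the task.
print(reward("the cat sat", "the cat sat"))  # 1.0 (exact match)
print(reward("a dog ran", "the cat sat"))    # 0.0 (no overlap)
```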
Broader concept
In other languages
- French: récompense
URI
http://data.loterre.fr/ark:/67375/8LP-G0GNG4LX-B