Concept information
Preferred term
stochastic gradient descent
Definition
- A gradient descent algorithm in which the batch size is one. In other words, SGD trains on a single example chosen uniformly at random from a training set. (https://developers.google.com/machine-learning/glossary/)
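The definition above can be sketched in code. The following is a minimal illustration, not a reference implementation: the function names (`sgd`, `grad_fn`) and the toy linear-regression example are assumptions made for demonstration. The key property from the definition is that each parameter update uses exactly one training example, drawn uniformly at random.

```python
import random

def sgd(grad_fn, params, data, lr=0.01, epochs=50, seed=0):
    """Gradient descent with batch size one: each update is computed
    from a single training example chosen uniformly at random."""
    rng = random.Random(seed)
    params = list(params)
    for _ in range(epochs):
        for _ in range(len(data)):
            example = rng.choice(data)        # one example, uniform at random
            grads = grad_fn(params, example)  # gradient on that example only
            params = [p - lr * g for p, g in zip(params, grads)]
    return params

# Toy usage: fit y = w*x with squared loss; d/dw (w*x - y)^2 = 2*(w*x - y)*x
data = [(x, 2.0 * x) for x in [1.0, 2.0, 3.0, 4.0]]
grad_fn = lambda params, ex: [2.0 * (params[0] * ex[0] - ex[1]) * ex[0]]
w, = sgd(grad_fn, [0.0], data, lr=0.01, epochs=50)
# w converges toward 2.0
```

Because each step sees only one example, updates are noisy but cheap, which is what distinguishes SGD from full-batch gradient descent.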
Broader concept
Narrower concepts
Alternative labels
- SGD
- stochastic gradient algorithm
Examples
- All the language models were trained to minimize the negative log-likelihood of the training data by stochastic gradient algorithms. (Takahashi & Tanaka-Ishii, 2019)
- Standard stochastic gradient descent algorithm is employed to update parameters. (Chen, Sun & Han, 2018)
- We choose stochastic gradient descent algorithm to optimize parameters. (Qian, Sha, Chang, Liu & Zhang, 2017)
- We employ standard stochastic gradient descent algorithm to update the parameters. (An, Bo, Han & Sun, 2019)
- We use the stochastic gradient descent algorithm to resolve the optimization problem and set default values for other learning parameters. (Sun & Xu, 2011)
In other languages
French
- algorithme du gradient stochastique
URI
http://data.loterre.fr/ark:/67375/8LP-HDSWQT4S-9