Concept information
Preferred term
layer normalization
Definition
- A deep learning technique that normalizes the activations of the neurons within each layer of a neural network independently for each input, in order to stabilize training and improve the network's performance. (Based on Mudadla, Layer Normalization, on medium.com, 2023)
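For illustration (not part of the source definition), a minimal NumPy sketch of the computation described above; the function name layer_norm and the parameters gamma and beta (the learnable scale and shift) are assumptions for this example, not from the source.

```python
import numpy as np

def layer_norm(x, gamma, beta, eps=1e-5):
    """Normalize each sample over its feature dimension, then scale and shift."""
    mean = x.mean(axis=-1, keepdims=True)      # per-sample mean over features
    var = x.var(axis=-1, keepdims=True)        # per-sample variance over features
    x_hat = (x - mean) / np.sqrt(var + eps)    # zero mean, unit variance per sample
    return gamma * x_hat + beta                # learnable scale and shift

# Two samples, four features each; each row is normalized independently,
# so the statistics of one sample never affect another.
x = np.array([[1.0, 2.0, 3.0, 4.0],
              [2.0, 4.0, 6.0, 8.0]])
out = layer_norm(x, gamma=np.ones(4), beta=np.zeros(4))
print(out.mean(axis=-1), out.var(axis=-1))  # ~0 and ~1 for every row
```

Because the statistics are computed per sample rather than per batch, the operation behaves identically at training and inference time and does not depend on batch size, unlike batch normalization.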
Broader concept
Example
- Adding layer normalization after the embedding layer incurs a significant penalty on zero-shot generalization. (Le Scao et al., 2022)
- The results show that layer normalization has no visible impact on MNIST. (Aji & Heafield, 2017)
In other languages
- French
URI
http://data.loterre.fr/ark:/67375/8LP-QS6282WL-1