Concept information
Preferred term
long short-term memory
Definition
- A type of cell in a recurrent neural network used to process sequences of data in applications such as handwriting recognition, machine translation, and image captioning. LSTMs address the vanishing gradient problem that occurs when training RNNs due to long data sequences by maintaining history in an internal memory state based on new input and context from previous cells in the RNN. (https://developers.google.com/machine-learning/glossary/#l)
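The internal memory state described above can be sketched as a single forward step of an LSTM cell. This is a minimal illustration in NumPy, not a reference implementation: the gate ordering, weight layout, and variable names are assumptions chosen for clarity.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM cell step (hypothetical layout: gates stacked as i, f, o, g)."""
    z = W @ np.concatenate([x, h_prev]) + b  # pre-activations for all four gates
    H = h_prev.size
    i = sigmoid(z[0:H])        # input gate: how much new information to write
    f = sigmoid(z[H:2*H])      # forget gate: how much of the old state to keep
    o = sigmoid(z[2*H:3*H])    # output gate: how much of the state to expose
    g = np.tanh(z[3*H:4*H])    # candidate update computed from input and context
    c = f * c_prev + i * g     # internal memory state carries history forward
    h = o * np.tanh(c)         # new hidden state passed to the next timestep
    return h, c

# Toy dimensions (assumed): input size 3, hidden size 2
rng = np.random.default_rng(0)
x = rng.normal(size=3)
h, c = np.zeros(2), np.zeros(2)
W = rng.normal(size=(8, 5))   # 4 gates x hidden 2 rows; input 3 + hidden 2 columns
b = np.zeros(8)
h, c = lstm_step(x, h, c, W, b)
```

Because the forget gate is a sigmoid multiplied into the previous cell state (rather than a repeated matrix multiplication), gradients along the memory state can survive long sequences, which is how the cell mitigates the vanishing gradient problem.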
Broader concept
Narrower concepts
Synonym(s)
- LSTM
Example
- Alternative neural network architectures such as recurrent neural networks, convolutional neural networks, and long short-term memories might yield better results. (Haagsma, 2016)
- In this article, we introduce a long short-term memory (LSTM) network architecture to handle the morphological reinflection task. (Chakrabarty & Garain, 2017)
- The architecture is built upon a recurrent layer, namely a Long Short-Term Memory (LSTM), whose goal is to learn an encoding derived from word embeddings, obtained as the output of the recurrent layer at the last timestep. (Menini, Moretti, Corazza, Cabrio, Tonelli & Villata, 2019)
In other languages
French
- LSTM
- mémoire à long terme par transitions à court terme
URI
http://data.loterre.fr/ark:/67375/8LP-PWM24BM7-M