Vocabulary of natural language processing

Concept information

Preferred term

long short-term memory  

Definition

  • A type of cell in a recurrent neural network used to process sequences of data in applications such as handwriting recognition, machine translation, and image captioning. LSTMs address the vanishing gradient problem that arises when training RNNs on long data sequences by maintaining history in an internal memory state, updated from new input and context from previous cells in the RNN. (https://developers.google.com/machine-learning/glossary/#l)
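
The gating mechanism described above can be sketched as a single forward time step in NumPy. This is a minimal illustration, not a reference implementation; the weight layout (four gates stacked in one matrix) and the toy dimensions are assumptions made for brevity.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM time step (illustrative sketch).

    The cell state c carries history across time steps; the forget (f),
    input (i), and output (o) gates decide what to keep, add, and emit,
    which is what lets gradients survive over long sequences.
    """
    z = W @ np.concatenate([x, h_prev]) + b  # pre-activations for all 4 gates
    n = h_prev.size
    f = sigmoid(z[0 * n:1 * n])   # forget gate: how much old state to keep
    i = sigmoid(z[1 * n:2 * n])   # input gate: how much new candidate to add
    g = np.tanh(z[2 * n:3 * n])   # candidate values
    o = sigmoid(z[3 * n:4 * n])   # output gate
    c = f * c_prev + i * g        # updated internal memory state
    h = o * np.tanh(c)            # new hidden state (the cell's output)
    return h, c

# Toy dimensions (assumed for illustration): input size 3, hidden size 2.
rng = np.random.default_rng(0)
W = rng.standard_normal((8, 5)) * 0.1  # 4 gates x hidden rows, input+hidden cols
b = np.zeros(8)
h, c = np.zeros(2), np.zeros(2)
for x in rng.standard_normal((10, 3)):  # run a short input sequence
    h, c = lstm_step(x, h, c, W, b)
print(h.shape, c.shape)
```

Because `c` is updated additively (`f * c_prev + i * g`) rather than through repeated matrix multiplication, gradients along the cell state avoid the repeated squashing that causes vanishing gradients in plain RNNs.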

Synonym(s)

  • LSTM

Example

  • Alternative neural network architectures such as recurrent neural networks, convolutional neural networks, and long short-term memories might yield better results. (Haagsma, 2016)
  • In this article we introduce a long short-term memory (LSTM) network architecture to handle the morphological reinflection task. (Chakrabarty & Garain, 2017)
  • The architecture is built upon a recurrent layer, namely a Long Short-Term Memory (LSTM), whose goal is to learn an encoding derived from word embeddings obtained as the output of the recurrent layer at the last timestep. (Menini, Moretti, Corazza, Cabrio, Tonelli & Villata, 2019)

URI

http://data.loterre.fr/ark:/67375/8LP-PWM24BM7-M

Last modified 5/2/24