Vocabulary of natural language processing

Concept information

Preferred term

long short-term memory  

Definition

  • A type of cell in a recurrent neural network used to process sequences of data in applications such as handwriting recognition, machine translation, and image captioning. LSTMs address the vanishing gradient problem that occurs when training RNNs due to long data sequences by maintaining history in an internal memory state based on new input and context from previous cells in the RNN. (https://developers.google.com/machine-learning/glossary/#l)
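The gating mechanism described in the definition can be sketched as a single-unit LSTM step in pure Python. The parameter values below are illustrative placeholders, not trained weights, and the scalar (one-unit) formulation is a simplification of the usual vector form:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def lstm_step(x, h_prev, c_prev, params):
    # params maps each gate name to its (w_x, w_h, b) triple for the
    # input (i), forget (f), and output (o) gates and the candidate (g).
    def gate(name, squash):
        w_x, w_h, b = params[name]
        return squash(w_x * x + w_h * h_prev + b)

    i = gate("i", sigmoid)    # how much new information to write
    f = gate("f", sigmoid)    # how much of the old cell state to keep
    o = gate("o", sigmoid)    # how much of the cell state to expose
    g = gate("g", math.tanh)  # candidate value to write

    c = f * c_prev + i * g    # additive update: this is what lets
                              # gradients survive long sequences
    h = o * math.tanh(c)      # hidden output passed to the next timestep
    return h, c

# Hypothetical parameters and a toy input sequence, for illustration only
params = {name: (0.5, 0.5, 0.0) for name in ("i", "f", "o", "g")}
h, c = 0.0, 0.0
for x in [1.0, -0.5, 0.25]:
    h, c = lstm_step(x, h, c, params)
```

The key point is the additive form of the cell-state update `c = f * c_prev + i * g`: when the forget gate is near 1, the previous state (and its gradient) passes through nearly unchanged, which is how LSTMs mitigate the vanishing gradient problem mentioned above.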

Broader concept

Narrower concepts

Alternative labels

  • LSTM

Examples

  • Alternative neural network architectures such as recurrent neural networks, convolutional neural networks, and long short-term memories might yield better results. (Haagsma, 2016)
  • In this article, we introduce a long short-term memory (LSTM) network architecture to handle the morphological reinflection task. (Chakrabarty & Garain, 2017)
  • The architecture is built upon a recurrent layer, namely a Long Short-Term Memory (LSTM), whose goal is to learn an encoding derived from word embeddings, obtained as the output of the recurrent layer at the last timestep. (Menini, Moretti, Corazza, Cabrio, Tonelli & Villata, 2019)

In other languages

URI

http://data.loterre.fr/ark:/67375/8LP-PWM24BM7-M

Download this concept:

RDF/XML TURTLE JSON-LD (last modified 2/5/24)