
Vocabulary of natural language processing


Concept information

Preferred term

long short-term memory  

Definition

  • A type of cell in a recurrent neural network used to process sequences of data in applications such as handwriting recognition, machine translation, and image captioning. LSTMs address the vanishing gradient problem that occurs when training RNNs on long data sequences by maintaining history in an internal memory state, updated from new input and context from previous cells in the RNN. (https://developers.google.com/machine-learning/glossary/#l)
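The gating mechanism described above can be sketched in a minimal NumPy step. This is an illustrative implementation under common textbook conventions, not the exact formulation of any particular library: `W`, `U`, and `b` are assumed to hold the stacked parameters for the input, forget, output, and candidate gates.

```python
import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

def lstm_cell(x, h_prev, c_prev, W, U, b):
    """One LSTM step. The internal memory state c carries history
    across timesteps; gates decide what to forget, what new input
    to write, and what to expose as the hidden state h.
    Assumed shapes: x (d,), h_prev/c_prev (n,), W (4n, d),
    U (4n, n), b (4n,) -- hypothetical stacking order i, f, o, g."""
    z = W @ x + U @ h_prev + b       # stacked gate pre-activations
    n = h_prev.shape[0]
    i = sigmoid(z[0*n:1*n])          # input gate: how much new content to write
    f = sigmoid(z[1*n:2*n])          # forget gate: how much history to keep
    o = sigmoid(z[2*n:3*n])          # output gate: how much memory to expose
    g = np.tanh(z[3*n:4*n])          # candidate memory content
    c = f * c_prev + i * g           # additive memory update (helps gradients flow)
    h = o * np.tanh(c)               # new hidden state
    return h, c
```

The additive update of `c` (rather than repeated multiplication through a squashing nonlinearity, as in a plain RNN) is what mitigates the vanishing gradient problem the definition mentions.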

Broader concept

Narrower concepts

Synonym(s)

  • LSTM

Examples

  • Alternative neural network architectures such as recurrent neural networks, convolutional neural networks, and long short-term memories might yield better results. (Haagsma, 2016)
  • In this article, we introduce a long short-term memory (LSTM) network architecture to handle the morphological reinflection task. (Chakrabarty & Garain, 2017)
  • The architecture is built upon a recurrent layer, namely a Long Short-Term Memory (LSTM), whose goal is to learn an encoding derived from word embeddings, obtained as the output of the recurrent layer at the last timestep. (Menini, Moretti, Corazza, Cabrio, Tonelli & Villata, 2019)

Translations

URI

http://data.loterre.fr/ark:/67375/8LP-PWM24BM7-M

Download this concept:

RDF/XML · TURTLE · JSON-LD

Last modified on 02/05/2024