Concept information
Preferred term
bidirectional long-short term memory
Definition
- A Long Short-Term Memory (LSTM) neural network consisting of two LSTM layers: a forward LSTM layer that models the preceding context and a backward LSTM layer that models the following context. (Adapted from Isnain et al., Bidirectional Long Short Term Memory Method and Word2vec Extraction Approach for Hate Speech Detection, IJCCS, 2020)
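The two-layer idea in the definition can be sketched in a few lines. The toy cell below uses a plain tanh recurrence rather than full LSTM gating, and all weights and names are illustrative; only the bidirectional wiring (a forward pass over the sequence, a backward pass over its reverse, states paired per time step) reflects the concept being defined.

```python
import math

def rnn_pass(sequence, w_in=0.5, w_rec=0.3):
    """Run a toy recurrent cell over the sequence, returning one
    hidden state per time step (simplified stand-in for an LSTM)."""
    h, states = 0.0, []
    for x in sequence:
        h = math.tanh(w_in * x + w_rec * h)  # previous context flows into h
        states.append(h)
    return states

def bidirectional_pass(sequence):
    """Pair a forward pass (models the preceding context) with a
    backward pass (models the following context) at each time step."""
    forward = rnn_pass(sequence)               # left-to-right
    backward = rnn_pass(sequence[::-1])[::-1]  # right-to-left, realigned
    return list(zip(forward, backward))

states = bidirectional_pass([1.0, -0.5, 2.0])
```

At time step 0 the forward state has seen only the first token, while the backward state has already consumed the entire suffix; concatenating the two gives each position a representation of its full left and right context.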
Broader concept
Synonym(s)
- bi-directional long-short term memory
- BiLSTM
- BLSTM
Example
- ELMo is contextualized word-level embeddings from the language model (LM) based on multiple layers of bidirectional long-short term memories (Bi-LSTMs) (Hochreiter and Schmidhuber 1997). (Hiai, Shimada, Watanabe, Miura & Iwakura, 2021)
- It models a function using character-based word representations and bidirectional long-short term memories of not only each single word but also the entire input sentence. (Oberstrass, Romberg, Stoll & Conrad, 2019)
- We chose this neural-network tagger because it processes both word- and character-level representations automatically using a combination of a bidirectional Long-Short-Term-Memory (LSTM), a Convolutional Neural-Network (CNN) and a Conditional Random Field (CRF). (Faggionato & Meelen, 2019)
- We implement several baseline approaches of conditional random field (CRF) and recent popular state-of-the-art bi-directional long-short term memory (Bi-LSTM) models. (Ali, Lu & Xu, 2020)
- We see that the BLSTM models always perform better than the corresponding ULSTM models as expected. (Lala, Madhyastha & Specia, 2019)
In other languages
French
- biLMCT
- BLSTM
URI
http://data.loterre.fr/ark:/67375/8LP-Z3GJJPCQ-0