@prefix owl: <http://www.w3.org/2002/07/owl#> .
@prefix skos: <http://www.w3.org/2004/02/skos/core#> .
@prefix dc: <http://purl.org/dc/terms/> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .
@prefix ltk: <http://data.loterre.fr/ark:/67375/LTK> .

<http://data.loterre.fr/ark:/67375/8LP> a owl:Ontology, skos:ConceptScheme .
<http://data.loterre.fr/ark:/67375/8LP-Z3GJJPCQ-0>
  a skos:Concept ;
  skos:prefLabel "bidirectional long-short term memory"@en, "mémoire court et long terme bidirectionnelle"@fr ;
  skos:altLabel "bi-directional long-short term memory"@en, "BiLSTM"@en, "BLSTM"@en, "biLMCT"@fr, "BLSTM"@fr ;
  skos:example "We see that the BLSTM models always perform better than the corresponding UL-STM models as expected. (Lala, Madhyastha & Specia, 2019)"@en, "It models a function using character-based word representations and bidirectional long-short term memories of not only each single word but also the entire input sentence. (Oberstrass, Romberg, Stoll & Conrad, 2019)"@en, "We chose this neural-network tagger because it processes both word- and character-level representations automatically using a combination of a bidirectional Long-Short-Term-Memory (LSTM), a Convolutional Neural-Network (CNN) and a Conditional Random Field (CRF). (Faggionato & Meelen, 2019)"@en, "We implement several baseline approaches of conditional random field (CRF) and recent popular state-of-the-art bi-directional long-short term memory (Bi-LSTM) models. (Ali, Lu & Xu, 2020)"@en, "ELMo is contextualized word-level embeddings from the language model (LM) based on multiple layers of bidirectional long-short term memories (Bi-LSTMs) (Hochreiter and Schmidhuber 1997). (Hiai, Shimada, Watanabe, Miura & Iwakura, 2021)"@en ;
  skos:definition "Réseau récurrent qui traite la donnée courante dans une séquence en tenant compte des données précédentes et des données suivantes. Le plus souvent, il est formé par l'empilement de deux réseaux récurrents LMCT. Un premier réseau LMCT traite les données dans une séquence de gauche à droite et un deuxième réseau LMCT traite les données de la séquence de droite à gauche. (Data Franca)"@fr, "A neural network based on Long Short-Term Memory which consists of two layers of LSTM neural networks, namely the forward LSTM layer to model the preceding context and the backward LSTM layer to model the following context. (Adapted from Isnain et al., Bidirectional Long Short Term Memory Method and Word2vec Extraction Approach for Hate Speech Detection, IJCCS, 2020)"@en ;
  skos:inScheme <http://data.loterre.fr/ark:/67375/8LP> ;
  skos:hiddenLabel "Bidirectional long-short term memory"@en, "Mémoire court et long terme bidirectionnelle"@fr ;
  dc:modified "2024-05-02T12:02:34"^^xsd:dateTime ;
  skos:exactMatch <http://data.loterre.fr/ark:/67375/LTK-BXGJFJWN-Q> ;
  skos:broader <http://data.loterre.fr/ark:/67375/8LP-PL8WWPCP-5> .

<http://data.loterre.fr/ark:/67375/8LP-PL8WWPCP-5>
  a skos:Concept ;
  skos:prefLabel "recurrent neural network"@en, "réseau de neurones récurrents"@fr ;
  skos:narrower <http://data.loterre.fr/ark:/67375/8LP-Z3GJJPCQ-0> .

