Concept information
Preferred term
mBART
Definition
- A sequence-to-sequence denoising auto-encoder pre-trained on large-scale monolingual corpora in many languages using the BART objective. (Liu et al., 2020)
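As a concrete illustration of how such a pre-trained checkpoint is typically used, the sketch below loads a publicly released mBART checkpoint with the Hugging Face transformers library and asks the model to reconstruct a masked input, mirroring the denoising setup of the definition. The checkpoint name facebook/mbart-large-cc25, the language code en_XX, and the library choice are illustrative assumptions, not part of the cited definition.

```python
# A minimal usage sketch (not from the cited definition): load a released
# mBART checkpoint with the Hugging Face `transformers` library and have it
# reconstruct a noised input, mirroring the denoising pre-training setup.
# Checkpoint name, language code and mask usage are illustrative assumptions.
from transformers import MBartForConditionalGeneration, MBartTokenizer

tokenizer = MBartTokenizer.from_pretrained("facebook/mbart-large-cc25", src_lang="en_XX")
model = MBartForConditionalGeneration.from_pretrained("facebook/mbart-large-cc25")

# A corrupted input: one span has been replaced by the mask token.
noised = "mBART is a <mask> pre-trained on large-scale monolingual corpora."
inputs = tokenizer(noised, return_tensors="pt")

# mBART starts decoding from the target-language code token.
output_ids = model.generate(
    **inputs,
    decoder_start_token_id=tokenizer.convert_tokens_to_ids("en_XX"),
    max_length=40,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```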
Broader concept
Example
- Specifically, we augment the original vocabulary of mBART with the names of AMR relations and frames occurring at least 5 times in the gold training corpus. (Cai, Li, Ho, Bing & Lam, 2021; a sketch of this vocabulary extension follows the examples)
- The noise function of mBART replaces text spans of arbitrary length with a mask token (35% of the words in each instance) and permutes the order of sentences. (Kasner & Dušek, 2020; a simplified sketch of this noise function also follows)
- Tran et al. (2020) show that fine-tuning mBART using pseudo-parallel data leads to very promising results, so we use mBART for our own experiments as well. (Belouadi & Eger, 2023)
- We base our approach on the mBART model, which is pre-trained for multilingual denoising. (Kasner & Dušek, 2020)
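The Cai et al. (2021) example above describes extending the mBART vocabulary with AMR relation and frame names. The sketch below shows how such an extension can be done with standard Hugging Face helpers; the checkpoint name and the AMR symbols listed are hypothetical placeholders, not the authors' actual configuration.

```python
# Sketch only: extend the mBART vocabulary with AMR relation/frame names and
# resize the embedding matrix accordingly. The symbols below are hypothetical
# placeholders standing in for names seen at least 5 times in a gold corpus.
from transformers import MBartForConditionalGeneration, MBartTokenizer

tokenizer = MBartTokenizer.from_pretrained("facebook/mbart-large-cc25")
model = MBartForConditionalGeneration.from_pretrained("facebook/mbart-large-cc25")

amr_symbols = [":ARG0", ":ARG1", ":mod", "want-01", "say-01"]

num_added = tokenizer.add_tokens(amr_symbols)     # add new entries to the vocabulary
model.resize_token_embeddings(len(tokenizer))     # grow the embedding matrix to match
print(f"Added {num_added} AMR symbols to the vocabulary.")
```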
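The Kasner & Dušek (2020) example above summarises the mBART noise function: roughly 35% of the words in each instance are replaced by mask tokens in spans of arbitrary length, and the sentence order is permuted. The sketch below is a simplified, word-level approximation of that procedure; the original implementation operates on subword tokens and draws span lengths from a Poisson distribution, so this is illustrative only.

```python
# Simplified, word-level approximation of the mBART noise function described
# above: mask ~35% of the words in random-length spans and permute sentences.
# Not the original implementation, which works on subword tokens.
import random

MASK = "<mask>"

def add_noise(sentences, mask_ratio=0.35, seed=None):
    """Mask ~mask_ratio of the words in random-length spans and permute sentences."""
    rng = random.Random(seed)

    noised = []
    for sentence in sentences:
        words = sentence.split()
        to_mask = int(round(mask_ratio * len(words)))

        masked = list(words)
        while to_mask > 0:
            span_len = rng.randint(1, min(3, to_mask))   # arbitrary (short) span length
            start = rng.randrange(len(masked))
            masked[start:start + span_len] = [MASK]      # whole span -> one mask token
            to_mask -= span_len
        noised.append(" ".join(masked))

    rng.shuffle(noised)                                  # permute the sentence order
    return " ".join(noised)

print(add_noise(
    ["mBART is pre-trained on monolingual corpora.",
     "The noise function masks spans and permutes sentences."],
    seed=0,
))
```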
In other languages
- French
URI
http://data.loterre.fr/ark:/67375/8LP-PDSXQ6Q0-M