Vocabulary of natural language processing

Concept information

Preferred term

autosegmental model  

Example

  • For all languages we get a statistically significant increase in probabilities by adopting the autosegmental model with class nodes and tier-based conditioning. (Futrell, Albright, Graff & O'Donnell, 2017)
  • Existing grammar implementations of tone languages like Chinese (Fang and King 2007) do not appear to make use of autosegmental models either, possibly because the assignment of tone in an isolating language is not as intimately connected to inflectional and derivational processes as it is in a morphologically rich language like Hausa. (Crysmann, 2009)
  • For all but one lexicon we find that the autosegmental models do not significantly outperform the N-gram models on artificial data. (Futrell, Albright, Graff & O'Donnell, 2017)
  • For length 3-5, the autosegmental model assigns the highest probabilities, followed by the N-gram model and BLICK. (Futrell, Albright, Graff & O'Donnell, 2017)
  • On the other hand, if the autosegmental model does better on real data but not artificial data, then we can conclude that it is picking up on some real distinctive structure of that data. (Futrell, Albright, Graff & O'Donnell, 2017)
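
The Futrell, Albright, Graff & O'Donnell (2017) quotations above contrast autosegmental models, in which features such as tone sit on their own tier and are conditioned separately from the segmental string, with flat N-gram models over whole symbols. As a rough illustration of that contrast only (a toy sketch, not the authors' implementation: the words, the syllable+tone symbols, and the simple two-tier bigram factorization are all invented for this example), the following Python snippet scores a word once with a fused bigram model and once as the product of two independent tier models:

```python
# Hypothetical toy sketch: flat bigram over fused syllable+tone symbols
# vs. an autosegmental-style factorization into separate tiers.
from collections import defaultdict
from math import prod

def train_bigram(sequences):
    """Return an add-one-smoothed bigram probability function."""
    counts = defaultdict(lambda: defaultdict(int))
    vocab = {"</s>"}
    for seq in sequences:
        vocab.update(seq)
        padded = ["<s>", *seq, "</s>"]
        for a, b in zip(padded, padded[1:]):
            counts[a][b] += 1
    def prob(seq):
        padded = ["<s>", *seq, "</s>"]
        return prod(
            (counts[a][b] + 1) / (sum(counts[a].values()) + len(vocab))
            for a, b in zip(padded, padded[1:])
        )
    return prob

# Invented words as tone-bearing syllables: "giH" = syllable "gi", high tone.
words = [["giH", "daL"], ["raH", "naH"], ["moL", "taH"], ["baH", "baL"]]

# Flat N-gram baseline: one chain over the fused symbols.
flat = train_bigram(words)

# Tier-based model: factor each word into a segmental tier and a tone
# melody, and multiply the probabilities of the two independent chains.
seg_model = train_bigram([[s[:-1] for s in w] for w in words])
tone_model = train_bigram([[s[-1] for s in w] for w in words])

def tiered(word):
    return seg_model([s[:-1] for s in word]) * tone_model([s[-1] for s in word])

test = ["giH", "daH"]  # fused bigram unseen, but both tiers are familiar
print(f"flat bigram: {flat(test):.3e}")
print(f"tier-based:  {tiered(test):.3e}")
```

On this toy data, the tier-based score stays comparatively high for a word whose fused syllable+tone bigram never occurred in training but whose segmental and tonal tiers are each attested, which is the kind of generalization the quoted model comparisons probe.
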

URI

http://data.loterre.fr/ark:/67375/8LP-XDHV12TQ-Z

Last modified: 7/1/24