
Vocabulary of natural language processing


Concept information

Preferred term

Cohen's kappa  

Definition

Broader concept

Synonym(s)

  • Cohen's kappa coefficient
  • Cohen's Kappa score

Defining context(s)

  • Cohen's kappa coefficient is a statistic that is used to measure inter-rater reliability for categorical items. (Jamali, Yaghoobzadeh & Faili, 2022)

Example(s)

  • Cohen's Kappa scores (Cohen 1960) for POS tags and dependency labels in all evaluation conditions are above 0.96. (Berzak, Kenney, Spadine, Wang, Lam, Mori, Garza & Katz, 2016)
  • Note that computing Cohen's Kappa here is somewhat artificial as in the real world there is an (almost) unlimited space of possible Wikipedia identifiers. (Lin & Zeldes, 2021)
  • The Cohen's kappa coefficients are above 0.7 indicating a high correlation and agreement between the two human annotators. (Li, Zhao, Wen & Song, 2019)
  • To evaluate the annotator's performance on this task we evaluate the inter-annotator agreement (IAA) for each of the 13 topics using Cohen's Kappa (CK) score. (Dalal, Srivastava & Singh, 2023)
  • We computed the inter-annotator agreement in terms of Cohen's Kappa (κ) score (Cohen 1960) to check the validity. (Hossain, Sharif, Hoque & Preum, 2024)
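As an illustration of the statistic described in the defining context above, the following is a minimal sketch (not part of the Loterre record) of Cohen's kappa for two raters, computed as κ = (p_o − p_e) / (1 − p_e), where p_o is the observed agreement and p_e the agreement expected by chance from each rater's label marginals. The function name and example labels are purely illustrative.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters labelling the same categorical items."""
    assert len(rater_a) == len(rater_b) and rater_a, "need paired labels"
    n = len(rater_a)
    # Observed agreement: fraction of items both raters labelled identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of each rater's marginal label frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    # Note: undefined (division by zero) when p_e == 1, i.e. both raters
    # always assign the same single label.
    return (p_o - p_e) / (1 - p_e)

# Hypothetical annotations: two raters agree on 3 of 4 items.
print(cohens_kappa([0, 0, 1, 1], [0, 1, 1, 1]))  # 0.5
```

A kappa of 1 indicates perfect agreement and 0 indicates agreement no better than chance; thresholds such as the 0.7 and 0.96 values quoted in the examples above are conventional cut-offs for "substantial" to "almost perfect" agreement.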

Translations

  • French

  • coefficient de kappa
  • coefficient Kappa
  • coefficient Kappa de Cohen
  • méthode du Kappa
  • κ de Cohen

URI

http://data.loterre.fr/ark:/67375/8LP-N657SLSC-M

Last modified: 26/06/2024