Cohen's kappa is a statistic used when measuring inter-rater reliability: it quantifies the agreement between raters while correcting for the agreement that would be expected by chance alone.
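For illustration, the coefficient is conventionally defined (a standard statistical formula, not part of the original entry) as

\[
\kappa = \frac{p_o - p_e}{1 - p_e}
\]

where \(p_o\) is the observed proportion of agreement between the raters and \(p_e\) is the proportion of agreement expected by chance. As a worked example: if two raters agree on 80% of cases (\(p_o = 0.80\)) and chance alone would produce 50% agreement (\(p_e = 0.50\)), then \(\kappa = (0.80 - 0.50)/(1 - 0.50) = 0.60\). A value of 1 indicates perfect agreement, 0 indicates agreement no better than chance, and negative values indicate agreement worse than chance.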
