Cohen's kappa refers to a statistic that corrects for chance agreement when inter-rater reliability is measured. It is computed as kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed proportion of agreement between the raters and p_e is the proportion of agreement expected by chance alone.
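
As a minimal sketch of that correction (the function name and example data below are hypothetical illustrations; a tested implementation exists as sklearn.metrics.cohen_kappa_score):

from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' labels over the same items."""
    n = len(rater_a)
    # Observed agreement: proportion of items on which the raters match.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: sum over categories of the product of each
    # rater's marginal proportion for that category.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    p_e = sum((counts_a[c] / n) * (counts_b[c] / n) for c in counts_a)
    # Kappa rescales observed agreement by the agreement left
    # after chance agreement is removed.
    return (p_o - p_e) / (1 - p_e)

# Two observers code ten behaviors as "on"-task or "off"-task.
a = ["on", "on", "off", "on", "off", "on", "on", "off", "on", "off"]
b = ["on", "off", "off", "on", "off", "on", "on", "on", "on", "off"]
print(cohens_kappa(a, b))  # about 0.565, below the raw 0.8 agreement

In this example p_o = 0.80 and p_e = 0.54, so kappa = (0.80 - 0.54) / (1 - 0.54), roughly 0.57: chance agreement deflates the raw agreement figure.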

Related Articles

Inter-rater reliability at psychology-glossary.com
Inter-rater reliability is defined as the degree of agreement between two observers who simultaneously . . . Read More
Accuracy at environment-database.eu
Accuracy is the degree to which a calculation, a measurement, or set of measurements agree with a . . . Read More
Calculation at quality-database.eu
In the context of quality management, "calculation" refers to the process of determining numerical values . . . Read More
Kuder-Richardson formula 20 at psychology-glossary.com
Kuder-Richardson formula 20 refers to a formula for computing split-half reliability that corrects for . . . Read More
Classification at quality-database.eu
In the context of quality management, classification refers to the process of grouping . . . Read More
Cooperative Agreement at environment-database.eu
A Cooperative Agreement is an assistance agreement whereby EPA transfers money, property, services or . . . Read More