Interrater reliability (or interjudge reliability) refers to the level of agreement between two or more raters who have evaluated the same individual independently.

Agreement can refer to consensus on behaviors, attributes, and so on. In other words, it is the degree of agreement among raters about their observations of an individual or individuals.

Alternative definition:
The level of agreement between at least two raters who have evaluated the same patient independently. Agreement can refer to consensus on symptoms assigned, diagnoses assigned, and so on.
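Interrater reliability for categorical judgments (such as diagnoses) is commonly quantified with Cohen's kappa, which corrects raw agreement for the agreement expected by chance. As an illustrative sketch (not part of the original entry; the rater data below are hypothetical):

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two raters' categorical judgments of the same cases."""
    assert len(rater1) == len(rater2) and rater1
    n = len(rater1)
    # Observed agreement: proportion of cases on which the raters match.
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Chance agreement: from each rater's marginal category frequencies.
    c1, c2 = Counter(rater1), Counter(rater2)
    p_e = sum(c1[k] * c2[k] for k in c1) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two clinicians independently diagnose six patients.
r1 = ["anxiety", "anxiety", "depression", "depression", "anxiety", "phobia"]
r2 = ["anxiety", "depression", "depression", "depression", "anxiety", "phobia"]
print(round(cohens_kappa(r1, r2), 3))  # prints 0.739
```

A kappa of 1.0 indicates perfect agreement, 0 indicates agreement no better than chance; values above roughly 0.6 are conventionally read as substantial agreement.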
