Equivalence (Interrater Reliability): Cohen's Kappa Statistic.

a) Measures internal consistency
b) Assesses interrater reliability
c) Determines sample size
d) Examines population distribution


1 Answer

Final answer:

Cohen's Kappa Statistic is a measure of interrater reliability that assesses the degree of agreement between raters.

Step-by-step explanation:

Cohen's Kappa Statistic is a measure of interrater reliability, which assesses the degree of agreement between two or more raters when they are evaluating the same data.

For example, if two doctors are reviewing the same set of medical records and they both come to similar conclusions, the interrater reliability is high. On the other hand, if the doctors have different diagnoses for the same patients, the interrater reliability is low.

Cohen's Kappa Statistic is calculated by comparing the observed agreement between raters (Po) to the agreement that would be expected by chance alone (Pe): kappa = (Po - Pe) / (1 - Pe). It ranges from -1 to 1, with values close to 1 indicating high interrater reliability and values at or below 0 indicating agreement no better than chance.
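
As a minimal sketch of that calculation, the Python snippet below computes kappa for two raters labeling the same six cases. The rater label lists are hypothetical examples, not data from the question:

from collections import Counter

rater_a = ["flu", "cold", "flu", "flu", "cold", "flu"]
rater_b = ["flu", "cold", "cold", "flu", "cold", "flu"]
n = len(rater_a)

# Observed agreement Po: fraction of cases where both raters give the same label.
p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n

# Chance agreement Pe: for each label, the probability that both raters would
# pick it independently, summed over all labels.
counts_a = Counter(rater_a)
counts_b = Counter(rater_b)
p_e = sum((counts_a[label] / n) * (counts_b[label] / n)
          for label in counts_a.keys() | counts_b.keys())

kappa = (p_o - p_e) / (1 - p_e)
print(f"observed={p_o:.2f} chance={p_e:.2f} kappa={kappa:.2f}")

For these example labels the raters agree on 5 of 6 cases (Po = 0.83) but would agree on half by chance (Pe = 0.50), giving kappa = 0.67, i.e. substantial but not perfect agreement. The same value can be obtained with sklearn.metrics.cohen_kappa_score(rater_a, rater_b) if scikit-learn is available.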
