Fleiss' kappa

Statistical measure

Fleiss' kappa is a statistical measure for assessing the reliability of agreement between a fixed number of raters when assigning categorical ratings to a set of items. It contrasts with other kappas such as Cohen's kappa, which apply only to agreement between two raters or to a single rater's consistency over time (intra-rater reliability). The measure quantifies the degree of agreement in classification over and above what would be expected by chance.
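
In computational terms, the statistic is κ = (P̄ − P̄ₑ) / (1 − P̄ₑ), where P̄ is the mean observed per-item agreement across rater pairs and P̄ₑ is the agreement expected by chance from the marginal category proportions. The following is a minimal sketch of that computation, assuming a NumPy environment; the function name `fleiss_kappa` and the example counts are illustrative, not taken from any particular source.

```python
import numpy as np

def fleiss_kappa(counts):
    """Fleiss' kappa for an (N items x k categories) count matrix.

    counts[i, j] = number of raters who assigned item i to category j;
    every row must sum to the same number of raters n.
    """
    counts = np.asarray(counts, dtype=float)
    N, k = counts.shape
    n = counts[0].sum()  # raters per item (assumed constant across items)

    # Observed agreement: for each item, the fraction of rater pairs
    # that placed it in the same category.
    P_i = (np.sum(counts ** 2, axis=1) - n) / (n * (n - 1))
    P_bar = P_i.mean()

    # Chance agreement from the marginal proportion of each category.
    p_j = counts.sum(axis=0) / (N * n)
    P_e = np.sum(p_j ** 2)

    return (P_bar - P_e) / (1 - P_e)

# Hypothetical example: 4 items, 2 categories, 2 raters per item.
# Each row gives the per-category vote counts for one item.
ratings = [[2, 0],
           [2, 0],
           [0, 2],
           [1, 1]]
print(fleiss_kappa(ratings))  # P_bar = 0.75, P_e = 0.53125, kappa ~ 0.467
```

A value of 1 indicates complete agreement, while values at or below 0 indicate agreement no better than chance.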
