#cohen's_kappa

Cohen's kappa

Statistic measuring inter-rater agreement for categorical items

Cohen's kappa coefficient is a statistic used to measure inter-rater reliability for qualitative (categorical) items. It is generally considered a more robust measure than a simple percent-agreement calculation, because κ takes into account the possibility of agreement occurring by chance. There is controversy surrounding Cohen's kappa due to the difficulty of interpreting indices of agreement, and some researchers have suggested that it is conceptually simpler to evaluate the disagreement between items instead.
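The statistic is defined as κ = (p_o − p_e) / (1 − p_e), where p_o is the observed agreement between the two raters and p_e is the agreement expected by chance from each rater's marginal label frequencies. As a minimal sketch, assuming exactly two raters labelling the same items and using only the Python standard library (the function name and example data below are illustrative, not from the source):

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters labelling the same items.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement
    and p_e is the agreement expected by chance from each rater's
    marginal label frequencies.
    """
    assert len(rater_a) == len(rater_b) and rater_a, "equal-length, non-empty ratings"
    n = len(rater_a)

    # Observed agreement: fraction of items both raters labelled identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n

    # Chance agreement: product of each rater's marginal probability per label.
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    p_e = sum(freq_a[label] * freq_b.get(label, 0) for label in freq_a) / (n * n)

    return (p_o - p_e) / (1 - p_e)

# Example: two raters classifying 10 items as "yes"/"no".
a = ["yes", "yes", "no", "yes", "no", "no", "yes", "yes", "no", "yes"]
b = ["yes", "no",  "no", "yes", "no", "yes", "yes", "yes", "no", "no"]
print(round(cohen_kappa(a, b), 3))  # 0.4: observed 0.7 vs. chance 0.5
```

In this example the raters agree on 7 of 10 items (p_o = 0.7) while chance alone would produce about 0.5 agreement, so κ = 0.4, illustrating how kappa discounts the agreement attributable to chance.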


Provided by Wikipedia
