Understanding and Computing Cohen’s Kappa: A Tutorial.
Cohen's Kappa (Cohen, 1960) is an index of interrater reliability that is commonly used to measure the level of agreement between two sets of dichotomous ratings or scores. This tutorial explains the underlying logic of Kappa and shows why it is superior to simple percentage of agreement as a measure of interrater reliability. Examples demonstrate how to calculate Kappa both by hand and with SPSS.
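As a rough illustration of the calculation the tutorial describes, Cohen's Kappa is computed as kappa = (P_o - P_e) / (1 - P_e), where P_o is the observed proportion of agreement and P_e is the agreement expected by chance from the raters' marginal proportions. The following minimal Python sketch (not taken from the paper; the function name, example data, and 0/1 coding are illustrative assumptions) shows this hand calculation for two raters and dichotomous ratings:

def cohens_kappa(rater_a, rater_b):
    """Cohen's Kappa for two equal-length lists of category labels."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)

    # Observed agreement: proportion of items both raters label identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n

    # Chance agreement: for each category, the product of the two raters'
    # marginal proportions, summed over categories.
    p_e = sum(
        (rater_a.count(c) / n) * (rater_b.count(c) / n)
        for c in categories
    )

    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: 20 items rated 0/1 by two raters.
a = [1, 1, 0, 1, 0, 1, 1, 0, 0, 1, 1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
b = [1, 0, 0, 1, 0, 1, 1, 0, 1, 1, 1, 0, 1, 0, 0, 0, 1, 0, 1, 1]
print(round(cohens_kappa(a, b), 3))  # P_o = 0.85, P_e = 0.51, kappa ≈ 0.694

Note how the chance-correction matters here: the raw percentage of agreement is 85%, yet Kappa is only about 0.69 once expected chance agreement (51%) is removed, which is the paper's central argument for preferring Kappa over simple percent agreement.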
[1] J. Fleiss et al. (1967). Quantification of agreement in psychiatric diagnosis: A new approach. Archives of General Psychiatry.
[2] J. Cohen (1960). A coefficient of agreement for nominal scales.
[3] B. Everitt et al. (1973). Statistical methods for rates and proportions.