Journal of Mathematics (Jan 2014)

New Interpretations of Cohen’s Kappa

  • Matthijs J. Warrens

DOI: https://doi.org/10.1155/2014/203907
Journal volume & issue: Vol. 2014

Abstract


Cohen’s kappa is a widely used association coefficient for summarizing interrater agreement on a nominal scale. Kappa reduces the ratings of the two observers to a single number. With three or more categories it is more informative to summarize the ratings by category coefficients that describe the information for each category separately. Examples of category coefficients are the sensitivity or specificity of a category and the Bloch-Kraemer weighted kappa. However, in many research studies one is interested only in a single overall number that roughly summarizes the agreement. It is shown that both the overall observed agreement and Cohen’s kappa are weighted averages of various category coefficients and can therefore be used to summarize these category coefficients.
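As a concrete illustration of the quantities the abstract refers to, the sketch below computes the overall observed agreement and Cohen's kappa from a square confusion (agreement) matrix using their standard definitions. This is not the paper's weighted-average decomposition into category coefficients, only the textbook formulas kappa builds on; the example matrix and the function name are illustrative assumptions.

```python
import numpy as np

def cohens_kappa(confusion):
    """Overall observed agreement and Cohen's kappa for two raters.

    `confusion` is a square matrix: rows index the category assigned
    by rater 1, columns the category assigned by rater 2, and entry
    (i, j) counts the objects rated i by rater 1 and j by rater 2.
    """
    c = np.asarray(confusion, dtype=float)
    n = c.sum()
    # Overall observed agreement: proportion of objects on the diagonal.
    p_o = np.trace(c) / n
    # Chance-expected agreement from the marginal (row and column) totals.
    p_e = (c.sum(axis=1) @ c.sum(axis=0)) / n**2
    # Kappa corrects observed agreement for agreement expected by chance.
    kappa = (p_o - p_e) / (1.0 - p_e)
    return p_o, kappa

# Hypothetical ratings of 60 objects into three categories.
m = [[20, 5, 0],
     [3, 15, 2],
     [0, 4, 11]]
p_o, kappa = cohens_kappa(m)
```

With three or more categories, as the abstract notes, these single numbers condense the per-category information (e.g., sensitivity and specificity of each category) into one overall summary.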