
StatsTest Blog

Experimental design, data analysis, and statistical tooling for modern teams. No hype, just the math.

Cohen's Kappa
Relationship · Jan 29

Cohen's Kappa measures agreement between two raters who classify items into discrete categories. Use it to quantify how much the raters agree beyond what chance alone would produce.
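A minimal from-scratch sketch of the statistic: kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement rate and p_e is the agreement expected from the raters' marginal label distributions. The example data and the cross-check against scikit-learn are illustrative assumptions, not a fixed API of this blog.

    from collections import Counter

    def cohen_kappa(rater_a, rater_b):
        """kappa = (p_o - p_e) / (1 - p_e): observed agreement corrected
        for the agreement expected by chance from each rater's marginals."""
        n = len(rater_a)
        # p_o: fraction of items where the two raters gave the same label
        p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        counts_a = Counter(rater_a)
        counts_b = Counter(rater_b)
        # p_e: chance agreement, summed over categories
        p_e = sum(counts_a[c] / n * counts_b[c] / n for c in counts_a)
        return (p_o - p_e) / (1 - p_e)

    a = ["yes", "yes", "no", "yes", "no", "no"]
    b = ["yes", "no", "no", "yes", "no", "yes"]
    print(cohen_kappa(a, b))  # ~0.33: agreement is 2/3, chance is 1/2

    # Cross-check, if scikit-learn is installed:
    # from sklearn.metrics import cohen_kappa_score
    # print(cohen_kappa_score(a, b))

Here p_o = 4/6 and p_e = 0.5 (both raters split yes/no evenly), so kappa = (0.667 - 0.5) / 0.5 ≈ 0.33; a kappa of 0 means no agreement beyond chance, and 1 means perfect agreement.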

Krippendorff's Alpha
Relationship · Jan 29

Krippendorff's Alpha measures inter-rater reliability for any number of raters, any number of categories, and any measurement level (nominal, ordinal, interval, or ratio), and it tolerates missing ratings. Use it as a general-purpose agreement statistic.
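A minimal sketch using the third-party `krippendorff` package (pip install krippendorff); the exact call signature below follows that package's documented API but should be treated as an assumption, and the ratings matrix is made-up example data.

    import numpy as np
    import krippendorff

    # Rows are raters, columns are the items they rated; np.nan marks a
    # missing rating, which alpha handles natively.
    ratings = np.array([
        [1,      2, 3, 3, 2, 1, np.nan],
        [1,      2, 3, 3, 2, 2, 3],
        [np.nan, 3, 3, 3, 2, 2, 3],
    ])

    alpha = krippendorff.alpha(reliability_data=ratings,
                               level_of_measurement="nominal")
    print(alpha)

Switching `level_of_measurement` to "ordinal", "interval", or "ratio" changes only the distance function used to weigh disagreements; the data layout stays the same.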