Cohen's Kappa is calculated using the following formula:
\[ \kappa = \frac{P_o - P_e}{1 - P_e} \]
Where:
- \( P_o \) is the observed agreement among raters.
- \( P_e \) is the expected agreement by chance.
The value of Cohen's Kappa ranges from -1 to 1:
- 1 indicates perfect agreement.
- 0 indicates agreement no better than chance.
- -1 indicates perfect disagreement.
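The formula can be computed directly from two raters' label sequences. Below is a minimal sketch (the function name `cohens_kappa` and the example labels are illustrative, not from a specific library): \( P_o \) is the fraction of items where the raters agree, and \( P_e \) sums, over each category, the product of the two raters' marginal probabilities of choosing that category.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Compute Cohen's Kappa for two raters' label sequences of equal length."""
    assert len(rater_a) == len(rater_b) and rater_a, "need two equal-length, non-empty sequences"
    n = len(rater_a)
    # Observed agreement P_o: fraction of items both raters labeled identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement P_e: for each category, the probability that
    # both raters pick it independently, given their marginal frequencies.
    counts_a = Counter(rater_a)
    counts_b = Counter(rater_b)
    p_e = sum((counts_a[c] / n) * (counts_b[c] / n) for c in counts_a)
    return (p_o - p_e) / (1 - p_e)

# Example: two raters classify 4 items as "yes"/"no".
# P_o = 3/4; P_e = (2/4)(1/4) + (2/4)(3/4) = 0.5; kappa = (0.75 - 0.5)/(1 - 0.5) = 0.5
kappa = cohens_kappa(["yes", "yes", "no", "no"], ["yes", "no", "no", "no"])
print(kappa)  # → 0.5
```

Note that the denominator \( 1 - P_e \) is zero when both raters use a single category for every item, so that degenerate case must be handled separately in production code.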