In my research I use thematic content analysis to describe legal documents. After building our coding framework, we assigned a sample of the data to 3 independent raters. I asked them to code the data and, upon receiving their codes back, I calculated simple percent agreement by dividing the number of matching codes by the total number of codes. The simple percentage agreement between the 3 raters is 88%. I then tried to calculate Krippendorff's alpha using ReCal3 (online), Real Statistics (Excel add-in), and an R script (the icr package).
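In case it helps, this is roughly how I arrive at the percentage figure; it is a minimal sketch assuming that "agreement" means the proportion of codes on which a pair of raters gives the same 0/1 value, averaged over the three rater pairs. The small matrix is made up for illustration only, not my real data.

```r
# Toy reliability matrix: 3 raters (rows) x 26 codes (columns),
# 1 = code present, 0 = code absent (illustrative values only)
set.seed(1)
ratings <- matrix(rbinom(3 * 26, size = 1, prob = 0.9), nrow = 3, ncol = 26)

# Simple pairwise percent agreement, averaged over the 3 rater pairs
pairs <- combn(nrow(ratings), 2)
pair_agreement <- apply(pairs, 2, function(p) mean(ratings[p[1], ] == ratings[p[2], ]))
mean(pair_agreement)  # overall proportion of matching codes
```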
I always get aberrant values: -0.46 or -0.26, or a really low alpha like 0.10 or 0.11.
Why do you think this happens even though I have 88% agreement?
My table has 3 rows (corresponding to the raters) and 26 columns (corresponding to the codes). The data are nominal: they only reflect the presence (1) or absence (0) of each category for each coder. The table is in the attached file; a sketch of the layout and the R call I use is below.
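For completeness, this is the kind of call I run in R; I am assuming the icr package's krippalpha() function with coders in rows and units (here, the 26 codes) in columns, treated as nominal data. The matrix below only stands in for the attached file.

```r
library(icr)

# Same layout as my table: 3 raters (rows) x 26 codes (columns), 0/1 values
# (illustrative values only; the real data are in the attached file)
set.seed(1)
ratings <- matrix(rbinom(3 * 26, size = 1, prob = 0.9), nrow = 3, ncol = 26)

# Krippendorff's alpha for nominal (presence/absence) data
krippalpha(ratings, metric = "nominal")
```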
Thank you