Hi!

I am currently doing an intercoder reliability test for my PhD. I have conducted 17 semi-structured interviews and segmented/coded them in ATLAS.ti.

Each code belongs to an upper category: for example, the subcodes “parent”, “uncle”, and “child” belong to the upper category “family”. I have allowed subcodes within the same upper category to overlap while coding, tagging, for example, both “parent” and “child” in instances where both are mentioned in the same segment.

I have had a second coder (me being the first) code a sample of the data for an ICR test.

After coding, I realized that Cohen’s Kappa and Krippendorff’s Alpha do not allow more than one code per category for each coded segment. I should have checked this prior to coding, and I now have to work with what I have.

So I am now asking for advice on which method, or methods, to apply in this situation. I plan to use two, but I don’t know which ones to choose.

I could use a “Fuzzy Kappa” model, which is modelled after Cohen’s Kappa to suit these kinds of overlapping cases. There does not appear to be a standard interpretation of Fuzzy Kappa, however, and Cohen’s Kappa itself still faces criticism for its shortcomings, so I’m not completely sure about this. Another option is to calculate a basic Cohen’s Kappa on each code separately and then take the average, which is advice I have seen on this forum before.
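To make that second option concrete, here is a rough sketch of what I have in mind, assuming I export each coder’s work from ATLAS.ti as a table with one row per segment and one 0/1 presence/absence column per subcode (the export format, column names, and toy values below are my own assumptions, purely for illustration):

```python
# Sketch: Cohen's Kappa per subcode (treated as a binary present/absent decision),
# then averaged across subcodes. Toy data, not my real codings.
import pandas as pd
from sklearn.metrics import cohen_kappa_score

def mean_per_code_kappa(coder1: pd.DataFrame, coder2: pd.DataFrame) -> float:
    """Compute Cohen's Kappa for each subcode column and return the mean."""
    kappas = pd.Series({
        code: cohen_kappa_score(coder1[code], coder2[code])
        for code in coder1.columns
    }, name="kappa")
    print(kappas)          # per-code kappas
    return kappas.mean()   # the averaged figure I would report

# Three segments, three subcodes of the upper category "family" (invented values):
c1 = pd.DataFrame({"parent": [1, 0, 1], "uncle": [0, 0, 1], "child": [1, 1, 0]})
c2 = pd.DataFrame({"parent": [1, 0, 0], "uncle": [0, 0, 1], "child": [1, 1, 0]})
print("Mean kappa across codes:", mean_per_code_kappa(c1, c2))
```

One thing I am aware of with this approach: codes that both of us applied very rarely (or to almost every segment) can give undefined or unstable kappas, which would pull the average around.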

I could also go back to my data and split the codes or redefine the upper categories to avoid overlap, so that I could attempt a Krippendorff’s Alpha test. However, I am not sure I could get the second coder to come back to split their codes, and redefining the categories would be counterintuitive. Krippendorff’s Alpha seems like the more reliable method, but it also has stricter requirements than my resources may allow (it requires specific sample sizes and does not allow the primary researcher/code developer to be one of only two raters).
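For reference, this is roughly the shape the data would have to take for that route: within one upper category, each segment gets exactly one subcode label per coder, which is precisely what my overlapping subcodes break. The sketch below uses the Python krippendorff package with invented labels, just to illustrate the required input format:

```python
# Sketch: Krippendorff's Alpha for one upper category ("family") after the
# overlap has been removed, i.e. one subcode label per segment per coder.
# Requires `pip install krippendorff`; the labels below are invented.
import numpy as np
import krippendorff

labels = {"parent": 0, "uncle": 1, "child": 2}
coder1 = ["parent", "child", "uncle", "parent"]   # one label per segment
coder2 = ["parent", "child", "parent", "parent"]

# Rows = coders, columns = segments (np.nan would mark uncoded segments).
reliability_data = np.array([
    [labels[c] for c in coder1],
    [labels[c] for c in coder2],
], dtype=float)

alpha = krippendorff.alpha(reliability_data=reliability_data,
                           level_of_measurement="nominal")
print("Krippendorff's alpha, family category:", alpha)
```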

Any ideas on which methods to use or combine would be of great help!
