Here we give a general outline of how you can compare coding between two colleagues in NVivo.
A Coding Comparison query enables you to compare coding done by two users or two groups of users.
It provides two ways of measuring 'inter-rater reliability', or the degree of agreement between the users: percentage agreement and the Kappa coefficient.
- Percentage agreement is the number of units of agreement divided by the total number of units of measure within the data item, displayed as a percentage.
- Kappa coefficient is a statistical measure which takes into account the amount of agreement that could be expected to occur through chance (see the sketch after this list).
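To make the two measures concrete, here is a minimal sketch in Python. It is not NVivo's internal implementation (NVivo works on units of the source content, such as characters), and it assumes each text unit is simply recorded as coded (1) or not coded (0) at a node by each coder:

```python
# Minimal sketch: percentage agreement and Cohen's kappa for two coders.
# Assumes each text unit is recorded as coded (1) or not coded (0) at a node;
# this is an illustration, not NVivo's internal calculation.

def percentage_agreement(coder_a, coder_b):
    """Share of units on which both coders made the same decision."""
    matches = sum(1 for a, b in zip(coder_a, coder_b) if a == b)
    return 100.0 * matches / len(coder_a)

def cohen_kappa(coder_a, coder_b):
    """Observed agreement corrected for the agreement expected by chance."""
    n = len(coder_a)
    observed = sum(1 for a, b in zip(coder_a, coder_b) if a == b) / n
    # Chance agreement from each coder's marginal probability of coding a unit
    p_a = sum(coder_a) / n
    p_b = sum(coder_b) / n
    expected = p_a * p_b + (1 - p_a) * (1 - p_b)
    return 1.0 if expected == 1 else (observed - expected) / (1 - expected)

# Hypothetical coding decisions for ten text units at one node
a = [1, 1, 0, 0, 1, 0, 1, 1, 0, 0]
b = [1, 1, 0, 1, 1, 0, 1, 0, 0, 0]
print(percentage_agreement(a, b))  # 80.0
print(cohen_kappa(a, b))           # ~0.6
```

With these hypothetical lists, the coders agree on 8 of the 10 units (80%), but because each coder codes half the units, 50% agreement would be expected by chance alone, so kappa only credits the agreement above that baseline.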
To start a coding comparison
On the Explore tab, in the Query group, click Coding Comparison (see below).
The Coding Comparison Query dialogue box opens (see below).
- Select the users to include in user groups A and B.
- Click Select to choose specific nodes, or nodes in selected sets, classifications or Search Folders.
- In the Scope box, click Select to choose specific sources.
- Select what you want to include in the results:
  - Select Display Kappa Coefficient to show this in the results.
  - Select Display percentage agreement to show this in the results.
Coding comparison result
- The node that contains the coding that is being compared.
- The source name and source folder location.
- The Kappa coefficient (only if you select Display Kappa Coefficient):
  - If the users are in complete agreement, the Kappa coefficient (K) = 1.
  - If there is no agreement among the raters (other than what would be expected by chance), the Kappa coefficient (K) ≤ 0. (See the worked example after this list.)
- The green columns show the percentage agreement.
- The red columns show the percentage disagreement.
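As a rough illustration of those two extremes, the short snippet below uses scikit-learn's implementation of Cohen's kappa (a library choice made here for convenience; NVivo performs the calculation itself). Identical coding yields K = 1, while coding that agrees only as often as chance would predict yields K = 0, even though the raw percentage agreement in that case is 50%.

```python
# Illustration of the Kappa coefficient at the extremes, using scikit-learn's
# cohen_kappa_score (chosen here for convenience; NVivo computes Kappa itself).
from sklearn.metrics import cohen_kappa_score

# Complete agreement between coder A and coder B -> K = 1
print(cohen_kappa_score([1, 1, 0, 0, 1, 0], [1, 1, 0, 0, 1, 0]))  # 1.0

# 50% raw agreement, but exactly what chance would predict -> K = 0
print(cohen_kappa_score([1, 1, 0, 0], [1, 0, 1, 0]))  # 0.0
```

This is why the Kappa coefficient is the more conservative of the two measures: percentage agreement alone would report 50% in the second case.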
Notice
To save the Coding Comparison query, select the Add to Project checkbox and enter a name and an optional description on the General tab.
You can read more about coding comparison here.