Here we give a generic outline of how you can compare coding between colleagues.
The first option is to compare the coding visually using coding stripes (only available in NVivo for Windows); the second option is to run a Coding Comparison query.
1) Compare your coding using coding stripes
Coding stripes are colored bars that show which codes have been applied to the content you are viewing. If several users have been coding in the same NVivo project, you can compare their coding visually using the coding stripes.
Show coding stripes for specific users
1) Open the file, code, case or memo in the Detail View
2) On the ribbon tab for the item, click Coding Stripes and then choose the option:
- Selected Items, then choose the following:
  - Users. Show a coding stripe for each user to compare patterns of coding by team members.
You can also find out more about the users who did the coding by hovering over a coding stripe to see the user's initials, or by showing sub-stripes (only in Windows). This way, you can easily see who coded which parts of a document.
2) Compare your coding using the coding comparison query
A Coding Comparison query enables you to compare coding done by two users or two groups of users.
It provides two ways of measuring inter-rater (inter-coder) reliability, i.e. the degree of agreement between the users: percentage agreement and the Kappa coefficient.
- Percentage agreement is the number of units of agreement divided by the total number of units of measure within the data item, displayed as a percentage.
- Kappa coefficient is a statistical measure that takes into account the amount of agreement that could be expected to occur through chance (see the sketch after this list for how both measures can be calculated).
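To make the two measures more concrete, here is a minimal sketch of how they can be calculated for one code in one source, assuming (as a simplification) that coding is recorded as a per-character yes/no decision for each coder. The function names and the sample data are our own illustration, not NVivo's implementation, and NVivo's exact calculation may differ in detail.

```python
def percentage_agreement(coder_a, coder_b):
    """Share of units (here: characters) on which both coders made the same decision."""
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return 100.0 * matches / len(coder_a)


def kappa(coder_a, coder_b):
    """Cohen's kappa: agreement corrected for the agreement expected by chance."""
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Chance agreement, estimated from each coder's overall proportion of coded units.
    p_a = sum(coder_a) / n          # proportion of units coder A coded
    p_b = sum(coder_b) / n          # proportion of units coder B coded
    expected = p_a * p_b + (1 - p_a) * (1 - p_b)
    if expected == 1.0:             # both coders coded everything (or nothing)
        return 1.0
    return (observed - expected) / (1 - expected)


# Example: True means the character is coded at the code, False means it is not.
coder_a = [True, True, True, False, False, False, False, False, True, True]
coder_b = [True, True, False, False, False, False, False, True, True, True]

print(f"Percentage agreement: {percentage_agreement(coder_a, coder_b):.0f}%")  # 80%
print(f"Kappa coefficient:    {kappa(coder_a, coder_b):.2f}")                  # 0.60
```

In this example the coders agree on 8 of 10 characters (80%), while 50% agreement would be expected by chance, giving a Kappa of 0.60.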
To start a coding comparison
On the Explore tab, in the Query group, click Coding Comparison.
The Coding Comparison Query dialogue box opens. In the dialogue box:
- Select the users to include in user groups A and B.
- Click Select to choose specific codes, or codes in selected sets, classifications or Search Folders.
- In the Scope box, click Select to choose specific sources.
- Select what you want to include in the results:
  - Select Display Kappa Coefficient to show the Kappa coefficient in the results.
  - Select Display percentage agreement to show the percentage agreement in the results.
Coding comparison result
- The code that contains the coding that is being compared.
- The source name and source folder location.
- The Kappa coefficient (only if you selected Display Kappa Coefficient); a small worked example of how to read this value follows this list.
  - If the users are in complete agreement, the Kappa coefficient (K) = 1.
  - If there is no agreement among the raters (other than what would be expected by chance), the Kappa coefficient (K) ≤ 0.
- The green columns show the percentage agreement.
- The red columns show the percentage disagreement.
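As a small worked example of how to read these Kappa values: the measure is generally based on the standard Cohen's kappa formula, K = (observed agreement − agreement expected by chance) / (1 − agreement expected by chance), although NVivo's exact calculation may differ in detail. For instance, if two coders agree on 80% of the characters in a source while 50% agreement would already be expected by chance, then K = (0.80 − 0.50) / (1 − 0.50) = 0.60. If the coders agree on every character, K = 1, and if their agreement is no better than what chance would predict, K = 0 or below.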
Notice
To save the Coding Comparison query, select the Add to Project checkbox in the dialogue box and enter a name and, optionally, a description on the General tab.
You can read more about coding comparison in the NVivo Help documentation.