
Overall Percent Agreement

A measure of agreement between raters. To calculate the overall percent agreement, divide the total number of readings on which the raters agree by the total number of readings (classifications) made.
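The calculation above can be sketched as follows (a minimal example with hypothetical ratings from two raters; the category labels and data are illustrative only):

```python
# Hypothetical classifications of six subjects by two raters
rater1 = ["pos", "neg", "pos", "pos", "neg", "pos"]
rater2 = ["pos", "neg", "neg", "pos", "neg", "pos"]

# Count the readings on which the raters agree
agreements = sum(a == b for a, b in zip(rater1, rater2))

# Divide by the total number of readings
percent_agreement = agreements / len(rater1)
print(percent_agreement)  # 5 agreements out of 6 readings ≈ 0.833
```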

There are two problems with this estimate.

  1. Often one category is clearer than the others, so the two observers are likely to agree on most subjects in that category. The percent agreement may therefore be high only because a large number of subjects clearly belong to that category. A correction for this problem is to calculate the percent agreement after excluding the agreements in that category.
  2. Another problem is that the procedure does not take into account the agreement expected solely by chance. A correction for this problem is to use a kappa statistic to measure the agreement between raters.
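Both corrections can be sketched with the same hypothetical data (the ratings, category labels, and the choice of "pos" as the clear category are assumptions for illustration; the kappa formula shown is Cohen's kappa, which uses each rater's marginal proportions to estimate chance agreement):

```python
from collections import Counter

rater1 = ["pos", "neg", "pos", "pos", "neg", "pos"]
rater2 = ["pos", "neg", "neg", "pos", "neg", "pos"]
n = len(rater1)

# Correction 1: percent agreement excluding agreements in the clear
# category (assumed here to be "pos")
pairs = list(zip(rater1, rater2))
kept = [(a, b) for a, b in pairs if not (a == b == "pos")]
agreement_excl = sum(a == b for a, b in kept) / len(kept)

# Correction 2: Cohen's kappa
# Observed agreement
p_o = sum(a == b for a, b in pairs) / n
# Expected agreement by chance, from each rater's marginal proportions
c1, c2 = Counter(rater1), Counter(rater2)
p_e = sum((c1[cat] / n) * (c2[cat] / n) for cat in set(rater1) | set(rater2))
kappa = (p_o - p_e) / (1 - p_e)
print(round(kappa, 3))  # (5/6 - 1/2) / (1 - 1/2) ≈ 0.667
```

Note that kappa (≈0.67 here) is lower than the raw percent agreement (≈0.83), because part of the observed agreement is attributable to chance.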
-- ErinEsp - 25 Jul 2010