Wednesday, 12 June 2019

Inter-rater Reliability

1. Calculating Inter-Rater Reliability/Agreement in Excel

2. SPSS Tutorial: Inter- and Intra-rater Reliability (Cohen's Kappa, ICC)
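
Since the tutorial above covers Cohen's kappa, here is a minimal Python sketch of the statistic itself; the ratings are made-up (1 = pass, 0 = fail), not from any real marking exercise:

    import numpy as np

    # Hypothetical data: two raters classify the same 10 scripts
    # as pass (1) or fail (0).
    rater_a = np.array([1, 1, 0, 1, 0, 1, 1, 0, 1, 0])
    rater_b = np.array([1, 0, 0, 1, 0, 1, 1, 1, 1, 0])

    def cohens_kappa(r1, r2):
        """Chance-corrected agreement between two raters."""
        labels = np.union1d(r1, r2)
        p_o = np.mean(r1 == r2)  # observed proportion of agreement
        # Agreement expected by chance, from each rater's marginal proportions
        p_e = sum(np.mean(r1 == k) * np.mean(r2 == k) for k in labels)
        return (p_o - p_e) / (1 - p_e)

    print(f"kappa = {cohens_kappa(rater_a, rater_b):.3f}")  # ~0.583 here

SPSS reports the same statistic through its Crosstabs procedure.
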
Calculator of measurement errors (kappa CI and SEM):
https://www.dropbox.com/s/qr8hnaa4fqdy8gx/Kappa%20CI%20calculator%20and%20SEM%20calculator%20.xlsx?dl=0
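
As a rough sketch of the two quantities such a calculator typically computes, with made-up input values standing in for one's own data:

    import math

    # Hypothetical inputs (e.g. taken from SPSS output)
    sd, reliability = 4.2, 0.85   # score SD and a reliability coefficient (e.g. ICC)
    kappa, se_kappa = 0.72, 0.08  # kappa and its standard error

    # Standard error of measurement: SEM = SD * sqrt(1 - reliability)
    sem = sd * math.sqrt(1 - reliability)

    # Approximate 95% confidence interval for kappa: kappa +/- 1.96 * SE
    ci = (kappa - 1.96 * se_kappa, kappa + 1.96 * se_kappa)

    print(f"SEM = {sem:.2f}; 95% CI for kappa = ({ci[0]:.2f}, {ci[1]:.2f})")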




https://www.researchgate.net/post/Intraclass_correlation_coefficient_when_data_are_not_normally_distributed
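
For continuous marks, the ICC mentioned above can also be computed directly. A minimal sketch using the third-party pingouin package (assuming it is installed), again with made-up marks:

    import pandas as pd
    import pingouin as pg

    # Hypothetical long-format data: 5 students, each marked by 2 raters
    df = pd.DataFrame({
        "student": [1, 2, 3, 4, 5] * 2,
        "rater":   ["A"] * 5 + ["B"] * 5,
        "mark":    [72, 65, 80, 58, 70, 68, 70, 82, 60, 65],
    })

    # Returns a table of ICC variants (ICC1, ICC2, ICC3 and their averages)
    icc = pg.intraclass_corr(data=df, targets="student",
                             raters="rater", ratings="mark")
    print(icc[["Type", "ICC", "CI95%"]])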


3. What is the difference between intra-rater and inter-rater reliability?

i.  Carry out a rank-order correlation of the first and second sets of marks.
    This is to check ‘intra-rater reliability’ (a worked Python sketch follows the table below).



i. Intra-rater reliability


            | 1st time           | 2nd time           |           |
            | Mark   | Rank (R1) | Mark   | Rank (R2) | d = R1-R2 | d²
Student 1   |        |           |        |           |           |
Student 2   |        |           |        |           |           |
Student 3   |        |           |        |           |           |
Student 4   |        |           |        |           |           |
Student 5   |        |           |        |           |           |
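
Once the table is filled in, the correlation can be computed by hand with Spearman's formula or in one call with scipy; a sketch with made-up marks:

    import numpy as np
    from scipy import stats

    # Hypothetical marks given by the same rater to 5 students, twice
    first_time  = np.array([72, 65, 80, 58, 70])
    second_time = np.array([70, 66, 78, 55, 74])

    # Rank the marks (ties would get average ranks)
    r1 = stats.rankdata(first_time)
    r2 = stats.rankdata(second_time)

    # The table's d and d² columns, then Spearman's formula
    # rho = 1 - 6 * sum(d²) / (n * (n² - 1))   (exact when there are no ties)
    d = r1 - r2
    n = len(d)
    rho = 1 - 6 * np.sum(d**2) / (n * (n**2 - 1))
    print(f"intra-rater rho = {rho:.3f}")  # 0.900 for these marks

    # scipy gives the same value (plus a p-value) directly
    print(stats.spearmanr(first_time, second_time))
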
ii. Correlate the first set of marks with another person’s first set of marks.
    This is to check ‘inter-rater reliability’ (see the sketch after the table below).


ii. Inter-rater reliability


            | Own 1st time       | Other 1st time     |           |
            | Mark   | Rank (R1) | Mark   | Rank (R0) | d = R1-R0 | d²
Student 1   |        |           |        |           |           |
Student 2   |        |           |        |           |           |
Student 3   |        |           |        |           |           |
Student 4   |        |           |        |           |           |
Student 5   |        |           |        |           |           |
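
The same calculation applies between raters; a sketch comparing one's own first marks with another rater's (again, made-up values):

    from scipy import stats

    # Hypothetical marks: own first marking vs. another rater's first marking
    own_marks   = [72, 65, 80, 58, 70]
    other_marks = [68, 70, 82, 60, 65]

    rho, p = stats.spearmanr(own_marks, other_marks)
    print(f"inter-rater rho = {rho:.3f} (p = {p:.3f})")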



Read more on:

Factors That Can Affect Inter-rater Reliability

There are several factors that can affect inter-rater reliability:

  • Knowledge background.

Markers’ knowledge backgrounds can affect the reliability of the marking process: their views on certain issues may differ, which reduces their agreement and leads to differences in evaluation during marking.

  • Markers’ experience in marking.

A group of markers with markedly different levels of experience may disagree with one another. Experience shapes their perceptions of candidates’ performances, making it a major factor affecting inter-rater reliability.

