Publications

When coders are reliable: the application of three measures to assess inter-rater reliability/agreement with doctor-patient communication data coded with the VR-CoDES.  (2011)

Authors:
Fletcher, I.; Mazzi, Maria Angela; Nuebling, M.
Title:
When coders are reliable: the application of three measures to assess inter-rater reliability/agreement with doctor-patient communication data coded with the VR-CoDES.
Year:
2011
Type of item:
Journal article
ANVUR type:
Journal article
Language:
English
Format:
Print
Referee:
Name of journal:
Patient Education and Counseling
ISSN of journal:
0738-3991
Volume:
82
Page numbers:
341-345
Keyword:
Inter-rater study; Kappa; Intraclass correlation coefficient; Sensitivity and specificity; VR-CoDES
Short description of contents:
To investigate whether different measures of inter-rater reliability compute similar estimates with nominal data commonly encountered in communication studies, and to make recommendations on how reliability should be computed and described for communication coding instruments. The raw data from an inter-rater study with three coders were analysed with Cohen's κ, sensitivity and specificity measures, Fleiss's multirater κj, and an intraclass correlation coefficient (ICC). Minor differences were found between Cohen's κ and an ICC model across paired data (largest margin = 0.01). There were negligible differences between the multirater estimates, e.g. κj (0.52) and ICC (0.53). Sensitivity analyses were in general agreement with the multirater estimates. It is more practical to analyse nominal data with more than two raters using an appropriate ICC model for inter-rater studies, and little difference exists between Cohen's κ and an ICC. Alternatives to Cohen's κ are readily available, but researchers need to be aware of the different ICC definitions, and an ICC model should be fully described in reports. Investigators are encouraged to supply confidence limits with inter-rater data, and to revisit guidance regarding the relative strengths of agreement of reliability coefficients.
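As a minimal illustrative sketch (not the authors' code), the comparison described above can be reproduced in outline with standard Python libraries: pairwise Cohen's κ between two coders and a multirater Fleiss-type κ across all three. The example data, variable names, and library choices are assumptions made here for illustration only.

# Hypothetical nominal codes (categories 0/1/2) for 10 utterances by 3 coders
import numpy as np
from sklearn.metrics import cohen_kappa_score
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

ratings = np.array([
    [0, 0, 0], [1, 1, 2], [2, 2, 2], [0, 1, 0], [1, 1, 1],
    [2, 2, 1], [0, 0, 0], [1, 2, 1], [2, 2, 2], [0, 0, 1],
])

# Pairwise agreement between coders 1 and 2 (Cohen's kappa)
kappa_12 = cohen_kappa_score(ratings[:, 0], ratings[:, 1])

# Multirater agreement across all three coders (Fleiss' kappa)
table, _ = aggregate_raters(ratings)
kappa_fleiss = fleiss_kappa(table, method='fleiss')

print(f"Cohen's kappa (coders 1 vs 2): {kappa_12:.2f}")
print(f"Fleiss' kappa (3 coders):      {kappa_fleiss:.2f}")

As the abstract notes, for paired nominal data an appropriately specified ICC model gives estimates very close to Cohen's κ; whichever coefficient is reported, the model definition and confidence limits should be stated.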
Web page:
http://dx.doi.org/10.1016/j.pec.2011.01.004
Product ID:
60646
Handle IRIS:
11562/353343
Deposited On:
May 27, 2011
Last Modified:
November 15, 2022
Bibliographic citation:
Fletcher, I.; Mazzi, Maria Angela; Nuebling, M., When coders are reliable: the application of three measures to assess inter-rater reliability/agreement with doctor-patient communication data coded with the VR-CoDES. «Patient Education and Counseling», vol. 82, 2011, pp. 341-345

See the full record in the University's institutional research repository, IRIS.
