Seven reliability indices for high-stakes decision making: Description, selection, and simple calculation
Published in | Psychology in the Schools, Vol. 48, No. 10, pp. 1064–1075 |
---|---|
Main Authors | , , |
Format | Journal Article |
Language | English |
Published | Hoboken: Wiley Subscription Services, Inc., A Wiley Company / John Wiley & Sons, Inc, 01.12.2011 |
Subjects | |
Summary | The reliability of data is a critical issue in decision‐making for practitioners in the schools. Percent Agreement and Cohen's kappa are the two most widely reported indices of inter‐rater reliability; however, a recent Monte Carlo study on the reliability of multi‐category scales found other indices to be more trustworthy, depending on the type of data and the number of categories. This manuscript presents defensible decision steps, methods, and rationale for practitioners to select and calculate inter‐rater reliability. In addition to screen shots from readily available calculation tools such as Excel and the Vassar College online utilities (Lowry, 2010), decision guides and a flow chart are presented in several figures and tables for easy reference. © 2011 Wiley Periodicals, Inc. |
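As a quick illustration of the two indices named in the summary (this sketch is not taken from the article, which covers seven indices and their selection; the function names and ratings below are invented for the example), the following Python code computes Percent Agreement and Cohen's kappa for two raters assigning items to nominal categories:

```python
# Minimal sketch: percent agreement and Cohen's kappa for two raters
# scoring the same items on a nominal (multi-category) scale.
from collections import Counter


def percent_agreement(rater_a, rater_b):
    """Proportion of items on which the two raters assign the same category."""
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)


def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement: kappa = (p_o - p_e) / (1 - p_e)."""
    n = len(rater_a)
    p_o = percent_agreement(rater_a, rater_b)          # observed agreement
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    # Expected chance agreement from each rater's marginal category proportions.
    p_e = sum((counts_a[c] / n) * (counts_b[c] / n)
              for c in set(counts_a) | set(counts_b))
    return (p_o - p_e) / (1 - p_e)


# Hypothetical ratings of 10 students by two observers (categories 1-3).
a = [1, 2, 2, 3, 1, 1, 2, 3, 3, 2]
b = [1, 2, 3, 3, 1, 2, 2, 3, 3, 2]
print(round(percent_agreement(a, b), 2))  # 0.8
print(round(cohens_kappa(a, b), 2))       # 0.7
```

The gap between the two values in the example (0.8 vs. 0.7) shows why the choice of index matters: kappa discounts the agreement expected by chance, which Percent Agreement ignores.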
ISSN | 0033-3085, 1520-6807 |
DOI | 10.1002/pits.20610 |