Content validity and inter-rater reliability of procedural skill checklists used in the online OSCE scoring management system


Bibliographic Details
Published in: Bali Medical Journal, Vol. 12, No. 1, pp. 456-461
Main Authors: Indarwati, Ferika; Primanda, Yanuar; Haris, Fahni; Yulianti Sutrisno, Resti
Format: Journal Article
Language: English
Published: 01.01.2023

Summary: Introduction: The use of online Objective Structured Clinical Examination (OSCE) scoring systems in medical and nursing education is emerging. To ensure valid student scores, OSCE checklists need to be assessed for validity and reliability. This study aims to test the validity and reliability of several procedural checklists commonly used in the nursing profession, namely peripheral intravenous insertion, electrocardiogram placement, nasogastric tube insertion, urinary catheter insertion, and oxygenation procedures. Methods: Expert consensus was used to generate items; the content validity index and inter-rater reliability were used to evaluate the validity and reliability of the checklists. Five experts assessed the content validity of the checklists, and five raters used the checklists to evaluate the performance of 11 students. Data were collected from April to May 2022. The expert panel rated the content relevance of each instrument on a four-point rating scale, and item-level (I-CVI) and scale-level (S-CVI) content validity indices were calculated. Inter-rater reliability was calculated using Fleiss' kappa. Results: The item validity indices for the five checklists showed relatively high content validity among experts. The I-CVI for each tool was very good, ranging from 0.8 to 1. The average content agreement (S-CVI/Ave) and the universal agreement (S-CVI/UA) for each checklist were also very good. The inter-rater reliability results indicated that agreement among raters ranged from moderate to very good/excellent: the lowest kappa value was for the nasogastric tube insertion checklist (0.40, 95% CI 0.40-0.41) and the highest was for the oxygenation checklist (1, 95% CI 0.99-1). Conclusion: In terms of face validity, the checklists were reported to be easy to understand and logically presented. Nonetheless, re-formatting of some items and the addition of details to the checklists are needed to avoid ambiguity, which could cause confusion for examiners and examinees.
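
For readers unfamiliar with these measures, the sketch below illustrates how I-CVI, S-CVI/Ave, S-CVI/UA, and Fleiss' kappa are commonly computed. It is not the authors' analysis code, and the ratings shown are hypothetical; it assumes the usual dichotomisation of the four-point relevance scale (scores 3-4 counted as relevant) and a pass/fail coding of rater scores.

import numpy as np

def content_validity(ratings):
    # ratings: (n_items, n_experts) matrix of 4-point relevance scores (1-4).
    # A score of 3 or 4 is treated as "relevant", the usual CVI dichotomisation.
    relevant = (ratings >= 3).astype(float)
    i_cvi = relevant.mean(axis=1)          # item-level CVI: share of experts rating the item relevant
    s_cvi_ave = i_cvi.mean()               # scale-level CVI, averaging method (S-CVI/Ave)
    s_cvi_ua = (i_cvi == 1.0).mean()       # scale-level CVI, universal agreement method (S-CVI/UA)
    return i_cvi, s_cvi_ave, s_cvi_ua

def fleiss_kappa(assignments, n_categories):
    # assignments: (n_subjects, n_raters) matrix of category labels 0..n_categories-1.
    n_subjects, n_raters = assignments.shape
    counts = np.zeros((n_subjects, n_categories))
    for c in range(n_categories):
        counts[:, c] = (assignments == c).sum(axis=1)   # raters choosing category c per subject
    p_j = counts.sum(axis=0) / (n_subjects * n_raters)  # overall category proportions
    p_i = (np.square(counts).sum(axis=1) - n_raters) / (n_raters * (n_raters - 1))
    p_bar, p_e = p_i.mean(), np.square(p_j).sum()       # observed vs. chance agreement
    return (p_bar - p_e) / (1 - p_e)

# Hypothetical data: 5 experts rate 10 checklist items on a 1-4 relevance scale ...
rng = np.random.default_rng(seed=1)
expert_ratings = rng.integers(2, 5, size=(10, 5))
i_cvi, s_cvi_ave, s_cvi_ua = content_validity(expert_ratings)
print("I-CVI:", i_cvi, " S-CVI/Ave:", round(s_cvi_ave, 2), " S-CVI/UA:", round(s_cvi_ua, 2))

# ... and 5 raters score 11 students pass (1) / fail (0) on one checklist step.
rater_scores = rng.integers(0, 2, size=(11, 5))
print("Fleiss' kappa:", round(fleiss_kappa(rater_scores, n_categories=2), 2))

The actual checklists, items, rating conventions, and cut-offs used in the study are described in the full text.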
ISSN: 2089-1180, 2302-2914
DOI: 10.15562/bmj.v12i1.3760