Crowd-sourced assessment of technical skills: an opportunity for improvement in the assessment of laparoscopic surgical skills
Published in: The American Journal of Surgery, Vol. 211, No. 2, pp. 398–404
Format: Journal Article
Language: English
Publisher: Elsevier Inc., United States
Published: 01.02.2016
Summary: Background: Objective, unbiased assessment of surgical skills remains a challenge in surgical education. We sought to evaluate the feasibility and reliability of Crowd-Sourced Assessment of Technical Skills. Methods: Seven volunteer general surgery interns were given time for training and then testing on laparoscopic peg transfer, precision cutting, and intracorporeal knot-tying. Six faculty experts (FEs) and 203 Amazon.com Mechanical Turk crowd workers (CWs) evaluated 21 deidentified video clips using the Global Objective Assessment of Laparoscopic Skills (GOALS) validated rating instrument. Results: We received 662 eligible ratings from 203 CWs within 19 hours and 15 minutes, and 126 ratings from 6 FEs over 10 days. FE video ratings showed borderline internal consistency (Krippendorff's alpha = .55). FE ratings were highly correlated with CW ratings (Pearson's correlation coefficient = .78, P < .001). Conclusion: We propose Crowd-Sourced Assessment of Technical Skills as a reliable, basic tool to standardize the evaluation of technical skills in general surgery.
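The reported agreement between expert and crowd ratings is a Pearson product-moment correlation. As an illustrative sketch only, the snippet below computes that coefficient for hypothetical per-clip mean GOALS scores (the `fe_scores` and `cw_scores` values are invented for the example and are not the study's data):

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical mean GOALS scores for seven clips (5 domains rated 1-5, so 5-25 total)
fe_scores = [12.0, 18.5, 9.0, 21.0, 15.5, 11.0, 19.5]  # faculty experts
cw_scores = [13.5, 17.0, 10.5, 22.0, 14.0, 12.5, 20.0]  # crowd workers

print(round(pearson_r(fe_scores, cw_scores), 2))
```

In practice a library routine such as `scipy.stats.pearsonr` would also return the P value reported alongside the coefficient.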
ISSN: 0002-9610; EISSN: 1879-1883
DOI: 10.1016/j.amjsurg.2015.09.005