Head-camera video recordings of trauma core competency procedures can evaluate surgical residents' technical performance as well as colocated evaluators

Bibliographic Details
Published in: The Journal of Trauma and Acute Care Surgery, Vol. 83, no. 1 Suppl 1, p. S124
Main Authors: Mackenzie, Colin F; Pasley, Jason; Garofalo, Evan; Shackelford, Stacy; Chen, Hegang; Longinaker, Nyaradzo; Granite, Guinevere; Pugh, Kristy; Hagegeorge, George; Tisherman, Samuel A
Format: Journal Article
Language: English
Published: United States, 01.07.2017
Summary: Unbiased evaluation of trauma core competency procedures is necessary to determine whether residency and predeployment training courses are useful. We tested whether a previously validated individual procedure score (IPS) for vascular exposure and fasciotomy (FAS) performance skills could discriminate training status, by comparing the IPS assigned by evaluators colocated with the surgeons to blind video evaluations.

Performance of axillary artery (AA), brachial artery (BA), and femoral artery (FA) vascular exposures and lower-extremity FAS on fresh cadavers by 40 PGY-2 to PGY-6 residents was video-recorded from head-mounted cameras. Two colocated trained evaluators assessed IPS before and after training. One surgeon in each pretraining tertile of IPS for each procedure was randomly selected for blind video review. The same 12 surgeons were video-recorded repeating the procedures less than 4 weeks after training. Five evaluators independently reviewed all 96 randomly arranged, deidentified videos. Inter-rater reliability/consistency, intraclass correlation coefficients, and errors were compared between colocated and video review of IPS. Study methodology and bias were judged by the Medical Education Research Study Quality Instrument and the Quality Assessment of Diagnostic Accuracy Studies criteria.

There were no differences (p ≥ 0.5) in IPS for AA, FA, or FAS, whether evaluators were colocated or reviewed video recordings. Evaluator consistency ranged from 0.29 (BA) to 0.77 (FA). Video and colocated evaluators were in total agreement (p = 1.0) on error recognition. Intraclass correlation coefficients were 0.73 to 0.92, depending on the procedure. Correlations between video and colocated evaluations were 0.5 to 0.9. Except for BA, blinded video evaluators discriminated (p < 0.002) whether procedures were performed before or after training. Study methodology scored 15.5/19 by Medical Education Research Study Quality Instrument criteria, and the Quality Assessment of Diagnostic Accuracy Studies 2 showed low risk of bias. Video evaluations of AA, FA, and FAS procedures with IPS are unbiased, valid, and have potential for formative assessments of competency. Prognostic study, level II.
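The abstract reports intraclass correlation coefficients of 0.73 to 0.92 for agreement between evaluators. As an illustration of how such a statistic is computed, the sketch below implements ICC(2,1) (two-way random effects, absolute agreement, single rater) from its standard ANOVA definition. The rating matrix is invented for illustration only and is not the study's data.

```python
import numpy as np

def icc_2_1(scores: np.ndarray) -> float:
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    `scores` is an (n_subjects x k_raters) matrix of ratings.
    """
    n, k = scores.shape
    grand = scores.mean()
    row_means = scores.mean(axis=1)   # per-subject means
    col_means = scores.mean(axis=0)   # per-rater means

    # Two-way ANOVA sums of squares
    ss_rows = k * np.sum((row_means - grand) ** 2)   # between subjects
    ss_cols = n * np.sum((col_means - grand) ** 2)   # between raters
    ss_total = np.sum((scores - grand) ** 2)
    ss_error = ss_total - ss_rows - ss_cols          # residual

    # Mean squares
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_error = ss_error / ((n - 1) * (k - 1))

    # Shrout & Fleiss ICC(2,1) formula
    return (ms_rows - ms_error) / (
        ms_rows + (k - 1) * ms_error + k * (ms_cols - ms_error) / n
    )

# Hypothetical scores: 5 residents rated by 3 evaluators (made-up numbers)
ratings = np.array([
    [9, 9, 8],
    [6, 5, 6],
    [8, 7, 8],
    [3, 3, 2],
    [7, 8, 7],
], dtype=float)
print(round(icc_2_1(ratings), 2))  # → 0.94 (raters agree closely)
```

A single-rater, absolute-agreement form is shown because the study compares individual evaluators' scores directly; other ICC variants (e.g. average-rater or consistency forms) would use different denominators.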
ISSN: 2163-0763
DOI: 10.1097/TA.0000000000001467