Design and usability testing of an in-house developed performance feedback tool for medical students

Bibliographic Details
Published in: BMC Medical Education, Vol. 21, No. 1, pp. 1–9
Main Authors: Roa Romero, Yadira; Tame, Hannah; Holzhausen, Ylva; Petzold, Mandy; Wyszynski, Jan-Vincent; Peters, Harm; Alhassan-Altoaama, Mohammed; Domanska, Monika; Dittmar, Martin
Format: Journal Article
Language: English
Published: London: BioMed Central Ltd, 23.06.2021

Summary:

Background: Feedback is essential in a self-regulated learning environment such as medical education. When feedback channels are widely dispersed, the need arises for a system that integrates this information in a single platform. This article reports on the design and initial testing of a feedback tool for medical students at Charité – Universitätsmedizin Berlin, a large teaching hospital.

Methods: Following a needs analysis, we designed and programmed a feedback tool using a user-centered approach. The resulting interface was evaluated with usability testing prior to release and again after release using quantitative and qualitative questionnaires.

Results: The tool we created is a browser application for use on desktop or mobile devices. Students log in to see a dashboard of "cards" featuring summaries of assessment results, a portal for documenting acquired practical skills, and an overview of their progress through their course. Users see their cohort's average for each assessment format, and learning analytics rank students' strengths by subject. The interface is characterized by colourful, simple graphics. In its initial form, the tool has been rated positively overall by students. During testing, the high task completion rate (78%) and the low number of non-critical errors indicated good usability, while the quantitative data (system usability scoring) likewise indicated high ease of use. The tool's source code is open source and can be adapted by other medical faculties.

Conclusions: The results suggest that the implemented tool, LevelUp, is well accepted by students. It therefore holds promise for improved, digitalized, integrated feedback on students' learning progress. Our aim is for LevelUp to help medical students keep track of their study progress and reflect on their skills. Further development will integrate users' recommendations for additional features and optimize data flow.

Keywords: Assessment feedback, Usability testing, Formative and summative assessment, Learning analytics, EPAs
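The abstract's "system usability scoring" presumably refers to the standard System Usability Scale (SUS), a ten-item Likert questionnaire scored on a 0–100 scale. As a minimal sketch of how such scores are conventionally computed (the function name and the sample data below are illustrative assumptions, not taken from the LevelUp study or codebase):

    def sus_score(responses):
        """Score one participant's System Usability Scale questionnaire.

        `responses` holds ten Likert ratings (1 = strongly disagree,
        5 = strongly agree). Odd-numbered items are positively worded
        and contribute (rating - 1); even-numbered items are negatively
        worded and contribute (5 - rating). The 0-40 sum is scaled by
        2.5 to give a score between 0 and 100.
        """
        if len(responses) != 10:
            raise ValueError("SUS requires exactly ten item responses")
        if any(not 1 <= r <= 5 for r in responses):
            raise ValueError("each response must be on a 1-5 scale")
        contributions = (
            (r - 1) if i % 2 == 1 else (5 - r)
            for i, r in enumerate(responses, start=1)
        )
        return 2.5 * sum(contributions)

    # Hypothetical usage: average per-participant scores for a cohort.
    participants = [
        [4, 2, 5, 1, 4, 2, 5, 2, 4, 1],
        [5, 1, 4, 2, 4, 1, 4, 2, 5, 2],
    ]
    scores = [sus_score(p) for p in participants]
    print(sum(scores) / len(scores))  # cohort mean SUS, here 85.0

A cohort mean above roughly 68 is conventionally read as above-average usability, which is consistent with the abstract's report of high ease of use.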
ISSN: 1472-6920
DOI: 10.1186/s12909-021-02788-4