Educational Game Learning Quality Evaluation Tool: A Systematic Literature Review

Bibliographic Details
Published in: 2024 7th International Conference on Informatics and Computational Sciences (ICICoS), pp. 419 - 425
Main Authors: Kurniawan, Mei Parwanto; Suyanto, M.; Utami, Ema; Kusrini
Format: Conference Proceeding
Language: English
Published: IEEE, 17.07.2024
Summary: This Systematic Literature Review (SLR) examines evaluation tools for measuring the quality of Digital Educational Games (DEG) and identifies the most widely used tools and their significant evaluation factors. Following the PRISMA method, articles were searched in the IEEE, Springer, and Elsevier databases from November 8, 2023, to December 19, 2023, using specific keywords. Of the 3,808 articles found, 139 articles published between 2020 and 2023 were selected and analysed in Excel, recording data such as year of publication, title, author, publisher, tool name, tool type, and assessment factors. Of these 139 articles, 35 used a DEG evaluation tool. Classification based on the similarity of evaluation tools identified 12 DEG evaluation tools: MEEGA+, EgameFlow, TAM, ADDIE, GUESS, HEP, TAUT, EDUGXQ, SPIKES, UEQ, DIJS, and ECS. MEEGA+ was the most frequently used evaluation tool, with 12 references, or 23% of the total journals reviewed. A further evaluation using the Analytical Hierarchy Process (AHP) identified control and feedback as the evaluation factors most in need of further development. The review emphasises the need to develop a more comprehensive and valid DEG evaluation tool to improve the measurement of digital educational game quality. These findings provide a solid basis for decision-making regarding DEGs and encourage further research to address gaps in existing evaluation tools.
ISSN: 2767-7087
DOI: 10.1109/ICICoS62600.2024.10636925
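
Note: The abstract refers to an Analytical Hierarchy Process (AHP) step used to rank evaluation factors such as control and feedback. The Python sketch below is only an illustration of how AHP priority weights and a consistency ratio are commonly computed; the factor names and pairwise judgment values are hypothetical assumptions, not data from the paper.

# Minimal AHP sketch (illustrative only; the pairwise judgments are
# hypothetical, not taken from the paper). It shows how priority weights
# and a consistency ratio are typically derived for evaluation factors.
import numpy as np

# Hypothetical pairwise comparison matrix for three factors
# (control, feedback, usability) on Saaty's 1-9 scale.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
])

# The principal eigenvector of A gives the priority weights.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = eigvecs[:, k].real
weights /= weights.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI.
n = A.shape[0]
lambda_max = eigvals[k].real
ci = (lambda_max - n) / (n - 1)
ri = 0.58  # Saaty's random index for n = 3
cr = ci / ri

print("weights:", np.round(weights, 3))    # roughly [0.64, 0.26, 0.10]
print("consistency ratio:", round(cr, 3))  # values below 0.10 are usually accepted

A consistency ratio below 0.10 indicates that the pairwise judgments are coherent enough for the resulting weights to be used when comparing evaluation factors.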