Building an Automated Scoring System for EFL Learners’ Paraphrases via a Customized GPT

Bibliographic Details
Published in: 영어어문교육, 31(1), pp. 27-57
Main Author: 김민경
Format: Journal Article
Language: English
Published: 한국영어어문교육학회, 01.03.2025
ISSN: 1226-2889


Summary: This study investigates the potential of a customized GPT as an automated paraphrase scoring (APS) system for EFL learners’ paraphrases, with implications for reducing teachers’ grading workloads and achieving unbiased rating in classroom settings. A total of 1,000 paraphrases written by 100 Korean EFL learners were evaluated with analytic and holistic scoring rubrics. The analytic rubric covered syntactic change, word change, semantic equivalency, and grammatical accuracy, all of which are crucial to paraphrasing. A mixed-methods approach was employed to evaluate the APS’s reliability and effectiveness. Quantitative analysis examined the reliability and consistency of the APS using Pearson and Intraclass Correlation Coefficients. Inter-rater reliability between the APS and human raters was analyzed across various comparisons and demonstrated strong alignment. Additionally, the consistency of the APS across the two rubrics indicated moderate reliability overall. Qualitative analysis further investigated the nature of the scores generated by the APS and their pedagogical implications. These findings suggest that an APS built on a custom GPT has potential as an automated writing-assessment tool, providing reliable feedback to students while substituting for human raters. Integrating automated evaluation with customized GPTs into classrooms can address some of the challenges of manual scoring in educational contexts. KCI Citation Count: 0
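The reliability statistics named in the summary (Pearson correlation and the Intraclass Correlation Coefficient between APS and human scores) can be sketched as follows. This is a minimal illustration only: the scores, score scale, and variable names below are hypothetical, not the study's data, and the ICC form shown is ICC(2,1) (two-way random effects, absolute agreement, single rater), one common choice for inter-rater reliability; the paper does not specify its exact ICC model here.

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation between two lists of ratings."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def icc2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.
    `ratings` has one row per paraphrase and one column per rater."""
    n, k = len(ratings), len(ratings[0])
    grand = sum(sum(row) for row in ratings) / (n * k)
    row_means = [sum(row) / k for row in ratings]
    col_means = [sum(row[j] for row in ratings) / n for j in range(k)]
    # Two-way ANOVA decomposition of the rating matrix.
    ss_total = sum((x - grand) ** 2 for row in ratings for x in row)
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)   # paraphrases
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)   # raters
    ss_err = ss_total - ss_rows - ss_cols
    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical rubric totals for five paraphrases (not the study's data):
human = [8, 6, 9, 5, 7]
aps   = [7, 6, 9, 5, 8]   # scores from the customized GPT
print(f"Pearson r = {pearson(human, aps):.3f}")                        # → 0.900
print(f"ICC(2,1)  = {icc2_1([[h, a] for h, a in zip(human, aps)]):.3f}")  # → 0.918
```

In practice such analyses are run per rubric criterion and per comparison pair (e.g. APS vs. each human rater), which is how the summary's "various comparisons" would be operationalized.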