Explaining Errors in Predictions of At-Risk Students in Distance Learning Education

Bibliographic Details
Published in: Artificial Intelligence in Education, Vol. 12164, pp. 119-123
Main Authors: Hlosta, Martin; Papathoma, Tina; Herodotou, Christothea
Format: Book Chapter
Language: English
Published: Switzerland, Springer International Publishing AG, 01.01.2020
Series: Lecture Notes in Computer Science

Summary: Despite recognising the importance of transparency and understanding of predictive models, little effort has been made to investigate the errors these models make. In this paper, we address this gap by interviewing 12 students whose actual assignment-submission outcomes differed from the model's predictions. Following our previous quantitative analysis of 25,000+ students, we conducted online interviews with two groups of students: those predicted to submit their assignment who did not (False Negatives) and those predicted not to submit who did (False Positives). The interviews revealed that, for False Negatives, non-submission of assignments was explained by personal, financial and practical reasons. Overall, the factors explaining the differing outcomes were not related to any of the student data currently captured by the predictive model.
ISBN: 3030522393; 9783030522391
ISSN: 0302-9743; 1611-3349
DOI: 10.1007/978-3-030-52240-7_22