LLM supporting knowledge tracing leveraging global subject and student specific knowledge graphs

Bibliographic Details
Published in: Information Fusion, Vol. 126; p. 103577
Main Authors: Li, Linqing; Wang, Zhifeng; Jose, Joemon M.; Ge, Xuri
Format: Journal Article
Language: English
Published: Elsevier B.V., 01.02.2026

Summary: In this paper, we propose a novel LLM-based knowledge tracing (KT) model, the Teacher Thinking Knowledge Tracing model (2T-KT), which leverages the strong reasoning and generation abilities of large language models (LLMs) to address a weakness of traditional knowledge tracing methods: because they rely on numerous student exercise records, they cannot make good predictions for new knowledge concepts. The 2T-KT model uses LLMs to enrich the knowledge graph with new knowledge concepts and to predict student performance on the next exercise through four key components: observation, guideline, interpretation, and cognition. Prediction proceeds in two stages, a preprocessing stage and a 2T-KT stage. In the preprocessing stage, two novel knowledge graphs, one local and one global, are designed to improve the capability of evaluating new concepts. In the 2T-KT stage, a novel teacher's thinking mode built from the four components above assists the LLM in predicting the student's performance on the next exercise, which may contain new knowledge concepts. Even for such new concepts, the LLM 'teacher' can accurately predict students' abilities through interpretable augmentation prompts. Extensive evaluations are conducted on three public educational benchmarks: the FrcSub dataset (10K student records and 8 exercises), the Xes3g5m dataset (around 522K student records and 6,641 exercises), and the MOOCRadar dataset (around 897K student records and 2,510 exercises). The results demonstrate that 2T-KT is a strong contender in knowledge tracing, delivering both high performance and interpretability.
Highlights:
• Completion and verification methods are designed to add and verify concepts.
• We design LLMs with new KGs to model the Teacher Thinking Mode via four components.
• We evaluate 2T-KT on three benchmarks; it outperforms state-of-the-art methods.
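The article does not include code in this record; the Python sketch below only illustrates how the four prompt components named in the abstract (observation, guideline, interpretation, cognition) might be assembled into an interpretable prompt for an LLM "teacher", given a student's exercise history and some knowledge-graph context. All class, function, and field names (ExerciseRecord, build_teacher_prompt, kg_context, etc.) are hypothetical and are not taken from the paper.

```python
# Minimal sketch (not the authors' implementation): composing a teacher-thinking
# style prompt from the four components named in the abstract. Every name here
# is a placeholder chosen for illustration.

from dataclasses import dataclass


@dataclass
class ExerciseRecord:
    exercise_id: str
    concepts: list[str]   # knowledge concepts attached to the exercise
    correct: bool         # whether the student answered correctly


def build_teacher_prompt(history: list[ExerciseRecord],
                         next_concepts: list[str],
                         kg_context: str) -> str:
    """Build an interpretable prompt asking an LLM 'teacher' to predict whether
    the student will answer the next exercise (possibly on new concepts) correctly."""
    # Observation: summarise the student's past interactions.
    observation = "; ".join(
        f"{r.exercise_id} ({', '.join(r.concepts)}): {'correct' if r.correct else 'wrong'}"
        for r in history
    )
    # Guideline: instruct the model how to reason about unseen concepts.
    guideline = (
        "Relate the target concepts to concepts the student has already practised, "
        "using the knowledge-graph relations below."
    )
    # Interpretation: supply knowledge-graph context linking old and new concepts.
    interpretation = f"Knowledge-graph context: {kg_context}"
    # Cognition: ask for a mastery estimate and a justified prediction.
    cognition = (
        "Estimate the student's mastery of the target concepts and answer "
        "'correct' or 'incorrect' with a one-sentence justification."
    )
    return (
        f"Observation: {observation}\n"
        f"Guideline: {guideline}\n"
        f"Interpretation: {interpretation}\n"
        f"Cognition: {cognition}\n"
        f"Target concepts: {', '.join(next_concepts)}"
    )


if __name__ == "__main__":
    history = [
        ExerciseRecord("E1", ["fraction addition"], True),
        ExerciseRecord("E2", ["fraction subtraction"], False),
    ]
    print(build_teacher_prompt(
        history,
        next_concepts=["mixed-number subtraction"],
        kg_context="mixed-number subtraction is-a fraction subtraction",
    ))
```

The resulting prompt would then be sent to an LLM of the reader's choice; how 2T-KT actually formats, verifies, and scores such prompts is described in the full article.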
ISSN: 1566-2535
DOI: 10.1016/j.inffus.2025.103577