Diagnosing Cognitive Proficiency of Students Using Dense Neural Networks for Adaptive Assistance

Bibliographic Details
Published in: IEEE Transactions on Education, pp. 1-10
Main Authors: Meher, Jyoti Prakash; Mall, Rajib
Format: Journal Article
Language: English
Published: IEEE, 28.08.2024
Summary: Contribution: This article proposes a novel method for diagnosing a learner's cognitive proficiency using deep neural networks (DNNs) based on her answers to a series of questions. The prediction outcome can be used for adaptive assistance. Background: A learner often spends a considerable amount of time attempting questions on concepts she has already mastered. It is therefore desirable to diagnose her cognitive proficiency appropriately and select the questions that can help improve her preparedness. Research Question: Can the cognitive proficiency of a learner be progressively predicted as she attempts a series of questions? Methodology: A novel approach using DNNs to diagnose the learner's proficiency after she attempts a set of questions is proposed in this article. Subsequently, to demonstrate the effectiveness of the proposed prediction model, an algorithm is introduced that selects questions of the required difficulty based on the predicted proficiency level. An appropriate question sequence can help a learner attain the necessary competency level faster. Findings: The experimental results indicate that the proposed approach can predict the ability of learners with an accuracy of 91.21%. Moreover, the proposed technique outperforms existing techniques by 33.19% on average.
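
Illustrative sketch: the abstract does not disclose the paper's network architecture, input encoding, or question-selection algorithm, so the short Python/PyTorch sketch below only illustrates the general idea of predicting a proficiency level from a learner's responses and then choosing a question of matching difficulty. The layer sizes, the number of proficiency levels, the binary response encoding, and the names ProficiencyDNN and select_next_question are all assumptions made for demonstration, not details taken from the article.

    # Minimal sketch, not the authors' implementation.
    import torch
    import torch.nn as nn

    N_QUESTIONS = 20   # assumed number of attempted questions per learner
    N_LEVELS = 4       # assumed number of discrete proficiency levels

    class ProficiencyDNN(nn.Module):
        """Dense network mapping a learner's response vector to a proficiency level."""
        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(N_QUESTIONS, 64), nn.ReLU(),
                nn.Linear(64, 32), nn.ReLU(),
                nn.Linear(32, N_LEVELS),   # logits over proficiency levels
            )

        def forward(self, responses):
            # responses: (batch, N_QUESTIONS) with 1.0 = correct, 0.0 = incorrect
            return self.net(responses)

    def select_next_question(predicted_level, difficulty_by_question):
        """Stand-in selector: pick the question whose difficulty rating is closest
        to the predicted proficiency level (the paper's actual selection algorithm
        is not described in the abstract)."""
        return min(difficulty_by_question,
                   key=lambda q: abs(difficulty_by_question[q] - predicted_level))

    model = ProficiencyDNN()
    responses = torch.randint(0, 2, (1, N_QUESTIONS)).float()      # dummy attempt history
    predicted_level = model(responses).argmax(dim=1).item()
    pool = {"q101": 0, "q102": 1, "q103": 2, "q104": 3}             # hypothetical difficulty ratings
    print("predicted level:", predicted_level,
          "next question:", select_next_question(predicted_level, pool))

In practice such a model would be trained on logged learner responses labeled with proficiency levels; the sketch above only shows the inference and selection flow.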
ISSN: 0018-9359, 1557-9638
DOI: 10.1109/TE.2024.3446316