Attention assessment based on multi‐view classroom behaviour recognition

Bibliographic Details
Published in: IET Computer Vision, Vol. 19, No. 1
Main Authors: Zheng, ZhouJie; Liang, GuoJun; Luo, HuiBin; Yin, HaiChang
Format: Journal Article
Language: English
Published: 01.01.2025
Summary: In recent years, artificial intelligence has been applied across many fields, and its use in education has attracted growing attention, with an increasing number of behaviour detection and recognition algorithms deployed in classrooms. Students' attention in class is key to teaching quality, and classroom behaviour is a direct manifestation of that attention. To address the generally low accuracy of students' classroom behaviour recognition, we apply deep learning to multi-view behaviour detection, which detects and recognises behaviours from different perspectives, in order to evaluate students' classroom attention. First, an improved detection model based on YOLOv5 is proposed: the CBL module is improved throughout the entire network to optimise the model, and SIoU is used as the loss function to speed up convergence of the prediction box. Second, a quantitative evaluation standard for students' classroom attention is established, and training and verification are conducted on collected multi-view classroom datasets. Finally, environmental variation is increased during the training phase to give the model better generalisation ability. Experiments demonstrate that our method effectively detects and identifies students' classroom behaviours from different angles, with good robustness and feature-extraction capability.
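The summary names SIoU as the bounding-box regression loss used to speed up convergence. As a rough, non-authoritative sketch of that technique (following the published SIoU formulation, not the authors' code), the loss combines a plain IoU term with angle, distance, and shape costs. The function name siou_loss, the PyTorch framework, and the (x1, y1, x2, y2) box format are assumptions for illustration:

```python
import torch

def siou_loss(pred: torch.Tensor, target: torch.Tensor,
              theta: float = 4.0, eps: float = 1e-7) -> torch.Tensor:
    """Illustrative SIoU loss for (N, 4) boxes in (x1, y1, x2, y2) format."""
    # Plain IoU term
    inter_w = (torch.min(pred[:, 2], target[:, 2]) - torch.max(pred[:, 0], target[:, 0])).clamp(0)
    inter_h = (torch.min(pred[:, 3], target[:, 3]) - torch.max(pred[:, 1], target[:, 1])).clamp(0)
    inter = inter_w * inter_h
    w1, h1 = pred[:, 2] - pred[:, 0], pred[:, 3] - pred[:, 1]
    w2, h2 = target[:, 2] - target[:, 0], target[:, 3] - target[:, 1]
    iou = inter / (w1 * h1 + w2 * h2 - inter + eps)

    # Width/height of the smallest box enclosing both boxes
    cw = torch.max(pred[:, 2], target[:, 2]) - torch.min(pred[:, 0], target[:, 0]) + eps
    ch = torch.max(pred[:, 3], target[:, 3]) - torch.min(pred[:, 1], target[:, 1]) + eps

    # Offsets between the two box centres
    s_cw = (target[:, 0] + target[:, 2] - pred[:, 0] - pred[:, 2]) * 0.5
    s_ch = (target[:, 1] + target[:, 3] - pred[:, 1] - pred[:, 3]) * 0.5
    sigma = (s_cw ** 2 + s_ch ** 2).sqrt() + eps

    # Angle cost Lambda = sin(2 * alpha), with alpha measured to the nearest axis
    sin_a = (torch.abs(s_ch) / sigma).clamp(max=1.0)
    sin_b = (torch.abs(s_cw) / sigma).clamp(max=1.0)
    sin_alpha = torch.where(sin_a > 2 ** 0.5 / 2, sin_b, sin_a)
    angle_cost = torch.sin(2 * torch.arcsin(sin_alpha))

    # Distance cost between centres, modulated by the angle cost (gamma = 2 - Lambda)
    gamma = 2 - angle_cost
    dist_cost = (1 - torch.exp(-gamma * (s_cw / cw) ** 2)) \
              + (1 - torch.exp(-gamma * (s_ch / ch) ** 2))

    # Shape cost: penalises width/height mismatch between the boxes
    omega_w = torch.abs(w1 - w2) / torch.max(w1, w2).clamp(min=eps)
    omega_h = torch.abs(h1 - h2) / torch.max(h1, h2).clamp(min=eps)
    shape_cost = (1 - torch.exp(-omega_w)) ** theta + (1 - torch.exp(-omega_h)) ** theta

    # L_SIoU = 1 - IoU + (distance cost + shape cost) / 2
    return 1 - iou + (dist_cost + shape_cost) * 0.5
```

Compared with IoU-only regression, the extra angle term first drives the predicted centre toward an axis of the ground-truth centre before closing the remaining distance, which is the behaviour the summary credits for faster convergence of the prediction box.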
ISSN: 1751-9632, 1751-9640
DOI: 10.1049/cvi2.12146