Construction of an Accurate Tracking and AI Evaluation System for Dance Movements by Incorporating Image Recognition Technology


Bibliographic Details
Published in: Applied Mathematics and Nonlinear Sciences, Vol. 10, No. 1
Main Authors: Luan, Jianchao; Qu, Huijia; Liu, Yuan
Format: Journal Article
Language: English
Published: Beirut: Sciendo (De Gruyter Poland), 01.01.2025

Summary: In this paper, an industrial camera is first used to collect data on human dance posture movements, and the actions are segmented into categories using the CTC automatic segmentation principle. A human movement posture recognition method then detects the dance gesture information and recognizes the dance posture by combining it with human joint position characteristics. Finally, learners' dance movements are evaluated with the GL-Compound similarity calculation method. Experimental analysis shows that, compared with the other two recognition methods, the proposed human dance posture detection method achieves intersection-over-union ratios above 0.85, indicating high detection accuracy. The highest recognition rate was obtained for dances dominated by lower-body movements. In the practical study, the test subject's inner-elbow bending angle and left-arm swing amplitude deviated significantly from the standard movements for only 2 s, and a high quality score of 0.94 was obtained for the 50–100 frame image segments. The dance scoring task can therefore be accomplished effectively with the robust GL-Compound similarity calculation.
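
The "CTC automatic segmentation principle" presumably refers to Connectionist Temporal Classification, which aligns frame-wise action predictions with an unsegmented label sequence so that action boundaries need no frame-level annotation. The record gives no architecture details, so the following is only a minimal sketch using PyTorch's nn.CTCLoss; the tensor sizes and random frame scores are placeholders, not the paper's model.

```python
import torch
import torch.nn as nn

# Assumed sizes: T video frames, batch of N clips, C action classes + 1 blank.
T, N, C = 120, 4, 8

# Placeholder frame-wise class scores; in the paper these would come from
# the pose-feature network, which is not specified in this record.
logits = torch.randn(T, N, C + 1)
log_probs = logits.log_softmax(dim=2)

# Each clip's (unsegmented) sequence of action labels, 1..C (0 is the blank).
targets = torch.randint(1, C + 1, (N, 5), dtype=torch.long)
input_lengths = torch.full((N,), T, dtype=torch.long)
target_lengths = torch.full((N,), 5, dtype=torch.long)

# CTC aligns the per-frame predictions to the label sequence, which is what
# lets action boundaries be recovered without frame-level segmentation labels.
ctc = nn.CTCLoss(blank=0)
loss = ctc(log_probs, targets, input_lengths, target_lengths)
print(loss.item())
```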
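
The reported detection figures (ratios above 0.85) read as an intersection-over-union overlap between predicted and ground-truth pose regions. The record does not give the formula, so below is the standard IoU computation for axis-aligned boxes; the (x1, y1, x2, y2) box format is an assumption.

```python
def iou(box_a, box_b):
    """Intersection over union of two axis-aligned boxes (x1, y1, x2, y2).

    Sketch of the standard overlap metric; the paper's exact box
    representation is not given in this record.
    """
    # Coordinates of the intersection rectangle.
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])

    # Clamp to zero when the boxes do not overlap.
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)

    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0


# Example: a predicted pose box against its ground-truth annotation.
print(iou((10, 10, 110, 210), (15, 12, 112, 205)))  # ~0.90, above the 0.85 mark
```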
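
GL-Compound similarity is the authors' own scoring method, and its definition is not reproduced in this record. Purely as a hypothetical illustration of a compound global/local pose comparison, the sketch below mixes a whole-pose cosine similarity with a per-joint distance term; the weight alpha, the joint layout, and all names are assumptions, not the paper's formula.

```python
import numpy as np

def compound_similarity(learner, standard, alpha=0.6):
    """Hypothetical global/local compound score for one frame.

    learner, standard: (num_joints, 2) arrays of 2D joint coordinates.
    alpha: assumed weight between the global and local terms.
    Illustrative stand-in only, not the paper's GL-Compound formula.
    """
    a = learner.ravel().astype(float)
    b = standard.ravel().astype(float)

    # Global term: cosine similarity of the flattened pose vectors.
    global_sim = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

    # Local term: mean per-joint closeness, mapped into (0, 1].
    joint_dist = np.linalg.norm(learner - standard, axis=1)
    local_sim = float(np.mean(1.0 / (1.0 + joint_dist)))

    return alpha * global_sim + (1 - alpha) * local_sim


# Example: score a learner frame against the standard frame.
rng = np.random.default_rng(0)
standard_pose = rng.uniform(0, 1, size=(17, 2))   # e.g. a 17-joint skeleton
learner_pose = standard_pose + rng.normal(0, 0.02, size=(17, 2))
print(round(compound_similarity(learner_pose, standard_pose), 3))
```

Averaging such frame scores over a segment (for instance, frames 50–100) would yield a segment-level quality score of the kind reported above.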
ISSN: 2444-8656
DOI: 10.2478/amns-2025-0294