Detecting Human Behavior from a Silhouette Using Convolutional Neural Networks

Bibliographic Details
Published in: 2023 Second International Conference on Electronics and Renewable Systems (ICEARS), pp. 943-948
Main Authors: Rao, Nidamanuru Srinivasa; Shanmugapriya, G.; Vinod, S.; S, Raju; Mallick, Sarada Prasanna; Gracewell J, Jeffin
Format: Conference Proceeding
Language: English
Published: IEEE, 02.03.2023

Summary: Human action recognition (HAR) in videos has grown rapidly as an academic field over the past few decades. Robotics, HCI, intelligent video surveillance, and sports video analysis are just some of the real-world applications of action recognition. Despite extensive study, many issues in this area remain unresolved. Multiple factors, such as the subject's position, speed, illumination, occlusion, viewpoint, and background clutter, make the task challenging. An efficient HAR system must accommodate these variations and rapidly identify the human action class. The primary steps in a HAR system are typically foreground segmentation, feature extraction, effective vector representation, and classification. This article proposes a novel method that employs a Convolutional Neural Network (CNN) to improve the HAR system's classification accuracy. The proposed activity representation and classification approach is evaluated on the public Weizmann, KTH, and Ballet Movement datasets. A comparison with competing methods shows that the proposed approach achieves higher recognition accuracy than existing ones: 98% on the Weizmann dataset and 95.6% on the KTH dataset.
DOI: 10.1109/ICEARS56392.2023.10085686
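
The abstract outlines the standard HAR pipeline: foreground segmentation, feature extraction, vector representation, and CNN-based classification. The paper's exact architecture is not given in this record, so the sketch below is only a minimal PyTorch illustration under assumed settings: single-channel 64x64 silhouette frames, two convolutional blocks, and the ten Weizmann action classes. The SilhouetteActionCNN name and all layer sizes are hypothetical, not the authors' configuration.

import torch
import torch.nn as nn

class SilhouetteActionCNN(nn.Module):
    """Minimal CNN classifier for binary silhouette frames (illustrative only)."""

    def __init__(self, num_classes: int = 10):  # assumed: 10 Weizmann action classes
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1),  # 1 input channel: a silhouette mask
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 64x64 -> 32x32
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 32x32 -> 16x16
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, 128),
            nn.ReLU(),
            nn.Linear(128, num_classes),                 # one logit per action class
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# Usage: classify a batch of 8 silhouette frames at the assumed 64x64 resolution.
model = SilhouetteActionCNN(num_classes=10)
frames = torch.rand(8, 1, 64, 64)   # stand-in for real segmented silhouettes
logits = model(frames)              # shape: (8, 10)
predictions = logits.argmax(dim=1)  # predicted action class per frame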