Multi-temporal image fusion empowered convolutional neural networks for recognition of 9 common mice actions
| Published in | Knowledge-based systems, Vol. 320, p. 113628 |
|---|---|
| Main Authors | |
| Format | Journal Article |
| Language | English |
| Published | Elsevier B.V., 23.06.2025 |
Summary: The study of complex behaviors and social interactions necessitates precise and efficient methodologies for the recognition and tracking of animal actions. However, existing methods for mice behavior recognition, such as depth perception and wearable devices, pose risks of physical harm to the subjects and exhibit limited cross-species applicability and low precision. To redress these deficiencies, this paper proposes a multi-temporal image fusion empowered Convolutional Neural Network (CNN) aimed at accurate and efficient recognition of nine common mice actions. In this study, we use mice recorded at various time points as subjects and adopt a multi-temporal approach to processing image sequences, integrating several frame-difference extraction techniques to address the limitations of single-frame prediction in capturing dynamic changes in actions. Subsequently, we combine a Deformable Convolution Network (DCN) with multi-stacked residual units to enhance the feature extraction capacity of the CNN, with particular focus on mice action contours, while mitigating the risk of overfitting. Furthermore, we investigate how effectively fused images derived from varying frame differences represent the nine actions, culminating in a robust mice action recognition model built through ensemble learning. Experimental findings demonstrate a precision of 92.9% in recognizing mice actions. The proposed method effectively eliminates background interference and exhibits strong generalization and adaptability.
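As a rough illustration of the multi-temporal frame-difference fusion described in the summary, the sketch below stacks the backward and forward frame differences around each frame into a single multi-channel image. This is not the authors' code: the frame step `k`, the three-channel layout, and the function name `frame_difference_fusion` are assumptions made for illustration only.

```python
# Minimal sketch (assumed design, not the paper's implementation) of fusing
# multi-temporal frame differences into one multi-channel image for a CNN.
import numpy as np

def frame_difference_fusion(frames: np.ndarray, k: int = 2) -> np.ndarray:
    """Fuse grayscale frames t-k, t, t+k into one 3-channel image.

    frames: array of shape (T, H, W), uint8 grayscale video frames.
    Returns an array of shape (T - 2*k, H, W, 3) whose channels are the
    backward difference, the current frame, and the forward difference.
    Differencing suppresses the static background so the moving mouse
    contour dominates the fused image.
    """
    f = frames.astype(np.int16)                 # avoid uint8 wrap-around
    backward = np.abs(f[k:-k] - f[:-2 * k])     # |I_t - I_{t-k}|
    forward = np.abs(f[2 * k:] - f[k:-k])       # |I_{t+k} - I_t|
    current = f[k:-k]
    fused = np.stack([backward, current, forward], axis=-1)
    return np.clip(fused, 0, 255).astype(np.uint8)

# Example: 100 frames of 128x128 video -> 96 fused 3-channel images for k=2.
video = np.random.randint(0, 256, size=(100, 128, 128), dtype=np.uint8)
print(frame_difference_fusion(video, k=2).shape)  # (96, 128, 128, 3)
```

Varying `k` changes how much motion each fused image captures, which is consistent with the summary's idea of training models on different frame differences and ensembling them.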
ISSN: 0950-7051
DOI: 10.1016/j.knosys.2025.113628
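The summary also mentions combining a Deformable Convolution Network (DCN) with multi-stacked residual units to focus feature extraction on action contours. The sketch below shows one plausible residual unit built on torchvision.ops.DeformConv2d; the channel width, kernel size, and overall layout are assumptions, not the paper's reported architecture.

```python
# Minimal sketch (assumed design, not the paper's architecture) of a residual
# unit around a deformable convolution: learned offsets let the kernel sample
# along the mouse's action contour rather than a fixed grid.
import torch
import torch.nn as nn
from torchvision.ops import DeformConv2d

class DeformableResidualUnit(nn.Module):
    def __init__(self, channels: int, kernel_size: int = 3):
        super().__init__()
        pad = kernel_size // 2
        # Offset branch: 2 values (dy, dx) per kernel sampling location.
        self.offset = nn.Conv2d(channels, 2 * kernel_size * kernel_size,
                                kernel_size, padding=pad)
        self.dconv = DeformConv2d(channels, channels, kernel_size, padding=pad)
        self.bn = nn.BatchNorm2d(channels)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        offset = self.offset(x)
        out = self.act(self.bn(self.dconv(x, offset)))
        return out + x  # residual connection eases training of stacked units

# Example: a batch of fused frame-difference images projected to 32 channels.
x = torch.randn(4, 32, 112, 112)
print(DeformableResidualUnit(32)(x).shape)  # torch.Size([4, 32, 112, 112])
```

Stacking several such units, training separate models on images fused with different frame steps, and ensembling their predictions would follow the overall recipe outlined in the summary.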