Validation of a Time-Distributed residual LSTM–CNN and BiLSTM for equine behavior recognition using collar-worn sensors
Published in: Computers and Electronics in Agriculture, Vol. 231, p. 109999
Format: Journal Article
Language: English
Published: Elsevier B.V., 01.04.2025
Summary:
• Deep neural network recognizes equine behavior from collar-worn sensor data.
• Framework combines local spatiotemporal and long-term temporal feature extraction.
• Model achieved > 93 % accuracy in 10-fold and > 85 % in leave-one-subject-out cross-validation.
• Performance varied across behaviors and housing conditions.
• Model recognized locomotion, resting and feeding behavior with high accuracy.
Equine daily behavior is a key welfare indicator, offering insights into how environmental and training conditions influence health and well-being. Continuous direct behavior observation, however, is labor-intensive and impractical for large-scale studies. While advances in wearable sensors and deep learning have revolutionized human and animal activity recognition, automated wearable sensor systems for recognizing a diverse repertoire of equine daily behaviors remain limited.
We propose a hierarchical deep learning framework that combines a Time-Distributed Residual LSTM-CNN, which extracts local spatiotemporal features from short subsegments of sensor data, with a bidirectional LSTM (BiLSTM), which captures long-term temporal dependencies. The model was validated on approximately 60 h of tri-axial accelerometer and gyroscope data collected from 10 horses wearing collar-mounted sensors; fifteen daily behaviors were labeled from video recordings. The model achieved an overall classification accuracy of > 93 % in 10-fold cross-validation and > 85 % in leave-one-subject-out cross-validation. Classification performance was significantly affected by housing conditions and the associated variation in behavior frequencies across the dataset.
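The abstract gives no implementation details, but the described hierarchy maps naturally onto a Keras-style model. The sketch below is purely illustrative, not the authors' code: the 10 × 50-sample subsegment split, the six input channels (tri-axial accelerometer plus gyroscope), and all layer widths are assumptions chosen for demonstration.

```python
# Illustrative sketch (not the authors' implementation): a Time-Distributed
# residual LSTM-CNN applied per subsegment, followed by a BiLSTM across
# subsegments. Shapes and layer sizes are assumptions.
from tensorflow.keras import layers, models

N_SUB, SUB_LEN, N_CHANNELS, N_CLASSES = 10, 50, 6, 15  # 6 = 3-axis accel + 3-axis gyro

def residual_lstm_cnn(sub_len, n_channels):
    """Local feature extractor applied independently to each subsegment."""
    inp = layers.Input(shape=(sub_len, n_channels))
    x = layers.Conv1D(64, 5, padding="same", activation="relu")(inp)
    shortcut = x
    x = layers.Conv1D(64, 5, padding="same", activation="relu")(x)
    x = layers.Add()([x, shortcut])   # residual (skip) connection
    x = layers.LSTM(64)(x)            # summarize the subsegment over time
    return models.Model(inp, x)

inputs = layers.Input(shape=(N_SUB, SUB_LEN, N_CHANNELS))
# Apply the same local extractor to every subsegment of the window.
x = layers.TimeDistributed(residual_lstm_cnn(SUB_LEN, N_CHANNELS))(inputs)
# Model long-term dependencies across the sequence of subsegment features.
x = layers.Bidirectional(layers.LSTM(64))(x)
outputs = layers.Dense(N_CLASSES, activation="softmax")(x)

model = models.Model(inputs, outputs)
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```

The TimeDistributed wrapper is what gives the architecture its two-level structure: the inner residual LSTM-CNN sees only one short subsegment at a time, while the outer BiLSTM sees the whole window as a sequence of subsegment embeddings.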
This study provides a validated framework for sensor-based automatic behavior recognition in horses, capable of capturing both local spatiotemporal and long-term temporal dependencies from raw sensor data. The proposed framework enables scalable, reliable monitoring of equine daily behaviors and contributes to the development of automated, data-driven approaches to equine welfare assessment.
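For the two reported validation schemes, here is a minimal scikit-learn sketch on synthetic placeholder arrays; the shapes, variable names, and per-horse grouping are assumptions, not the study's data. The subject-wise split is the stricter test, since windows from a held-out horse never appear in training, which is consistent with the lower accuracy reported for it.

```python
# Minimal sketch of the two validation schemes, using synthetic placeholder
# data; array shapes and names are assumptions, not the study's dataset.
import numpy as np
from sklearn.model_selection import StratifiedKFold, LeaveOneGroupOut

n_windows = 300
rng = np.random.default_rng(0)
X = rng.normal(size=(n_windows, 10, 50, 6))            # dummy sensor windows
y = np.tile(np.arange(15), n_windows // 15)            # 20 windows per behavior class
horse_ids = np.repeat(np.arange(10), n_windows // 10)  # 30 windows per horse

# 10-fold CV: windows from the same horse may land in both train and test,
# so it typically scores higher than a subject-wise split.
for train_idx, test_idx in StratifiedKFold(n_splits=10, shuffle=True,
                                           random_state=0).split(X, y):
    pass  # fit on X[train_idx], evaluate on X[test_idx]

# Leave-one-subject-out CV: all windows of one horse are held out per fold,
# testing generalization to unseen animals.
for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups=horse_ids):
    pass  # 10 folds, one per horse
```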
ISSN: 0168-1699
DOI: 10.1016/j.compag.2025.109999