Applying Mixup for Time Series in Transformer-Based Human Activity Recognition
Published in: 2024 IEEE/WIC International Conference on Web Intelligence and Intelligent Agent Technology (WI-IAT), pp. 550–555
Format: Conference Proceeding
Language: English
Published: IEEE, 09.12.2024
DOI: 10.1109/WI-IAT62293.2024.00088
Summary: Transformer models have significantly advanced various areas of Artificial Intelligence and Machine Learning, including Computer Vision and Natural Language Processing. Despite their popularity in these fields, transformer-based models remain rare in Human Activity Recognition (HAR). In this research, we explore the application of transformer models to HAR, which involves time-series data collected from sensors attached to human subjects. We incorporate mixup data augmentation, a technique primarily used in vision and language tasks, modified for time-series activities to enhance activity detection while preserving the time-series characteristics of the data. We believe that HAR data is well-suited for mixup augmentation due to the low-resource nature of various everyday human activities. The results from our experiments show that activity recognition models benefit from mixup data augmentation even when operating on time-series data. Our methodology was tested on four different HAR datasets, and the results consistently demonstrated that mixup augmentation improved model accuracy. This study provides a novel approach to augmenting time-series data in HAR tasks, highlighting the potential of transformers with mixup data augmentation in improving activity recognition performance.
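
The abstract does not include the authors' implementation. As background, standard mixup (Zhang et al., 2018) draws a coefficient λ ~ Beta(α, α) and linearly blends pairs of examples and their labels: x̃ = λx_i + (1 − λ)x_j and ỹ = λy_i + (1 − λ)y_j. The sketch below shows one common way to apply this to windowed sensor data; the function name, array shapes, and α value are illustrative assumptions, not the paper's code.

```python
import numpy as np

def mixup_time_series(x, y, alpha=0.2, rng=None):
    """Blend pairs of sensor windows and their labels with mixup.

    x: array of shape (batch, time_steps, channels), windowed sensor data
    y: array of shape (batch, num_classes), one-hot activity labels
    alpha: Beta-distribution parameter; small values keep each mix close
           to one of the two original windows
    """
    if rng is None:
        rng = np.random.default_rng()
    lam = rng.beta(alpha, alpha)               # mixing coefficient in (0, 1)
    perm = rng.permutation(len(x))             # random partner for each window
    x_mixed = lam * x + (1.0 - lam) * x[perm]  # sample-wise linear interpolation
    y_mixed = lam * y + (1.0 - lam) * y[perm]  # matching soft labels
    return x_mixed, y_mixed

# Example: mix a batch of 32 three-axis accelerometer windows, 128 samples long,
# over 6 hypothetical activity classes
x = np.random.randn(32, 128, 3).astype(np.float32)
y = np.eye(6, dtype=np.float32)[np.random.randint(0, 6, size=32)]
x_aug, y_aug = mixup_time_series(x, y)
```

Because a single λ blends time-aligned windows sample by sample, each mixed window keeps a coherent temporal structure, which is consistent with the abstract's goal of preserving the time-series characteristics of the data; whether the authors use this exact variant is not stated in the abstract.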