Transportation Mode Detection Using Temporal Convolutional Networks Based on Sensors Integrated into Smartphones

Bibliographic Details
Published in: Sensors (Basel, Switzerland), Vol. 22, no. 17, p. 6712
Main Authors: Wang, Pu; Jiang, Yongguo
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 01.09.2022

Summary: In recent years, advances in science and technology have given people ever more options for daily travel. Transportation mode detection, which underpins a range of mobile intelligent services, has therefore become increasingly important for fine-grained human activity recognition. Although much work has been done on transportation mode detection, achieving accurate and reliable detection remains challenging. In this paper, we propose a novel transportation mode detection algorithm, T2Trans, based on a temporal convolutional network (TCN), which employs multiple lightweight sensors integrated into a smartphone. Learning feature representations of the preprocessed multi-sensor data with temporal convolutional networks improves detection accuracy and learning efficiency. Extensive experiments demonstrate that our algorithm attains a macro F1-score of 86.42% on the real-world SHL dataset and 88.37% on the HTC dataset, with an average accuracy of 86.37% on SHL and 89.13% on HTC. Our model identifies eight transportation modes (stationary, walking, running, cycling, car, bus, subway, and train) more accurately and outperforms other benchmark algorithms.
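The paper's T2Trans implementation is not reproduced in this record. As a rough illustration of the building block the abstract names, a temporal convolutional network applying dilated causal convolutions to windows of smartphone sensor data and mapping the result to eight transport-mode logits, here is a minimal NumPy sketch. All shapes, layer counts, dilation choices, and names are assumptions for illustration, not the authors' architecture, and the weights are untrained random values.

```python
import numpy as np

def causal_dilated_conv1d(x, w, dilation):
    """Dilated causal 1D convolution.
    x: (T, C_in) sensor window; w: (K, C_in, C_out) kernel.
    Output at time t depends only on x[t], x[t-d], ..., x[t-(K-1)d]."""
    T, C_in = x.shape
    K, _, C_out = w.shape
    pad = (K - 1) * dilation
    xp = np.vstack([np.zeros((pad, C_in)), x])  # left-pad so the conv stays causal
    out = np.zeros((T, C_out))
    for t in range(T):
        for k in range(K):
            out[t] += xp[t + pad - k * dilation] @ w[K - 1 - k]
    return out

def tcn_forward(x, layers, w_out):
    """Stack dilated conv layers (dilations 1, 2, 4, ...) with ReLU,
    then average over time and project to per-class logits."""
    h = x
    for w, d in layers:
        h = np.maximum(causal_dilated_conv1d(h, w, d), 0.0)  # ReLU
    pooled = h.mean(axis=0)   # global average pooling over the time axis
    return pooled @ w_out     # (n_classes,) logits

# The eight modes listed in the abstract.
MODES = ["still", "walk", "run", "bike", "car", "bus", "subway", "train"]

rng = np.random.default_rng(0)
T, C = 128, 6   # assumed window: e.g. 3-axis accelerometer + 3-axis gyroscope
x = rng.standard_normal((T, C))
layers = [(rng.standard_normal((3, C, 16)) * 0.1, 1),
          (rng.standard_normal((3, 16, 16)) * 0.1, 2),
          (rng.standard_normal((3, 16, 16)) * 0.1, 4)]
w_out = rng.standard_normal((16, len(MODES))) * 0.1
logits = tcn_forward(x, layers, w_out)
pred = MODES[int(np.argmax(logits))]
print(logits.shape, pred)
```

With kernel size 3 and dilations 1, 2, 4, each output step sees the previous 15 samples, which is how a TCN trades depth for receptive field without recurrence.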
ISSN: 1424-8220
DOI: 10.3390/s22176712