Sensor-Based Classification of Primary and Secondary Car Driver Activities Using Convolutional Neural Networks

Bibliographic Details
Published in: Sensors (Basel, Switzerland), Vol. 23, No. 12, p. 5551
Main Authors: Doniec, Rafał; Konior, Justyna; Sieciński, Szymon; Piet, Artur; Irshad, Muhammad Tausif; Piaseczna, Natalia; Hasan, Md Abid; Li, Frédéric; Nisar, Muhammad Adeel; Grzegorzek, Marcin
Format: Journal Article
Language: English
Published: Switzerland: MDPI AG, 13.06.2023
Summary: To drive safely, the driver must be aware of the surroundings, pay attention to the road traffic, and be ready to adapt to new circumstances. Most studies on driving safety focus on detecting anomalies in driver behavior and monitoring drivers' cognitive capabilities. In our study, we proposed a classifier for basic car-driving activities based on an approach that is also applicable to the recognition of basic activities of daily living, namely electrooculographic (EOG) signals and a one-dimensional convolutional neural network (1D CNN). Our classifier achieved an accuracy of 80% across the 16 primary and secondary activities. The accuracy for the driving-related categories (crossroad, parking, roundabout, and secondary activities) was 97.9%, 96.8%, 97.4%, and 99.5%, respectively. The F1 score for secondary driving activities (0.99) was higher than for primary driving activities (0.93-0.94). Furthermore, the same algorithm made it possible to distinguish four activities of daily living that occur as secondary activities when driving a car.
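The summary describes classifying windowed EOG signals into 16 primary and secondary driving activities with a one-dimensional convolutional neural network. The following PyTorch sketch is an illustration of what such a 1D CNN classifier could look like; the channel count, window length, layer sizes, and class count are assumptions for demonstration and are not taken from the paper.

    # Minimal sketch of a 1D CNN for windowed EOG activity classification.
    # Assumptions (not from the paper): 4 EOG channels, 128-sample windows,
    # 16 activity classes, and illustrative layer sizes.
    import torch
    import torch.nn as nn

    class EOGActivityCNN(nn.Module):
        def __init__(self, in_channels: int = 4, n_classes: int = 16, window_len: int = 128):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv1d(in_channels, 32, kernel_size=5, padding=2),  # temporal filters over EOG channels
                nn.ReLU(),
                nn.MaxPool1d(2),
                nn.Conv1d(32, 64, kernel_size=5, padding=2),
                nn.ReLU(),
                nn.MaxPool1d(2),
            )
            self.classifier = nn.Sequential(
                nn.Flatten(),
                nn.Linear(64 * (window_len // 4), 128),
                nn.ReLU(),
                nn.Linear(128, n_classes),  # one logit per primary/secondary activity
            )

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, channels, samples)
            return self.classifier(self.features(x))

    # Example forward pass on a random batch of EOG windows.
    model = EOGActivityCNN()
    dummy = torch.randn(8, 4, 128)
    logits = model(dummy)            # shape: (8, 16)
    print(logits.argmax(dim=1))      # predicted activity index per window

In practice, the windows would be cut from recorded EOG sessions, labeled with the corresponding activity, and the model trained with a standard cross-entropy loss.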
ISSN: 1424-8220
DOI: 10.3390/s23125551