Driver identification using only the CAN-Bus vehicle data through an RCN deep learning approach

Bibliographic Details
Published in: Robotics and Autonomous Systems, Vol. 136, p. 103707
Main Authors: Abdennour, N., Ouni, T., Ben Amor, N.
Format: Journal Article
Language: English
Published: Elsevier B.V., 01.02.2021

Summary: In recent years, many studies have claimed that humans have a unique driving style that can serve as a fingerprint for recognizing the identity of the driver. With the rise of Machine Learning (ML), research efforts aiming to take advantage of these human driving-style identifiers have grown rapidly. For Advanced Driver Assistance Systems (ADAS), this attribute can help ensure the security and protection of the vehicle. Additionally, it extends ADAS capabilities by creating separate profiles for the drivers, allowing the system to assist each driver according to their own driving style and improving ADAS fidelity. Nonetheless, the unpredictability of human behavior and the difficulty of capturing the temporal features of the signals remain ongoing challenges for driver identification. In this paper, we propose a novel deep learning approach to driver identification based on a Residual Convolutional Network (RCN). The approach outperforms existing state-of-the-art methods with less than two hours of training, while achieving 99.3% accuracy. The data used come exclusively from the vehicle's Controller Area Network (CAN-Bus), which eliminates privacy-invasion concerns for the user.
ISSN: 0921-8890, 1872-793X
DOI: 10.1016/j.robot.2020.103707
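The record above names a Residual Convolutional Network (RCN) trained on CAN-Bus signals but gives no architectural details. The sketch below is only a generic illustration of a residual 1-D convolutional classifier over fixed-length windows of multivariate CAN-Bus signals; the layer widths, kernel sizes, window length, number of CAN channels, and number of drivers are assumptions made for the example, not values taken from the paper.

# Illustrative sketch only: a generic residual 1-D CNN that classifies
# fixed-length windows of multivariate CAN-Bus signals by driver identity.
# All hyperparameters here are assumptions, not taken from the paper.
import torch
import torch.nn as nn


class ResidualBlock1D(nn.Module):
    """Two 1-D convolutions with a skip connection (the 'residual' part)."""

    def __init__(self, channels: int, kernel_size: int = 5):
        super().__init__()
        padding = kernel_size // 2
        self.body = nn.Sequential(
            nn.Conv1d(channels, channels, kernel_size, padding=padding),
            nn.BatchNorm1d(channels),
            nn.ReLU(),
            nn.Conv1d(channels, channels, kernel_size, padding=padding),
            nn.BatchNorm1d(channels),
        )
        self.act = nn.ReLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.act(self.body(x) + x)  # add the input back in (skip connection)


class DriverRCN(nn.Module):
    """Residual convolutional classifier over windows of CAN-Bus signals."""

    def __init__(self, n_signals: int = 16, n_drivers: int = 10, width: int = 64):
        super().__init__()
        self.stem = nn.Sequential(
            nn.Conv1d(n_signals, width, kernel_size=7, padding=3),
            nn.BatchNorm1d(width),
            nn.ReLU(),
        )
        self.blocks = nn.Sequential(*[ResidualBlock1D(width) for _ in range(3)])
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool1d(1),  # average over the time axis
            nn.Flatten(),
            nn.Linear(width, n_drivers),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_signals, window_length)
        return self.head(self.blocks(self.stem(x)))


if __name__ == "__main__":
    model = DriverRCN(n_signals=16, n_drivers=10)
    window = torch.randn(8, 16, 256)  # 8 windows, 16 CAN channels, 256 samples each
    logits = model(window)            # (8, 10) scores, one per candidate driver
    print(logits.shape)

In a setup of this kind, each training example would be a short window of synchronized CAN-Bus channels (e.g. speed, engine RPM, pedal and steering values) labeled with the driver's identity, and the skip connections in the blocks are what make the network "residual".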