Synthetic Distracted Driving (SynDD2) dataset for analyzing distracted behaviors and various gaze zones of a driver


Bibliographic Details
Main Authors: Rahman, Mohammed Shaiqur; Wang, Jiyang; Gursoy, Senem Velipasalar; Anastasiu, David; Wang, Shuo; Sharma, Anuj
Format: Journal Article
Language: English
Published: 17.04.2022

Summary: This article presents a synthetic distracted driving (SynDD2, a continuation of SynDD1) dataset for machine learning models to detect and analyze drivers' various distracted behaviors and different gaze zones. We collected the data in a stationary vehicle using three in-vehicle cameras positioned on the dashboard, near the rearview mirror, and at the top right-side window corner. The dataset contains two activity types for each participant, distracted activities and gaze zones, and each activity type has two sets: without appearance blocks and with appearance blocks, such as wearing a hat or sunglasses. The order and duration of each activity are random for each participant. In addition, the dataset contains manual annotations for each activity, with its start and end times marked. Researchers can use this dataset to evaluate the performance of machine learning algorithms in classifying drivers' various distracting activities and gaze zones.
DOI: 10.48550/arxiv.2204.08096
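
Because each activity in the dataset is annotated with a start and end time, a typical evaluation compares a model's predicted activity intervals against the annotated ones. The sketch below is only illustrative and assumes a simple CSV annotation layout: the file names and the column names video_id, activity, start_s, and end_s are assumptions for the example, not the dataset's actual schema, and temporal IoU matching is one common criterion rather than an official benchmark metric.

```python
# Hypothetical sketch: comparing predicted activity intervals against
# SynDD2-style start/end annotations. File names and column names
# (video_id, activity, start_s, end_s) are assumptions; consult the
# dataset's own annotation files for the real schema.
import csv
from collections import defaultdict


def load_intervals(path):
    """Read (video_id, activity) -> list of (start, end) intervals in seconds."""
    intervals = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            key = (row["video_id"], row["activity"])
            intervals[key].append((float(row["start_s"]), float(row["end_s"])))
    return intervals


def temporal_iou(a, b):
    """Intersection-over-union of two (start, end) time intervals."""
    inter = max(0.0, min(a[1], b[1]) - max(a[0], b[0]))
    union = (a[1] - a[0]) + (b[1] - b[0]) - inter
    return inter / union if union > 0 else 0.0


def match_rate(gt, pred, iou_threshold=0.5):
    """Fraction of ground-truth intervals matched by a prediction of the
    same activity with temporal IoU at or above the threshold."""
    total, matched = 0, 0
    for key, gt_list in gt.items():
        for g in gt_list:
            total += 1
            if any(temporal_iou(g, p) >= iou_threshold for p in pred.get(key, [])):
                matched += 1
    return matched / total if total else 0.0


if __name__ == "__main__":
    gt = load_intervals("annotations_ground_truth.csv")  # assumed file name
    pred = load_intervals("annotations_predicted.csv")   # assumed file name
    print(f"Matched {match_rate(gt, pred):.2%} of annotated activities at IoU >= 0.5")
```

Interval-level matching of this kind is only one way to use the annotations; frame-level accuracy or per-activity duration error could be computed from the same start/end times, depending on the evaluation a study requires.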