Development of Deep Neural Network Model for the Prediction of Road Crashes in Real Time

Bibliographic Details
Published in: Journal of Engineering Research and Reports, pp. 25-33
Main Authors: Olokun, M. S., Ipindola, O. O., Oyediji, F. T., Falana, J. N.
Format: Journal Article
Language: English
Published: 30.06.2022

Summary: Road safety remains a global concern, with deaths and injuries from road traffic accidents estimated to reach 1.5 million and 50 million respectively by 2025. Despite crashes being predictable and largely preventable, the trend of road traffic crashes is on the rise in Nigeria, with an annual average of 33.7 deaths per 100,000 people. Proactive techniques such as real-time traffic and crash prediction have the potential to reduce the likelihood of crashes and to improve post-crash response. In this study, a GoogleNet Convolutional Neural Network was developed to classify road conditions and predict crashes along the Ondo-Akure single-carriageway highway in Nigeria. Traffic flow relationships were established from empirical data collected by video technique and compared to the Greenshields, Greenberg and Underwood models; the results were generally satisfactory, with an average coefficient of correlation of 0.96. The developed GoogleNet Convolutional network performed satisfactorily at predicting the probability of different traffic conditions: congested traffic (0.98), free-flowing traffic (0.64) and traffic crash (0.94). The developed algorithm can be integrated with traffic cameras, and with crowd-sourced images in areas beyond the reach of surveillance cameras and sensors, to report traffic conditions in real time.
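The Greenshields model referenced in the summary assumes a linear speed-density relation, v = vf(1 - k/kj), where vf is free-flow speed and kj is jam density. A minimal sketch of fitting that model and computing a correlation coefficient of the kind the study reports (average r of 0.96) is shown below; the speed-density observations here are hypothetical stand-ins, since the paper's video-survey data are not reproduced in this record.

```python
import numpy as np

# Hypothetical speed-density observations (speed in km/h, density in veh/km).
# The study collected such data via video technique; these values are
# illustrative only.
density = np.array([10, 25, 40, 55, 70, 85, 100], dtype=float)
speed = np.array([72, 61, 50, 38, 27, 15, 5], dtype=float)

# Greenshields: v = vf * (1 - k/kj), i.e. a straight line v = a + b*k
# with intercept a = vf and slope b = -vf/kj. Fit by least squares.
b, a = np.polyfit(density, speed, 1)
vf = a            # free-flow speed (km/h)
kj = -a / b       # jam density (veh/km)

# Correlation between observed and fitted speeds, analogous to the
# paper's reported average coefficient of correlation.
fitted = a + b * density
r = np.corrcoef(speed, fitted)[0, 1]

# Under Greenshields, maximum flow occurs at k = kj/2, v = vf/2,
# giving capacity q_max = vf * kj / 4 (veh/h).
q_max = vf * kj / 4
print(f"vf={vf:.1f} km/h, kj={kj:.1f} veh/km, r={r:.3f}, q_max={q_max:.0f} veh/h")
```

The Greenberg (logarithmic) and Underwood (exponential) models mentioned alongside Greenshields can be fitted the same way after transforming the speed axis (fitting v against ln k, and ln v against k, respectively).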
ISSN: 2582-2926
DOI: 10.9734/jerr/2022/v22i1017570