Traffic sign recognition based on deep learning
Published in | Multimedia Tools and Applications, Vol. 81, No. 13, pp. 17779-17791 |
---|---|
Main Authors | , |
Format | Journal Article |
Language | English |
Published | New York: Springer US, 01.05.2022; Springer Nature B.V. |
Summary: | Intelligent transportation systems (ITS), including unmanned vehicles, have gradually matured and are increasingly deployed on the road. How to eliminate interference from various environmental factors and carry out accurate, efficient traffic sign detection and recognition is a key technical problem. Traditional visual object recognition relies mainly on hand-crafted visual features, e.g., color and edges, which has limitations. Convolutional neural networks (CNNs), designed for visual object recognition based on deep learning, have successfully overcome the shortcomings of conventional object recognition. In this paper, we evaluate the latest version of YOLOv5 on our own dataset for Traffic Sign Recognition (TSR), examining how suitable a deep learning model for visual object recognition is for TSR through a comprehensive comparison with SSD (single shot multibox detector). In the experimental results, YOLOv5 achieves 97.70% mAP@0.5 across all classes, while SSD obtains 90.14% mAP on the same measure. YOLOv5 also outperforms SSD in recognition speed. |
---|---|
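The mAP@0.5 figures reported in the summary count a detection as a true positive when its intersection-over-union (IoU) with a ground-truth box is at least 0.5. A minimal sketch of that criterion, with illustrative box coordinates that are not taken from the paper's dataset:

```python
# Sketch of the IoU >= 0.5 matching rule behind mAP@0.5.
# Boxes are (x1, y1, x2, y2) in pixel coordinates; values are made up.

def iou(a, b):
    # Overlap rectangle of the two boxes.
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    # Union = sum of areas minus overlap.
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

pred = (10, 10, 50, 50)   # hypothetical predicted box
gt = (12, 12, 48, 52)     # hypothetical ground-truth box
print(iou(pred, gt) >= 0.5)  # True: counted as a true positive at IoU 0.5
```

At this threshold, per-class average precision is computed from the resulting true/false positives and then averaged over classes to give the mAP@0.5 values compared in the paper.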
Bibliography: | ObjectType-Article-1 SourceType-Scholarly Journals-1 ObjectType-Feature-2 content type line 14 |
ISSN: | 1380-7501 1573-7721 |
DOI: | 10.1007/s11042-022-12163-0 |