Traffic Sign Detection via Improved Sparse R-CNN for Autonomous Vehicles

Bibliographic Details
Published in Journal of advanced transportation, Vol. 2022; pp. 1-16
Main Authors Liang, Tianjiao; Bao, Hong; Pan, Weiguo; Pan, Feng
Format Journal Article
Language English
Published London: Hindawi; John Wiley & Sons, Inc.; Wiley, 01.03.2022
Subjects
Online Access Get full text

Abstract Traffic sign detection is an important component of autonomous vehicles. There is still a mismatch between existing detection algorithms and their practical application in real traffic scenes, mainly due to limitations in detection accuracy and data acquisition. To tackle this problem, this study proposes an improved Sparse R-CNN that integrates a coordinate attention block with ResNeSt and builds a feature pyramid to modify the backbone, which enables the extracted features to focus on important information and improves detection accuracy. To obtain more diverse data, the augmentation method is specifically designed for complex traffic scenarios, and we also present a traffic sign dataset in this study. For on-road autonomous vehicles, we designed two modules, self-adaption augmentation (SAA) and detection time augmentation (DTA), to improve the robustness of the detection algorithm. Evaluations on traffic sign datasets and on-road testing demonstrate the accuracy and effectiveness of the proposed method.
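The backbone change described in the abstract hinges on coordinate attention, which pools features separately along the height and width axes so that channel attention retains positional information. The article's own implementation is not reproduced in this record; the following is a minimal PyTorch-style sketch of a generic coordinate attention block of that kind, where the class name, reduction ratio, and tensor shapes are illustrative assumptions rather than the authors' code.

```python
# Minimal sketch of a coordinate attention block (in the style of Hou et al.,
# CVPR 2021), NOT the paper's implementation; class name, reduction ratio,
# and shapes are illustrative assumptions.
import torch
import torch.nn as nn


class CoordinateAttention(nn.Module):
    def __init__(self, channels: int, reduction: int = 32):
        super().__init__()
        mid = max(8, channels // reduction)
        # Shared 1x1 conv over the concatenated H- and W-pooled descriptors.
        self.conv1 = nn.Conv2d(channels, mid, kernel_size=1)
        self.bn = nn.BatchNorm2d(mid)
        self.act = nn.ReLU(inplace=True)
        # Direction-specific 1x1 convs produce the attention maps.
        self.conv_h = nn.Conv2d(mid, channels, kernel_size=1)
        self.conv_w = nn.Conv2d(mid, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        n, c, h, w = x.shape
        # Pool along one spatial axis at a time so position along the other
        # axis is preserved (unlike plain SE-style channel attention).
        x_h = x.mean(dim=3, keepdim=True)                      # (n, c, h, 1)
        x_w = x.mean(dim=2, keepdim=True).permute(0, 1, 3, 2)  # (n, c, w, 1)
        y = self.act(self.bn(self.conv1(torch.cat([x_h, x_w], dim=2))))
        y_h, y_w = torch.split(y, [h, w], dim=2)
        a_h = torch.sigmoid(self.conv_h(y_h))                      # (n, c, h, 1)
        a_w = torch.sigmoid(self.conv_w(y_w.permute(0, 1, 3, 2)))  # (n, c, 1, w)
        return x * a_h * a_w  # broadcast gating over both spatial axes


if __name__ == "__main__":
    feat = torch.randn(2, 256, 64, 64)      # dummy backbone feature map
    out = CoordinateAttention(256)(feat)
    print(out.shape)                        # torch.Size([2, 256, 64, 64])
```

In the paper's design such a block would sit inside the ResNeSt stages that feed the feature pyramid; the sketch shows only the attention gating itself, applied to a dummy feature map.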
Audience Academic
Author Bao, Hong
Pan, Feng
Pan, Weiguo
Liang, Tianjiao
Author_xml – sequence: 1
  givenname: Tianjiao
  orcidid: 0000-0002-6062-6166
  surname: Liang
  fullname: Liang, Tianjiao
  organization: Beijing Key Laboratory of Information Service Engineering, Beijing Union University, Beijing, China (buu.edu.cn)
– sequence: 2
  givenname: Hong
  surname: Bao
  fullname: Bao, Hong
  organization: Beijing Key Laboratory of Information Service Engineering, Beijing Union University, Beijing, China (buu.edu.cn)
– sequence: 3
  givenname: Weiguo
  orcidid: 0000-0002-2293-1004
  surname: Pan
  fullname: Pan, Weiguo
  organization: Beijing Key Laboratory of Information Service Engineering, Beijing Union University, Beijing, China (buu.edu.cn)
– sequence: 4
  givenname: Feng
  orcidid: 0000-0002-7927-456X
  surname: Pan
  fullname: Pan, Feng
  organization: Beijing Key Laboratory of Information Service Engineering, Beijing Union University, Beijing, China (buu.edu.cn)
CitedBy_id crossref_primary_10_1049_ipr2_13141
crossref_primary_10_3390_app12125972
crossref_primary_10_3390_s22134833
crossref_primary_10_3390_ijgi13030104
crossref_primary_10_1109_ACCESS_2023_3266284
crossref_primary_10_3390_electronics12122739
crossref_primary_10_3390_rs14143498
crossref_primary_10_1109_ACCESS_2023_3332475
crossref_primary_10_3390_electronics12020305
crossref_primary_10_1109_ACCESS_2024_3349978
crossref_primary_10_3390_app12189366
crossref_primary_10_1109_ACCESS_2023_3293532
crossref_primary_10_1109_ACCESS_2024_3378748
crossref_primary_10_1109_ACCESS_2023_3324146
crossref_primary_10_1109_ACCESS_2025_3534321
crossref_primary_10_1109_ACCESS_2023_3329713
crossref_primary_10_1109_ACCESS_2024_3462629
crossref_primary_10_3390_app13105901
crossref_primary_10_3934_electreng_2023016
crossref_primary_10_1109_ACCESS_2023_3323618
crossref_primary_10_3390_s22218097
crossref_primary_10_1007_s00371_024_03287_5
crossref_primary_10_1038_s41598_025_94610_0
crossref_primary_10_1109_ACCESS_2024_3357781
crossref_primary_10_1016_j_oceaneng_2024_119600
crossref_primary_10_1109_ACCESS_2023_3333894
crossref_primary_10_1109_ACCESS_2024_3435384
crossref_primary_10_1109_ACCESS_2024_3437642
crossref_primary_10_1109_ACCESS_2023_3322371
crossref_primary_10_1109_ACCESS_2023_3289586
crossref_primary_10_3390_s25010230
crossref_primary_10_1109_ACCESS_2023_3347352
crossref_primary_10_1007_s11554_023_01403_7
crossref_primary_10_3390_app13074533
crossref_primary_10_3389_fbioe_2022_944944
crossref_primary_10_3389_frobt_2024_1212070
crossref_primary_10_1109_ACCESS_2023_3263479
crossref_primary_10_1109_ACCESS_2023_3321966
crossref_primary_10_1109_ACCESS_2022_3166923
crossref_primary_10_1109_ACCESS_2023_3306951
crossref_primary_10_1049_ipr2_13056
crossref_primary_10_1109_ACCESS_2023_3256723
crossref_primary_10_1109_ACCESS_2023_3339775
crossref_primary_10_3934_mbe_2023851
crossref_primary_10_1109_ACCESS_2023_3315589
crossref_primary_10_1155_2022_4285436
crossref_primary_10_1109_ACCESS_2025_3529289
crossref_primary_10_1109_ACCESS_2024_3470815
crossref_primary_10_3390_axioms13050335
Cites_doi 10.1109/CVPR.2016.91
10.1109/TPAMI.2020.3032166
10.1109/CVPR.2016.90
10.1109/CVPR46437.2021.01422
10.1109/CVPR.2014.81
10.1109/TITS.2012.2209421
10.1109/UBMYK48245.2019.8965590
10.1109/CVPR42600.2020.01160
10.1109/TPAMI.2019.2913372
10.1109/JIOT.2020.3034899
10.1109/tnsm.2021.3098157
10.1109/IJCNN.2013.6706807
10.1145/3065386
10.1109/CVPR46437.2021.01350
10.1109/ACCESS.2020.3047414
10.1109/CVPR.2018.00644
10.1007/978-3-319-46448-0_2
10.1609/aaai.v34i07.6999
10.1109/TPAMI.2016.2577031
10.1109/ACCESS.2021.3094201
10.1109/TPAMI.2018.2844175
10.1109/ACCESS.2021.3059052
10.1109/CVPR.2018.00647
10.1007/978-3-030-01234-2_1
10.1109/CVPR.2017.690
10.1109/CVPR.2016.232
10.1109/TITS.2019.2913588
10.1109/TCCN.2017.2758370
10.3390/a10040127
10.1007/978-3-030-58452-8_13
10.1109/CVPR.2005.177
10.1109/5254.708428
10.1109/TPAMI.2018.2858826
10.1109/ICOSP.2014.7015147
10.1109/CVPR.2017.634
10.1109/TNSM.2019.2899085
10.1007/978-3-319-48890-5_20
10.1109/ICCV.2015.169
ContentType Journal Article
Copyright Copyright © 2022 Tianjiao Liang et al.
COPYRIGHT 2022 John Wiley & Sons, Inc.
Copyright © 2022 Tianjiao Liang et al. This work is licensed under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
Copyright_xml – notice: Copyright © 2022 Tianjiao Liang et al.
– notice: COPYRIGHT 2022 John Wiley & Sons, Inc.
– notice: Copyright © 2022 Tianjiao Liang et al. This work is licensed under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
DBID RHU
RHW
RHX
AAYXX
CITATION
N95
3V.
7ST
7WY
7WZ
7XB
87Z
8FD
8FE
8FG
8FK
8FL
ABJCF
ABUWG
AEUYN
AFKRA
ARAPS
AZQEC
BENPR
BEZIV
BGLVJ
C1K
CCPQU
DWQXO
FR3
FRNLG
F~G
HCIFZ
K60
K6~
KR7
L.-
L6V
M0C
M7S
P5Z
P62
PHGZM
PHGZT
PIMPY
PKEHL
PQBIZ
PQBZA
PQEST
PQGLB
PQQKQ
PQUKI
PRINS
PTHSS
Q9U
SOI
DOA
DOI 10.1155/2022/3825532
DatabaseName Hindawi Publishing Complete
Hindawi Publishing Subscription Journals
Hindawi Publishing Open Access
CrossRef
Gale Business: Insights
ProQuest Central (Corporate)
Environment Abstracts
ABI/INFORM Collection
ABI/INFORM Global (PDF only)
ProQuest Central (purchase pre-March 2016)
ABI/INFORM Collection
Technology Research Database
ProQuest SciTech Collection
ProQuest Technology Collection
ProQuest Central (Alumni) (purchase pre-March 2016)
ABI/INFORM Collection (Alumni)
Materials Science & Engineering Collection
ProQuest Central (Alumni)
ProQuest One Sustainability
ProQuest Central UK/Ireland
Advanced Technologies & Aerospace Collection
ProQuest Central Essentials
ProQuest Central
Business Premium Collection
Technology Collection
Environmental Sciences and Pollution Management
ProQuest One Community College
ProQuest Central Korea
Engineering Research Database
Business Premium Collection (Alumni)
ABI/INFORM Global (Corporate)
SciTech Collection (ProQuest)
ProQuest Business Collection (Alumni Edition)
ProQuest Business Collection
Civil Engineering Abstracts
ABI/INFORM Professional Advanced
ProQuest Engineering Collection
ABI/INFORM Collection (ProQuest)
Engineering Database
Advanced Technologies & Aerospace Database
ProQuest Advanced Technologies & Aerospace Collection
ProQuest Central Premium
ProQuest One Academic (New)
Publicly Available Content Database
ProQuest One Academic Middle East (New)
ProQuest One Business
ProQuest One Business (Alumni)
ProQuest One Academic Eastern Edition (DO NOT USE)
ProQuest One Applied & Life Sciences
ProQuest One Academic
ProQuest One Academic UKI Edition
ProQuest Central China
Engineering Collection
ProQuest Central Basic
Environment Abstracts
DOAJ: Directory of Open Access Journal (DOAJ)
DatabaseTitle CrossRef
Publicly Available Content Database
ABI/INFORM Global (Corporate)
ProQuest Business Collection (Alumni Edition)
ProQuest One Business
Technology Collection
Technology Research Database
ProQuest One Academic Middle East (New)
ProQuest Advanced Technologies & Aerospace Collection
ProQuest Central Essentials
ProQuest Central (Alumni Edition)
SciTech Premium Collection
ProQuest One Community College
ProQuest Central China
ABI/INFORM Complete
Environmental Sciences and Pollution Management
ProQuest Central
ABI/INFORM Professional Advanced
ProQuest One Applied & Life Sciences
ProQuest One Sustainability
ProQuest Engineering Collection
ProQuest Central Korea
ProQuest Central (New)
ABI/INFORM Complete (Alumni Edition)
Engineering Collection
Advanced Technologies & Aerospace Collection
Business Premium Collection
Civil Engineering Abstracts
ABI/INFORM Global
Engineering Database
ABI/INFORM Global (Alumni Edition)
ProQuest Central Basic
ProQuest One Academic Eastern Edition
ProQuest Technology Collection
ProQuest SciTech Collection
ProQuest Business Collection
Advanced Technologies & Aerospace Database
ProQuest One Academic UKI Edition
Materials Science & Engineering Collection
ProQuest One Business (Alumni)
Engineering Research Database
ProQuest One Academic
Environment Abstracts
ProQuest One Academic (New)
ProQuest Central (Alumni)
Business Premium Collection (Alumni)
DatabaseTitleList CrossRef



Publicly Available Content Database
Database_xml – sequence: 1
  dbid: RHX
  name: Hindawi Publishing Open Access
  url: http://www.hindawi.com/journals/
  sourceTypes: Publisher
– sequence: 2
  dbid: DOA
  name: DOAJ Directory of Open Access Journals
  url: https://www.doaj.org/
  sourceTypes: Open Website
– sequence: 3
  dbid: 8FG
  name: ProQuest Technology Collection
  url: https://search.proquest.com/technologycollection1
  sourceTypes: Aggregation Database
DeliveryMethod fulltext_linktorsrc
Discipline Engineering
EISSN 2042-3195
Editor Yao, Zhihong
Editor_xml – sequence: 1
  givenname: Zhihong
  surname: Yao
  fullname: Yao, Zhihong
EndPage 16
ExternalDocumentID oai_doaj_org_article_b593da2cc5c248ad937380683f5794ad
A697663013
10_1155_2022_3825532
GeographicLocations China
GeographicLocations_xml – name: China
GrantInformation_xml – fundername: National Natural Science Foundation of China
  grantid: 61802019; 61932012; 61871039
– fundername: Beijing Municipal Education Commission Science and Technology
  grantid: KM201911417003; KM201911417009; KM201911417001
– fundername: Beijing Union University
  grantid: YZ2020K001; YZ2021K001
GroupedDBID -~X
..I
05W
0R~
1OC
24P
29J
3SF
4.4
52U
5GY
7WY
8-1
8FL
AAESR
AAFWJ
AAJEY
AAONW
ABDBF
ABJCF
ABUWG
ACCMX
ACIWK
ACNCT
ACUHS
ADBBV
ADIZJ
AENEX
AEUYN
AFBPY
AFKRA
AFPKN
AFRAH
AJXKR
ALMA_UNASSIGNED_HOLDINGS
ARAPS
ATUGU
AZVAB
BAAKF
BCNDV
BDRZF
BENPR
BEZIV
BGLVJ
BHBCM
BNHUX
BOGZA
BRXPI
CCPQU
DU5
DWQXO
EBS
ESX
FRNLG
G-S
GODZA
GROUPED_DOAJ
H13
HCIFZ
HZ~
I-F
IAO
IOF
ITC
LITHE
M0C
M7S
MY~
N95
O9-
OK1
P2P
PHGZT
PIMPY
PQBIZ
PQBZA
PTHSS
RHU
RHW
RHX
TN5
TUS
WBKPD
WH7
AAYXX
CITATION
PHGZM
PMFND
3V.
7ST
7XB
8FD
8FE
8FG
8FK
AAMMB
AEFGJ
AGXDD
AIDQK
AIDYY
AZQEC
C1K
FR3
K60
K6~
KR7
L.-
L6V
P62
PKEHL
PQEST
PQGLB
PQQKQ
PQUKI
PRINS
Q9U
SOI
PUEGO
ID FETCH-LOGICAL-c517t-8f5b5cd65bc9abf1b4c9d40accc786fb2698620d4beb786ecd91a00bfefb109b3
IEDL.DBID BENPR
ISSN 0197-6729
IngestDate Wed Aug 27 01:18:51 EDT 2025
Sun Jul 13 05:34:31 EDT 2025
Fri Jun 13 00:09:17 EDT 2025
Tue Jun 10 21:03:29 EDT 2025
Fri May 23 02:36:37 EDT 2025
Tue Jul 01 00:34:13 EDT 2025
Thu Apr 24 23:11:47 EDT 2025
Wed Apr 16 06:24:47 EDT 2025
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Language English
License This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
https://creativecommons.org/licenses/by/4.0
LinkModel DirectLink
MergedId FETCHMERGED-LOGICAL-c517t-8f5b5cd65bc9abf1b4c9d40accc786fb2698620d4beb786ecd91a00bfefb109b3
Notes ObjectType-Article-1
SourceType-Scholarly Journals-1
ObjectType-Feature-2
content type line 14
ORCID 0000-0002-7927-456X
0000-0002-6062-6166
0000-0002-2293-1004
OpenAccessLink https://www.proquest.com/docview/2638548232?pq-origsite=%requestingapplication%
PQID 2638548232
PQPubID 1006382
PageCount 16
ParticipantIDs doaj_primary_oai_doaj_org_article_b593da2cc5c248ad937380683f5794ad
proquest_journals_2638548232
gale_infotracgeneralonefile_A697663013
gale_infotracacademiconefile_A697663013
gale_businessinsightsgauss_A697663013
crossref_citationtrail_10_1155_2022_3825532
crossref_primary_10_1155_2022_3825532
hindawi_primary_10_1155_2022_3825532
ProviderPackageCode CITATION
AAYXX
PublicationCentury 2000
PublicationDate 2022-03-01
PublicationDateYYYYMMDD 2022-03-01
PublicationDate_xml – month: 03
  year: 2022
  text: 2022-03-01
  day: 01
PublicationDecade 2020
PublicationPlace London
PublicationPlace_xml – name: London
PublicationTitle Journal of advanced transportation
PublicationYear 2022
Publisher Hindawi
John Wiley & Sons, Inc
Wiley
Publisher_xml – name: Hindawi
– name: John Wiley & Sons, Inc
– name: Wiley
References 44
45
47
49
K. Simonyan (33) 2015
I. Loshchilov (46) 2019
X. Glorot (48)
50
10
11
12
13
D. Hendrycks (41) 2019
16
17
18
19
S. Sabour (24)
1
A. Vaswani (3)
2
5
6
7
8
H. Zhang (9) 2020
20
21
22
23
25
26
27
A. Dosovitskiy (4) 2021
28
K. Sun (29) 2019
30
31
32
A. Bochkovskiy (14) 2020
34
35
J. Park (38) 2018
36
37
39
J. Redmon (15) 2018
40
42
43
References_xml – year: 2015
  ident: 33
  article-title: Very deep convolutional networks for large-scale image recognition
– year: 2019
  ident: 46
  article-title: Decoupled weight decay regularization
– ident: 17
  doi: 10.1109/CVPR.2016.91
– ident: 20
  doi: 10.1109/TPAMI.2020.3032166
– ident: 34
  doi: 10.1109/CVPR.2016.90
– ident: 6
  doi: 10.1109/CVPR46437.2021.01422
– ident: 7
  doi: 10.1109/CVPR.2014.81
– year: 2020
  ident: 9
  article-title: Resnest: split-attention networks
– ident: 43
  doi: 10.1109/TITS.2012.2209421
– ident: 13
  doi: 10.1109/UBMYK48245.2019.8965590
– ident: 30
  doi: 10.1109/CVPR42600.2020.01160
– ident: 37
  doi: 10.1109/TPAMI.2019.2913372
– ident: 22
  doi: 10.1109/JIOT.2020.3034899
– ident: 50
  doi: 10.1109/tnsm.2021.3098157
– ident: 42
  doi: 10.1109/IJCNN.2013.6706807
– ident: 47
  doi: 10.1145/3065386
– ident: 40
  doi: 10.1109/CVPR46437.2021.01350
– ident: 28
  doi: 10.1109/ACCESS.2020.3047414
– year: 2020
  ident: 14
  article-title: Yolov4: optimal speed and accuracy of object detection
– ident: 32
  doi: 10.1109/CVPR.2018.00644
– ident: 18
  doi: 10.1007/978-3-319-46448-0_2
– ident: 35
  doi: 10.1609/aaai.v34i07.6999
– ident: 8
  doi: 10.1109/TPAMI.2016.2577031
– year: 2018
  ident: 38
  article-title: BAM: bottleneck attention module
– ident: 25
  doi: 10.1109/ACCESS.2021.3094201
– ident: 27
  doi: 10.1109/TPAMI.2018.2844175
– ident: 23
  doi: 10.1109/ACCESS.2021.3059052
– ident: 49
  doi: 10.1109/CVPR.2018.00647
– start-page: 3859
  ident: 24
  article-title: Dynamic routing between capsules
– ident: 39
  doi: 10.1007/978-3-030-01234-2_1
– ident: 16
  doi: 10.1109/CVPR.2017.690
– ident: 44
  doi: 10.1109/CVPR.2016.232
– ident: 26
  doi: 10.1109/TITS.2019.2913588
– ident: 2
  doi: 10.1109/TCCN.2017.2758370
– start-page: 6000
  ident: 3
  article-title: Attention is all you need
– ident: 45
  doi: 10.3390/a10040127
– ident: 5
  doi: 10.1007/978-3-030-58452-8_13
– ident: 11
  doi: 10.1109/CVPR.2005.177
– ident: 12
  doi: 10.1109/5254.708428
– ident: 19
  doi: 10.1109/TPAMI.2018.2858826
– start-page: 249
  ident: 48
  article-title: Understanding the difficulty of training deep feedforward neural networks
– year: 2019
  ident: 29
  article-title: High-resolution representations for labeling pixels and regions
– year: 2021
  ident: 4
  article-title: An image is worth 16x16 words: transformers for image recognition at scale
– ident: 10
  doi: 10.1109/ICOSP.2014.7015147
– year: 2019
  ident: 41
  article-title: Benchmarking neural network robustness to common corruptions and surface variations
– ident: 36
  doi: 10.1109/CVPR.2017.634
– ident: 1
  doi: 10.1109/TNSM.2019.2899085
– ident: 31
  doi: 10.1007/978-3-319-48890-5_20
– ident: 21
  doi: 10.1109/ICCV.2015.169
– year: 2018
  ident: 15
  article-title: Yolov3: an incremental improvement
SSID ssj0039594
Score 2.513597
Snippet Traffic sign detection is an important component of autonomous vehicles. There is still a mismatch problem between the existing detection algorithm and its...
SourceID doaj
proquest
gale
crossref
hindawi
SourceType Open Website
Aggregation Database
Enrichment Source
Index Database
Publisher
StartPage 1
SubjectTerms Accuracy
Algorithms
Augmentation
Autonomous vehicles
Candidates
Computer vision
Data acquisition
Data collection
Datasets
Deep learning
Design
Driverless cars
Feature extraction
Methods
Proposals
Semantics
Signs
Traffic control
Traffic signs
Transportation
Vehicles
Title Traffic Sign Detection via Improved Sparse R-CNN for Autonomous Vehicles
URI https://dx.doi.org/10.1155/2022/3825532
https://www.proquest.com/docview/2638548232
https://doaj.org/article/b593da2cc5c248ad937380683f5794ad
Volume 2022
hasFullText 1
inHoldings 1
isFullTextHit
isPrint