Traffic flow digital twin generation for highway scenario based on radar-camera paired fusion

Bibliographic Details
Published in: Scientific reports, Vol. 13, No. 1, Article 642 (15 pages)
Main Authors: Li, Yanbing; Zhang, Weichuan
Format: Journal Article
Language: English
Published: London: Nature Publishing Group UK, 12.01.2023
Nature Publishing Group
Nature Portfolio

Abstract Autonomous driving is gradually moving from single-vehicle intelligence to the internet of vehicles, in which traffic participants can share the traffic flow information they perceive. When sensing technology is combined with the internet of vehicles, a sensor network covering the road can provide large-scale traffic flow data and thus a basis for building a traffic digital twin model. The digital twin enables the traffic system not only to use past and present information but also to predict traffic conditions, providing more effective optimization for autonomous driving and intelligent transportation, so that the overall traffic state can be planned rationally over the long term and the level of traffic intelligence is enhanced. The current mainstream traffic sensors, radar and camera, have complementary advantages, and fusing the two can provide more accurate traffic flow data for generating the digital twin model. In this paper, an end-to-end digital twin system implementation approach is proposed for highway scenarios. Starting from a paired radar-camera sensing system, a single-site radar-camera fusion framework is proposed; then, by defining a unified coordinate system, the traffic flow data from multiple sites are combined to form a dynamic, real-time traffic flow digital twin model. The effectiveness of the digital twin construction is verified on real-world traffic data.
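To make the two-stage workflow described in the abstract concrete, the following is a minimal, hypothetical sketch (not the authors' implementation): it pairs radar and camera detections at one roadside site with a simple nearest-neighbour gate, then rotates and translates the fused site-local tracks into a shared east-north road frame using an assumed surveyed site position and heading, so that tracks from multiple sites could be merged into one traffic-flow twin state. The data classes, the gating rule, the coordinate conventions, and all numeric values are illustrative assumptions.

```python
# Hedged sketch of single-site radar-camera fusion followed by mapping into a
# unified road frame, as outlined in the abstract. Not the paper's method.

import math
from dataclasses import dataclass


@dataclass
class RadarDet:          # site-local polar measurement from the radar
    rng_m: float
    azimuth_rad: float
    speed_mps: float


@dataclass
class CameraDet:         # camera detection projected onto the road plane
    x_m: float           # lateral position in the site-local frame
    y_m: float           # longitudinal position in the site-local frame
    label: str           # e.g. "car", "truck"


@dataclass
class FusedTrack:        # one vehicle expressed in the unified road frame
    east_m: float
    north_m: float
    speed_mps: float
    label: str


def fuse_single_site(radar, camera, gate_m=2.5):
    """Greedy nearest-neighbour pairing of radar and camera detections in the
    site-local Cartesian frame (a stand-in for the paper's fusion framework)."""
    fused, used = [], set()
    for r in radar:
        # radar polar measurement -> site-local Cartesian (x right, y forward)
        rx = r.rng_m * math.sin(r.azimuth_rad)
        ry = r.rng_m * math.cos(r.azimuth_rad)
        best, best_d = None, gate_m
        for i, c in enumerate(camera):
            if i in used:
                continue
            d = math.hypot(rx - c.x_m, ry - c.y_m)
            if d < best_d:
                best, best_d = i, d
        if best is not None:
            used.add(best)
            # keep radar position and speed, keep the camera's class label
            fused.append((rx, ry, r.speed_mps, camera[best].label))
    return fused


def to_unified_frame(local_tracks, site_east_m, site_north_m, site_heading_rad):
    """Rotate/translate site-local tracks into a shared east-north frame, given
    the site's surveyed position and boresight heading (assumed to be known)."""
    ch, sh = math.cos(site_heading_rad), math.sin(site_heading_rad)
    out = []
    for x, y, v, label in local_tracks:
        east = site_east_m + ch * x + sh * y
        north = site_north_m - sh * x + ch * y
        out.append(FusedTrack(east, north, v, label))
    return out


if __name__ == "__main__":
    # one radar-camera pair at a hypothetical site 500 m along the highway
    radar = [RadarDet(rng_m=40.0, azimuth_rad=0.05, speed_mps=28.0)]
    camera = [CameraDet(x_m=2.1, y_m=39.8, label="car")]
    site_tracks = fuse_single_site(radar, camera)
    twin_state = to_unified_frame(site_tracks, site_east_m=0.0,
                                  site_north_m=500.0, site_heading_rad=0.0)
    print(twin_state)   # tracks from all sites would be concatenated here
```

In this sketch the per-site fusion and the frame transform are deliberately decoupled, mirroring the abstract's separation between the single-site radar-camera fusion framework and the multi-site combination through a unified coordinate system.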
ArticleNumber 642
Author Zhang, Weichuan
Li, Yanbing
Author_xml – sequence: 1
  givenname: Yanbing
  orcidid: 0000-0003-3564-6111
  surname: Li
  fullname: Li, Yanbing
  email: ybli1@bjtu.edu.cn
  organization: School of Electronic and Information Engineering, Beijing Jiaotong University
– sequence: 2
  givenname: Weichuan
  surname: Zhang
  fullname: Zhang, Weichuan
  organization: Institute for Integrated and Intelligent Systems, Griffith University
BackLink https://www.ncbi.nlm.nih.gov/pubmed/36635372 (view this record in MEDLINE/PubMed)
ContentType Journal Article
Copyright The Author(s) 2023
2023. The Author(s).
The Author(s) 2023. This work is published under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
DOI 10.1038/s41598-023-27696-z
Discipline Biology
EISSN 2045-2322
EndPage 15
ExternalDocumentID oai_doaj_org_article_a4a8007e4bd24b75aef8047eaa64454d
PMC9837168
36635372
10_1038_s41598_023_27696_z
Genre Journal Article
GrantInformation_xml – fundername: the Fundamental Research Funds for the Central Universities 2022RC008
ISSN 2045-2322
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Issue 1
Language English
License 2023. The Author(s).
Open AccessThis article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
ORCID 0000-0003-3564-6111
OpenAccessLink http://journals.scholarsportal.info/openUrl.xqy?doi=10.1038/s41598-023-27696-z
PMID 36635372
PageCount 15
PublicationDate 2023-01-12
PublicationPlace London
PublicationTitle Scientific reports
PublicationTitleAbbrev Sci Rep
PublicationTitleAlternate Sci Rep
PublicationYear 2023
Publisher Nature Publishing Group UK
Nature Publishing Group
Nature Portfolio
StartPage 642
SubjectTerms 639/166
639/166/987
Cameras
Digital twins
Humanities and Social Sciences
Intelligence
multidisciplinary
Radar
Science
Science (multidisciplinary)
Sensors
Traffic flow
Title Traffic flow digital twin generation for highway scenario based on radar-camera paired fusion
URI https://link.springer.com/article/10.1038/s41598-023-27696-z
https://www.ncbi.nlm.nih.gov/pubmed/36635372
https://www.proquest.com/docview/2764914987
https://www.proquest.com/docview/2765774503
https://pubmed.ncbi.nlm.nih.gov/PMC9837168
https://doaj.org/article/a4a8007e4bd24b75aef8047eaa64454d
Volume 13