Sensor and Sensor Fusion Technology in Autonomous Vehicles: A Review
Published in | Sensors (Basel, Switzerland), Vol. 21, No. 6, p. 2140 |
Main Authors | De Jong Yeong; Gustavo Velasco-Hernandez; John Barry; Joseph Walsh |
Format | Journal Article |
Language | English |
Published | Switzerland: MDPI (MDPI AG), 18 March 2021 |
Abstract | With the significant advancement of sensor and communication technology and the reliable application of obstacle detection techniques and algorithms, automated driving is becoming a pivotal technology that can revolutionize the future of transportation and mobility. Sensors are fundamental to the perception of vehicle surroundings in an automated driving system, and the use and performance of multiple integrated sensors can directly determine the safety and feasibility of automated driving vehicles. Sensor calibration is the foundation block of any autonomous system and its constituent sensors and must be performed correctly before sensor fusion and obstacle detection processes may be implemented. This paper evaluates the capabilities and the technical performance of sensors which are commonly employed in autonomous vehicles, primarily focusing on a large selection of vision cameras, LiDAR sensors, and radar sensors and the various conditions in which such sensors may operate in practice. We present an overview of the three primary categories of sensor calibration and review existing open-source calibration packages for multi-sensor calibration and their compatibility with numerous commercial sensors. We also summarize the three main approaches to sensor fusion and review current state-of-the-art multi-sensor fusion techniques and algorithms for object detection in autonomous driving applications. The current paper, therefore, provides an end-to-end review of the hardware and software methods required for sensor fusion object detection. We conclude by highlighting some of the challenges in the sensor fusion field and propose possible future research directions for automated driving systems. |
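The calibration step the abstract refers to can be illustrated with a minimal sketch (not taken from the paper; all calibration numbers below are hypothetical): a rigid-body transform, obtained from extrinsic calibration, maps a 3D point from the LiDAR frame into the camera frame, and the camera's intrinsic parameters then project it onto the image plane.

```python
# Minimal sketch (illustrative only, not from the paper): projecting a LiDAR
# point into a camera image. Extrinsic calibration supplies the rotation R and
# translation t between the two sensor frames; intrinsic calibration supplies
# the focal lengths (fx, fy) and principal point (cx, cy) of the camera.

def lidar_to_pixel(p_lidar, R, t, fx, fy, cx, cy):
    """Transform a 3D LiDAR point into the camera frame, then project it
    with the pinhole model. Returns (u, v) pixel coordinates."""
    # Rigid-body transform: p_cam = R * p_lidar + t
    p_cam = [sum(R[i][j] * p_lidar[j] for j in range(3)) + t[i]
             for i in range(3)]
    x, y, z = p_cam
    if z <= 0:
        raise ValueError("point is behind the camera")
    # Pinhole projection onto the image plane
    u = fx * x / z + cx
    v = fy * y / z + cy
    return u, v

# Hypothetical calibration: identity rotation, small translation, VGA camera.
R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
t = [0.1, 0.0, 0.2]
u, v = lidar_to_pixel([1.0, 0.5, 10.0], R, t, fx=500, fy=500, cx=320, cy=240)
```

In practice, R and t would come from one of the extrinsic calibration routines (for example, target-based chessboard methods) surveyed in the paper, and fx, fy, cx, cy from intrinsic calibration; the sketch only shows how the two sets of parameters combine.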
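One of the fusion approaches the abstract mentions, measurement-level (low-level) fusion, can be sketched in a few lines (illustrative only, not the paper's method; the sensor variances below are hypothetical): two noisy range readings of the same obstacle are combined by inverse-variance weighting, so the more reliable sensor dominates the fused estimate.

```python
# Minimal sketch (not from the paper): fusing two range measurements of the
# same obstacle, e.g. one from LiDAR and one from radar, by inverse-variance
# weighting -- a basic form of measurement-level sensor fusion.

def fuse_measurements(z1, var1, z2, var2):
    """Combine two noisy scalar measurements into one estimate.

    Each measurement is weighted by its inverse variance, so the more
    certain sensor dominates; the fused variance is always smaller than
    either input variance.
    """
    w1 = 1.0 / var1
    w2 = 1.0 / var2
    fused = (w1 * z1 + w2 * z2) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)
    return fused, fused_var

# Hypothetical readings: LiDAR reports 10.2 m (low noise),
# radar reports 10.8 m (higher noise).
distance, variance = fuse_measurements(10.2, 0.01, 10.8, 0.25)
```

Because the LiDAR variance is much smaller here, the fused distance lands close to the LiDAR reading; higher-level fusion schemes instead combine per-sensor detections or tracks rather than raw measurements.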
Author | Yeong, De Jong; Velasco-Hernandez, Gustavo; Walsh, Joseph; Barry, John |
AuthorAffiliation | (1) IMaR Research Centre, Munster Technological University, V92 CX88 Tralee, Ireland; gustavo.velascohernandez@staff.ittralee.ie (G.V.-H.); john.barry@staff.ittralee.ie (J.B.); joseph.walsh@staff.ittralee.ie (J.W.); (2) School of Science, Technology, Engineering and Mathematics, Munster Technological University, V92 CX88 Tralee, Ireland; (3) Lero—Science Foundation Ireland Research Centre for Software, V92 NYD3 Limerick, Ireland |
Copyright | 2021 by the authors. |
DOI | 10.3390/s21062140 |
Discipline | Engineering |
EISSN | 1424-8220 |
ExternalDocumentID | PMC8003231; PMID 33803889 |
Genre | Journal Article; Review |
GrantInformation | Science Foundation Ireland, grant 13/RC/2094_P2 |
ISSN | 1424-8220 |
IsDoiOpenAccess | true |
IsOpenAccess | true |
IsPeerReviewed | true |
IsScholarly | true |
Issue | 6 |
Keywords | autonomous vehicles; radar; obstacle detection; sensor fusion; lidar; camera calibration; perception; self-driving cars |
License | CC BY 4.0 (https://creativecommons.org/licenses/by/4.0). Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license. |
ORCID | 0000-0002-2177-6348; 0000-0002-6756-3700; 0000-0002-4626-8040 |
OpenAccessLink | http://journals.scholarsportal.info/openUrl.xqy?doi=10.3390/s21062140 |
PMID | 33803889 |
PublicationDateYYYYMMDD | 2021-03-18 |
PublicationPlace | Switzerland |
PublicationTitle | Sensors (Basel, Switzerland) |
PublicationTitleAlternate | Sensors (Basel) |
PublicationYear | 2021 |
Publisher | MDPI (MDPI AG) |
– ident: ref_207 – ident: ref_58 – ident: ref_105 doi: 10.1109/PLANS46316.2020.9109873 – ident: ref_76 doi: 10.3390/s20113309 – ident: ref_22 doi: 10.1109/ISSC49989.2020.9180186 – ident: ref_52 – ident: ref_197 – ident: ref_185 doi: 10.3390/s19030648 – ident: ref_41 – ident: ref_66 doi: 10.1109/SSRR50563.2020.9292595 – ident: ref_63 doi: 10.3390/app9194093 – ident: ref_146 doi: 10.1109/ICRA.2019.8794186 – ident: ref_212 doi: 10.1016/j.patcog.2020.107332 – ident: ref_107 – ident: ref_17 – ident: ref_131 – ident: ref_184 doi: 10.3390/s20082350 – ident: ref_119 – ident: ref_34 – ident: ref_51 doi: 10.1109/SBR-LARS-R.2017.8215269 – ident: ref_190 doi: 10.3390/s18040957 – volume: 38 start-page: 1524 year: 2019 ident: ref_38 article-title: Autonomous Navigation of mobile robots in factory environment publication-title: Procedia Manuf. doi: 10.1016/j.promfg.2020.01.134 – ident: ref_188 doi: 10.1109/CVPR.2016.91 – ident: ref_40 – ident: ref_153 – ident: ref_102 – ident: ref_125 – ident: ref_127 doi: 10.3390/s19071624 – ident: ref_18 – ident: ref_130 – ident: ref_96 – volume: 34 start-page: 1 year: 2017 ident: ref_43 article-title: Team VALOR’s ESCHER: A Novel Electromechanical Biped for the DARPA Robotics Challenge publication-title: J. Field Robot. doi: 10.1002/rob.21697 – ident: ref_79 – ident: ref_164 – ident: ref_46 – ident: ref_12 – ident: ref_85 – ident: ref_170 – ident: ref_151 doi: 10.1109/LRA.2019.2921648 – ident: ref_158 – ident: ref_91 – ident: ref_147 – ident: ref_57 – volume: 21 start-page: 675 year: 2020 ident: ref_26 article-title: A Survey on multi-sensor fusion based obstacle detection for intelligent ground vehicles in off-road environments publication-title: Front. Inform. Technol. Electron. Eng. doi: 10.1631/FITEE.1900518 – ident: ref_30 doi: 10.3390/s19204357 – ident: ref_195 doi: 10.1109/CVPR.2016.90 |
SecondaryResourceType | review_article |
Snippet | With the significant advancement of sensor and communication technology and the reliable application of obstacle detection techniques and algorithms, automated... |
SourceID | doaj; pubmedcentral; proquest; pubmed; crossref |
SourceType | Open Website; Open Access Repository; Aggregation Database; Index Database; Enrichment Source |
StartPage | 2140 |
SubjectTerms | autonomous vehicles; camera; lidar; perception; radar; review; self-driving cars |
Title | Sensor and Sensor Fusion Technology in Autonomous Vehicles: A Review |
URI | https://www.ncbi.nlm.nih.gov/pubmed/33803889 https://www.proquest.com/docview/2508567972 https://pubmed.ncbi.nlm.nih.gov/PMC8003231 https://doaj.org/article/0d4a13ce4f1b4c278c758624703783d5 |
Volume | 21 |
Citation | De Jong Yeong; Gustavo Velasco-Hernandez; John Barry; Joseph Walsh (18.03.2021). Sensor and Sensor Fusion Technology in Autonomous Vehicles: A Review. Sensors (Basel, Switzerland), MDPI AG, vol. 21, no. 6, p. 2140. eISSN 1424-8220. doi:10.3390/s21062140 |