Object detection using YOLO: challenges, architectural successors, datasets and applications
Published in | Multimedia tools and applications Vol. 82; no. 6; pp. 9243 - 9275 |
---|---|
Main Authors | Diwan, Tausif; Anirudh, G.; Tembhurne, Jitendra V. |
Format | Journal Article |
Language | English |
Published | New York: Springer US, 01.03.2023 (Springer Nature B.V.) |
Abstract | Object detection is one of the predominant and challenging problems in computer vision. Over the past decade, with the expeditious evolution of deep learning, researchers have extensively experimented with and contributed to the performance enhancement of object detection and related tasks such as object classification, localization, and segmentation using underlying deep models. Broadly, object detectors are classified into two categories, viz. two-stage and single-stage object detectors. Two-stage detectors mainly focus on a selective region-proposal strategy via complex architectures; single-stage detectors instead consider all spatial positions as candidate regions for possible objects, in one shot, via a relatively simpler architecture. The performance of any object detector is evaluated through detection accuracy and inference time. Generally, the detection accuracy of two-stage detectors exceeds that of single-stage detectors, whereas the inference time of single-stage detectors is better than that of their two-stage counterparts. Moreover, with the advent of YOLO (You Only Look Once) and its architectural successors, detection accuracy has improved significantly and sometimes even surpasses that of two-stage detectors. YOLOs are adopted in various applications mainly for their faster inference rather than for their detection accuracy. As an example, the detection accuracies of YOLO and Fast-RCNN are 63.4 and 70 respectively, yet YOLO's inference is around 300 times faster. In this paper, we present a comprehensive review of single-stage object detectors, especially YOLOs: their regression formulation, architectural advancements, and performance statistics. Moreover, we summarize comparative illustrations between two-stage and single-stage object detectors and among the different versions of YOLO, survey applications based on two-stage detectors and on the different versions of YOLO, and outline future research directions. |
---|---|
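The detection-accuracy figures quoted in the abstract (e.g. 63.4 vs. 70 for YOLO vs. Fast-RCNN) are mean average precision scores, which rest on the Intersection-over-Union (IoU) overlap between a predicted and a ground-truth box. As an illustrative sketch (not taken from the paper; the function name and corner-coordinate box convention are our own), IoU can be computed as:

```python
def iou(box_a, box_b):
    """IoU of two axis-aligned boxes given as (x1, y1, x2, y2), x1 < x2, y1 < y2."""
    # Corners of the intersection rectangle.
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    # Clamp at zero: non-overlapping boxes have no intersection area.
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# Two 10x10 boxes overlapping in a 5x5 patch: 25 / (100 + 100 - 25) ≈ 0.143
print(iou((0, 0, 10, 10), (5, 5, 15, 15)))
```

In evaluation protocols such as PASCAL VOC, a prediction counts as a true positive when its IoU with a ground-truth box exceeds a threshold (commonly 0.5); average precision is then computed per class and averaged into mAP.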
Author | Anirudh, G.; Diwan, Tausif; Tembhurne, Jitendra V. |
Author_xml | – sequence: 1; givenname: Tausif; surname: Diwan; fullname: Diwan, Tausif; email: tdiwan@iiitn.ac.in; organization: Department of Computer Science & Engineering, Indian Institute of Information Technology
– sequence: 2; givenname: G.; surname: Anirudh; fullname: Anirudh, G.; organization: Department of Data Science and Analytics, Central University of Rajasthan
– sequence: 3; givenname: Jitendra V.; surname: Tembhurne; fullname: Tembhurne, Jitendra V.; organization: Department of Computer Science & Engineering, Indian Institute of Information Technology |
Copyright | The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature 2022. Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law. Copyright Springer Nature B.V. Mar 2023 |
DOI | 10.1007/s11042-022-13644-y |
Discipline | Engineering; Computer Science; Architecture
EISSN | 1573-7721 |
EndPage | 9275 |
ExternalDocumentID | PMC: PMC9358372; PMID: 35968414; DOI: 10.1007/s11042-022-13644-y
Genre | Journal Article |
ISSN | 1380-7501 |
IsDoiOpenAccess | true |
IsOpenAccess | true |
IsPeerReviewed | true |
IsScholarly | true |
Issue | 6 |
Keywords | YOLO; Deep learning; Computer vision; Convolutional neural networks; Object detection
Language | English |
License | The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature 2022. Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law. This article is made available via the PMC Open Access Subset for unrestricted research re-use and secondary analysis in any form or by any means with acknowledgement of the original source. These permissions are granted for the duration of the World Health Organization (WHO) declaration of COVID-19 as a global pandemic.
OpenAccessLink | https://pubmed.ncbi.nlm.nih.gov/PMC9358372 |
PMID | 35968414 |
PQID | 2778762889 |
PQPubID | 54626 |
PageCount | 33 |
PublicationDate | 2023-03-01 |
PublicationPlace | New York |
PublicationSubtitle | An International Journal |
PublicationTitle | Multimedia tools and applications |
PublicationTitleAbbrev | Multimed Tools Appl |
Publisher | Springer US Springer Nature B.V |
StartPage | 9243 |
SubjectTerms | Accuracy; Architecture; Computer Communication Networks; Computer Science; Computer vision; Data Structures and Information Theory; Detectors; Inference; Machine learning; Multimedia Information Systems; Object recognition; Performance enhancement; Proposals; Sensors; Special Purpose and Application-Based Systems
Title | Object detection using YOLO: challenges, architectural successors, datasets and applications |
URI | https://link.springer.com/article/10.1007/s11042-022-13644-y https://www.ncbi.nlm.nih.gov/pubmed/35968414 https://www.proquest.com/docview/2778762889 https://www.proquest.com/docview/2702485420 https://pubmed.ncbi.nlm.nih.gov/PMC9358372 |
Volume | 82 |
openUrl | ctx_ver=Z39.88-2004&ctx_enc=info%3Aofi%2Fenc%3AUTF-8&rfr_id=info%3Asid%2Fsummon.serialssolutions.com&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=Object+detection+using+YOLO%3A+challenges%2C+architectural+successors%2C+datasets+and+applications&rft.jtitle=Multimedia+tools+and+applications&rft.au=Diwan%2C+Tausif&rft.au=Anirudh%2C+G&rft.au=Tembhurne%2C+Jitendra+V&rft.date=2023-03-01&rft.issn=1380-7501&rft.volume=82&rft.issue=6&rft.spage=9243&rft_id=info:doi/10.1007%2Fs11042-022-13644-y&rft_id=info%3Apmid%2F35968414&rft.externalDocID=35968414 |