UAV-YOLOv8: A Small-Object-Detection Model Based on Improved YOLOv8 for UAV Aerial Photography Scenarios

Bibliographic Details
Published in Sensors (Basel, Switzerland), Vol. 23, no. 16, p. 7190
Main Authors Wang, Gang; Chen, Yanfei; An, Pei; Hong, Hanyu; Hu, Jinghu; Huang, Tiange
Format Journal Article
Language English
Published Basel: MDPI AG, 01.08.2023
Subjects
ISSN 1424-8220
DOI 10.3390/s23167190

Abstract Unmanned aerial vehicle (UAV) object detection plays a crucial role in civil, commercial, and military domains. However, the high proportion of small objects in UAV images and the limited platform resources lead to low accuracy in most existing detection models embedded in UAVs, and it is difficult to strike a good balance between detection performance and resource consumption. To alleviate these problems, we optimize YOLOv8 and propose an object detection model for UAV aerial photography scenarios, called UAV-YOLOv8. Firstly, Wise-IoU (WIoU) v3 is used as the bounding box regression loss; its wise gradient allocation strategy makes the model focus more on common-quality samples, thus improving the model’s localization ability. Secondly, an attention mechanism called BiFormer is introduced to optimize the backbone network, which improves the model’s attention to critical information. Finally, we design a feature processing module named Focal FasterNet block (FFNB) and, based on this module, propose two new detection scales that fully integrate shallow and deep features. The proposed multiscale feature fusion network substantially increases the detection performance of the model and reduces the missed detection rate of small objects. The experimental results show that our model has fewer parameters than the baseline model and a mean detection accuracy 7.7% higher than the baseline. Compared with other mainstream models, the overall performance of our model is much better. The proposed method effectively improves the ability to detect small objects. There remains room to improve detection of small, feature-poor objects (such as bicycle-type vehicles), which we will address in subsequent research.
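The abstract names Wise-IoU (WIoU) v3 as the bounding box regression loss. WIoU v3 extends a distance-weighted IoU loss with a dynamic gradient gain computed from batch-level outlier statistics; the core distance-weighted term (WIoU v1) can be sketched for a single box pair as below. This is a minimal illustrative sketch, not the authors' implementation — function names and the plain-tuple box format are assumptions.

```python
import math

def iou(box_a, box_b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    x1 = max(box_a[0], box_b[0]); y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2]); y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

def wiou_v1_loss(pred, target):
    """WIoU v1 term: IoU loss scaled by a centre-distance penalty.

    R_WIoU = exp(((xp - xt)^2 + (yp - yt)^2) / (Wg^2 + Hg^2)),
    where (Wg, Hg) is the size of the smallest box enclosing both
    boxes (treated as a constant, i.e. detached from the gradient
    in a real training setup).
    """
    l_iou = 1.0 - iou(pred, target)
    # centre points of predicted and target boxes
    cxp, cyp = (pred[0] + pred[2]) / 2, (pred[1] + pred[3]) / 2
    cxt, cyt = (target[0] + target[2]) / 2, (target[1] + target[3]) / 2
    # smallest enclosing box of the pair
    wg = max(pred[2], target[2]) - min(pred[0], target[0])
    hg = max(pred[3], target[3]) - min(pred[1], target[1])
    r_wiou = math.exp(((cxp - cxt) ** 2 + (cyp - cyt) ** 2) / (wg ** 2 + hg ** 2))
    return r_wiou * l_iou  # perfectly overlapping boxes give 0
```

On top of this term, WIoU v3 multiplies in a non-monotonic focusing coefficient derived from each sample's "outlier degree" (its IoU loss relative to the running batch mean), which is what lets the model down-weight both very easy and very hard samples and focus gradient on common-quality ones, as the abstract describes.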
Audience Academic
Author Huang, Tiange
Hong, Hanyu
Wang, Gang
Chen, Yanfei
Hu, Jinghu
An, Pei
AuthorAffiliation Hubei Key Laboratory of Optical Information and Pattern Recognition, School of Electrical and Information Engineering, Wuhan Institute of Technology, Wuhan 430205, China; wanggang@stu.wit.edu.cn (G.W.); anpei@wit.edu.cn (P.A.); hhyhong@wit.edu.cn (H.H.); jinhuhu@stu.wit.edu.cn (J.H.); huangtg@wit.edu.cn (T.H.)
ContentType Journal Article
Copyright COPYRIGHT 2023 MDPI AG
2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
2023 by the authors. 2023
Discipline Engineering
EISSN 1424-8220
ExternalDocumentID oai_doaj_org_article_3248c72f9f6b4ec5a769a173bf6e4f0a
PMC10458807
A762548044
10_3390_s23167190
GrantInformation Graduate Innovative Fund of Wuhan Institute of Technology (grant CX2022148)
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Issue 16
Language English
License https://creativecommons.org/licenses/by/4.0
Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
PMID 37631727
PQID 2857446868
PQPubID 2032333
PublicationDate 2023-08-01
PublicationPlace Basel
PublicationTitle Sensors (Basel, Switzerland)
PublicationYear 2023
Publisher MDPI AG
StartPage 7190
SubjectTerms Accuracy
Aerial photography
Algorithms
BiFormer
Computational linguistics
Design
Drone aircraft
FasterNet
Language processing
Natural language interfaces
Photography
Research methodology
small-object detection
UAVs
WIoU
YOLOv8
Title UAV-YOLOv8: A Small-Object-Detection Model Based on Improved YOLOv8 for UAV Aerial Photography Scenarios
URI https://www.proquest.com/docview/2857446868
https://www.proquest.com/docview/2857843490
https://pubmed.ncbi.nlm.nih.gov/PMC10458807
https://doaj.org/article/3248c72f9f6b4ec5a769a173bf6e4f0a
Volume 23
hasFullText 1
inHoldings 1
isFullTextHit
isPrint
linkProvider ProQuest
openUrl ctx_ver=Z39.88-2004&ctx_enc=info%3Aofi%2Fenc%3AUTF-8&rfr_id=info%3Asid%2Fsummon.serialssolutions.com&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=UAV-YOLOv8%3A+A+Small-Object-Detection+Model+Based+on+Improved+YOLOv8+for+UAV+Aerial+Photography+Scenarios&rft.jtitle=Sensors+%28Basel%2C+Switzerland%29&rft.au=Wang%2C+Gang&rft.au=Chen%2C+Yanfei&rft.au=An%2C+Pei&rft.au=Hanyu+Hong&rft.date=2023-08-01&rft.pub=MDPI+AG&rft.eissn=1424-8220&rft.volume=23&rft.issue=16&rft.spage=7190&rft_id=info:doi/10.3390%2Fs23167190&rft.externalDBID=HAS_PDF_LINK
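The openUrl field above is an OpenURL 1.0 (ANSI/NISO Z39.88-2004) context object in key-encoded-value (KEV) form. As an illustration only (not part of the record), such a string can be decoded with Python's standard `urllib.parse`; the query-string subset below is copied from the record's own openUrl value:

```python
from urllib.parse import parse_qs

# A subset of the record's openUrl KEV context object (Z39.88-2004).
openurl = (
    "ctx_ver=Z39.88-2004&ctx_enc=info%3Aofi%2Fenc%3AUTF-8"
    "&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal"
    "&rft.genre=article"
    "&rft.atitle=UAV-YOLOv8%3A+A+Small-Object-Detection+Model+Based+on"
    "+Improved+YOLOv8+for+UAV+Aerial+Photography+Scenarios"
    "&rft.jtitle=Sensors+%28Basel%2C+Switzerland%29"
    "&rft.au=Wang%2C+Gang&rft.date=2023-08-01&rft.volume=23&rft.issue=16"
    "&rft.spage=7190&rft_id=info:doi/10.3390%2Fs23167190"
)

# parse_qs decodes percent-escapes and '+' (space), returning
# key -> list of values; keys such as rft.au may repeat in full records.
parsed = parse_qs(openurl)
fields = {k: v[0] for k, v in parsed.items() if len(v) == 1}

print(fields["rft.jtitle"])  # journal title
print(fields["rft_id"])      # DOI in info-URI form
```

The `rft_id` value (`info:doi/10.3390/s23167190`) carries the article's DOI, which resolves via the standard proxy at https://doi.org/10.3390/s23167190.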