Automatic detection of oil palm fruits from UAV images using an improved YOLO model
Published in | The Visual Computer, Vol. 38, No. 7, pp. 2341-2355 |
Main Authors | Junos, Mohamad Haniff; Mohd Khairuddin, Anis Salwa; Thannirmalai, Subbiah; Dahari, Mahidzal |
Format | Journal Article |
Language | English |
Published | Berlin/Heidelberg: Springer Berlin Heidelberg; Springer Nature B.V., 01.07.2022 |
ISSN | 0178-2789 (print); 1432-2315 (electronic) |
DOI | 10.1007/s00371-021-02116-3 |
Abstract | Manual harvesting of loose fruits in oil palm plantations is both time-consuming and physically laborious. An automatic harvesting system is an alternative solution for precision agriculture, but it requires accurate visual information about its targets. Current state-of-the-art one-stage object detection methods provide excellent detection accuracy; however, they are computationally intensive and impractical for embedded systems. This paper proposes an improved YOLO model to detect oil palm loose fruits in unmanned aerial vehicle (UAV) images. To improve the robustness of the detection system, the images are augmented with brightness, rotation, and blurring transformations to simulate the actual natural environment. The proposed improved YOLO model adopts several enhancements: a densely connected neural network for better feature reuse, the swish activation function, multi-layer detection to improve performance on small targets, and prior box optimization to obtain accurate bounding box information. The experimental results show that the proposed model achieves an outstanding average precision of 99.76% with a detection time of 34.06 ms. In addition, the proposed model has a small weight size and requires less training time, which is significant in reducing hardware costs. The results demonstrate the superiority of the proposed improved YOLO model over several existing state-of-the-art detection models. |
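One of the improvements named in the abstract is the swish activation function, which has a simple closed form. The sketch below is an illustrative reconstruction, not code from the paper; the tunable `beta` parameter is an assumption on our part, since the abstract does not state which variant of swish the authors used.

```python
import math

def swish(x: float, beta: float = 1.0) -> float:
    """Swish activation: x * sigmoid(beta * x).

    Smooth and non-monotonic, unlike ReLU; with beta = 1 this is the
    SiLU form commonly used in YOLO-style detectors.
    """
    # x * sigmoid(beta * x) rewritten as x / (1 + e^(-beta * x))
    return x / (1.0 + math.exp(-beta * x))
```

For large positive x, swish(x) approaches x; for large negative x it approaches 0, with a small negative dip near the origin.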
Author | Junos, Mohamad Haniff Thannirmalai, Subbiah Dahari, Mahidzal Mohd Khairuddin, Anis Salwa |
Authors and Affiliations | Mohamad Haniff Junos (Department of Electrical Engineering, Faculty of Engineering, Universiti Malaya); Anis Salwa Mohd Khairuddin (Department of Electrical Engineering, Faculty of Engineering, Universiti Malaya; anissalwa@um.edu.my); Subbiah Thannirmalai (Advanced Technologies and Robotics, Sime Darby Technology Centre Sdn Bhd); Mahidzal Dahari (Department of Electrical Engineering, Faculty of Engineering, Universiti Malaya) |
Copyright | The Author(s), under exclusive licence to Springer-Verlag GmbH Germany, part of Springer Nature 2021 |
Discipline | Engineering; Computer Science |
GeographicLocations | Malaysia |
Grant Information | Faculty grant GPF042A-2019 |
IsPeerReviewed | true |
IsScholarly | true |
Keywords | Deep learning; Precision agriculture; Machine vision; Improved YOLO; Object detection |
PageCount | 15 |
PublicationSubtitle | International Journal of Computer Graphics |
PublicationTitleAbbrev | Vis Comput |
doi: 10.1016/j.compag.2019.01.012 – ident: 2116_CR43 doi: 10.1109/CVPR.2017.243 – volume: 14 start-page: 561 year: 2020 ident: 2116_CR8 publication-title: IET Image Process. doi: 10.1049/iet-ipr.2018.6524 – ident: 2116_CR21 doi: 10.1109/CVPR42600.2020.01079 – volume: 8 start-page: 842 year: 2017 ident: 2116_CR32 publication-title: Adv. Anim. Precis. Agric. – volume: 87 start-page: 24 year: 2019 ident: 2116_CR30 publication-title: Image Vis. Comput. doi: 10.1016/j.imavis.2019.04.003 – ident: 2116_CR49 doi: 10.1109/ICCVW.2019.00011 – volume: 3 start-page: 4 year: 2012 ident: 2116_CR4 publication-title: Int. J. Comput. Vis. Robot. doi: 10.1504/IJCVR.2012.046419 – volume: 16 start-page: 122 issue: 8 year: 2016 ident: 2116_CR34 publication-title: Sensors doi: 10.3390/s16081222 – volume: 14 start-page: 12191 issue: 7 year: 2014 ident: 2116_CR5 publication-title: Sensors doi: 10.3390/s140712191 – volume: 127 start-page: 311 year: 2016 ident: 2116_CR3 publication-title: Comput. Electron. Agric. doi: 10.1016/j.compag.2016.06.022 – ident: 2116_CR47 – ident: 2116_CR2 – volume: 9 start-page: 3781 year: 2019 ident: 2116_CR22 publication-title: Appl. Sci. doi: 10.3390/app9183781 – ident: 2116_CR16 doi: 10.1007/978-3-319-46448-0_2 – volume: 99 start-page: 17 year: 2018 ident: 2116_CR33 publication-title: Comput. Ind. doi: 10.1016/j.compind.2018.03.010 – volume: 162 start-page: 689 year: 2019 ident: 2116_CR37 publication-title: Comput. Electron. Agric. doi: 10.1016/j.compag.2019.05.016 – ident: 2116_CR12 doi: 10.1109/CVPR.2014.81 – ident: 2116_CR44 – volume: 37 start-page: 1 year: 2020 ident: 2116_CR27 publication-title: Vis. Comput. – volume: 7 start-page: 128837 year: 2019 ident: 2116_CR11 publication-title: IEEE Access. doi: 10.1109/ACCESS.2019.2939201 – ident: 2116_CR13 doi: 10.1109/ICCV.2015.169 – volume: 35 start-page: 349 year: 2019 ident: 2116_CR26 publication-title: Vis. Comput. 
doi: 10.1007/s00371-018-01617-y – ident: 2116_CR15 doi: 10.1109/ICCV.2017.322 – volume: 16 start-page: 1915 issue: 11 year: 2016 ident: 2116_CR35 publication-title: Sensors doi: 10.3390/s16111915 – volume: 2019 start-page: 1 year: 2019 ident: 2116_CR41 publication-title: J. Sensors. |
StartPage | 2341 |
SubjectTerms | Accuracy; Artificial Intelligence; Blurring; Cameras; Classification; Computer Graphics; Computer Science; Datasets; Deep learning; Embedded systems; Fruits; Harvesting; Image Processing and Computer Vision; Methods; Multilayers; Neural networks; Object recognition; Original Article; Target detection; Unmanned aerial vehicles |
Title | Automatic detection of oil palm fruits from UAV images using an improved YOLO model |
URI | https://link.springer.com/article/10.1007/s00371-021-02116-3 https://www.proquest.com/docview/2917984737 |
Volume | 38 |
openUrl | Junos, Mohamad Haniff; Mohd Khairuddin, Anis Salwa; Thannirmalai, Subbiah; Dahari, Mahidzal: Automatic detection of oil palm fruits from UAV images using an improved YOLO model. The Visual Computer 38(7), 2341–2355 (2022). doi:10.1007/s00371-021-02116-3 |