Ultrahigh-resolution boreal forest canopy mapping: Combining UAV imagery and photogrammetric point clouds in a deep-learning-based approach
Highlights |
•Self-supervised deep learning for forest canopy mapping using UAVs is proposed.
•2D images and 3D SfM point clouds are combined for automated training set generation.
•The method is evaluated by annotated images and classical segmentation algorithms.
•The method is compared with UAV LiDAR and DCP benchmarking methods.
•The method shows better consistency at varying image overlaps compared with UAV SfM.
Published in | International Journal of Applied Earth Observation and Geoinformation, Vol. 107, Article 102686 |
Main Authors | Linyuan Li, Xihan Mu, Francesco Chianucci, Jianbo Qi, Jingyi Jiang, Jiaxin Zhou, Ling Chen, Huaguo Huang, Guangjian Yan, Shouyang Liu |
Format | Journal Article |
Language | English |
Published | Elsevier B.V., 01.03.2022 |
Abstract |
Accurate wall-to-wall estimation of forest crown cover is critical for a wide range of ecological studies. Notwithstanding the increasing use of UAVs in forest canopy mapping, ultrahigh-resolution UAV imagery requires an appropriate procedure to separate the contribution of understorey from overstorey vegetation, which is complicated by the spectral similarity between the two forest components and by the illumination environment. In this study, we investigated the integration of deep learning with combined imagery and photogrammetric point clouds for boreal forest canopy mapping. The procedure automatically creates training sets of tree crown (overstorey) and background (understorey) data by combining UAV images with their associated photogrammetric point clouds, and it expands the applicability of deep learning models through self-supervision.
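The automated labelling step described above lends itself to a compact illustration: points from the photogrammetric cloud are normalized to height above ground, rasterized onto the co-registered image grid, and thresholded into crown and background samples. The sketch below is a minimal Python rendering of that idea; the single 2 m height threshold, the simple north-up grid geometry, the scalar-or-per-point ground elevation, and the function name are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def pseudo_labels_from_points(points_xyz, ground_z, img_shape, px_size, origin_xy,
                              crown_height=2.0):
    """Rasterize SfM point heights above ground onto the image grid and threshold
    them into crown (1) / background (0) / unlabeled (255) pixels.

    points_xyz : (N, 3) array of point coordinates from the photogrammetric cloud.
    ground_z   : ground elevation, scalar or per-point array (normalizes heights).
    img_shape  : (rows, cols) of the co-registered image.
    px_size    : ground sampling distance of one pixel, in the same units as x/y.
    origin_xy  : map coordinates (x_min, y_max) of the top-left image corner.
    """
    h, w = img_shape
    # Height above ground (normalized height) for every photogrammetric point.
    agl = points_xyz[:, 2] - ground_z
    # Map point x/y coordinates to pixel row/column indices of the image grid.
    cols = ((points_xyz[:, 0] - origin_xy[0]) / px_size).astype(int)
    rows = ((origin_xy[1] - points_xyz[:, 1]) / px_size).astype(int)
    keep = (rows >= 0) & (rows < h) & (cols >= 0) & (cols < w)
    rows, cols, agl = rows[keep], cols[keep], agl[keep]

    # Keep the highest point per pixel, i.e. the local canopy surface height.
    top = np.full((h, w), -np.inf)
    np.maximum.at(top, (rows, cols), agl)

    labels = np.full((h, w), 255, dtype=np.uint8)        # 255 = no point, left unlabeled
    labels[top >= crown_height] = 1                      # overstorey tree crown
    labels[(top > -np.inf) & (top < crown_height)] = 0   # understorey / background
    return labels
```

In a self-supervised setup of this kind, the unlabeled pixels would simply be masked out of the training loss, so the network learns only from confidently labelled crown and background samples.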
Based on UAV images acquired at different overlap levels over 12 conifer forest plots, categorized into complexity levels “I”, “II” and “III” according to the illumination environment, we compared the self-supervised deep-learning-predicted canopy maps from the original images with manual delineation data and found an average intersection over union (IoU) larger than 0.9 for “complexity I” and “complexity II” plots and larger than 0.75 for “complexity III” plots. The proposed method was then compared with three classical image segmentation methods (maximum likelihood, K-means, and Otsu) for plot-level crown cover estimation and outperformed them in overstorey canopy extraction. The proposed method was also validated against wall-to-wall and pointwise crown cover estimates from UAV LiDAR and in situ digital cover photography (DCP) benchmarking methods: the model-predicted crown cover was in line with the UAV LiDAR method (RMSE of 0.06) and deviated from the DCP method (RMSE of 0.18). We subsequently compared the new method with the commonly used UAV structure-from-motion (SfM) method at varying forward and lateral overlaps over all plots and over a rugged terrain region; the method-predicted crown cover was relatively insensitive to varying overlap (largest bias of less than 0.15), whereas the UAV SfM-estimated crown cover was strongly affected by overlap and decreased as overlap decreased. In addition, canopy mapping over rugged terrain verified the merits of the new method, which needs no detailed digital terrain model (DTM). The new method is recommended for use across various image overlaps, illumination conditions, and terrains owing to its robustness and high accuracy. This study offers opportunities to promote forest ecological applications (e.g., leaf area index estimation) and sustainable management (e.g., deforestation monitoring). |
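For context, the two accuracy measures used in the evaluation above reduce to simple pixel counts on binary canopy masks. The sketch below shows plot-level crown cover and intersection over union (IoU) against a manually delineated reference; the function names are illustrative and the handling of empty masks is an assumption, not taken from the paper.

```python
import numpy as np

def crown_cover(canopy_mask: np.ndarray) -> float:
    """Plot-level crown cover: fraction of pixels classified as overstorey crown."""
    return float(np.count_nonzero(canopy_mask)) / canopy_mask.size

def intersection_over_union(pred_mask: np.ndarray, ref_mask: np.ndarray) -> float:
    """IoU between a predicted canopy map and a manual delineation (both binary)."""
    pred, ref = pred_mask.astype(bool), ref_mask.astype(bool)
    union = np.logical_or(pred, ref).sum()
    if union == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return float(np.logical_and(pred, ref).sum()) / union
```

Plot-level agreement statistics such as the reported RMSE values would then be computed over the per-plot crown cover estimates of the two methods being compared.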
ArticleNumber | 102686 |
Author Affiliations |
– Linyuan Li (lilinyuan@bjfu.edu.cn), Jianbo Qi, Jingyi Jiang, Jiaxin Zhou (ORCID 0000-0002-8359-5390), Ling Chen, Huaguo Huang: Research Center of Forest Management Engineering of State Forestry and Grassland Administration, Beijing Forestry University, 100083 Beijing, China
– Xihan Mu, Guangjian Yan: State Key Laboratory of Remote Sensing Science, Faculty of Geographical Science, Beijing Normal University, 100875 Beijing, China
– Francesco Chianucci: CREA-Research Centre for Forestry and Wood, viale S. Margherita 80, 52100 Arezzo, Italy
– Shouyang Liu: Plant Phenomics Research Center, Nanjing Agricultural University, 210095 Nanjing, China
Copyright | 2022 The Author(s) |
DOI | 10.1016/j.jag.2022.102686 |
EISSN | 1872-826X |
ISSN | 1569-8432 |
Keywords | UAV imagery; CNN; nDSM; SNFP; AGL; LiDAR; DL; CHM; Self-supervised deep learning; MVS; UAV; Image overlap; FOV; Canopy mapping; IoU; DCP; OA; SfM point cloud; Crown cover; GSD; SfM; DTM; DSM |
License | This is an open access article under the CC BY-NC-ND license. |
ORCID | 0000-0002-8359-5390 |
OpenAccessLink | https://www.sciencedirect.com/science/article/pii/S0303243422000125 |
Subject Terms | boreal forests; Canopy mapping; coniferous forests; Crown cover; deforestation; forest canopy; image analysis; Image overlap; landscapes; leaf area index; lidar; lighting; overstory; photogrammetry; Self-supervised deep learning; SfM point cloud; spatial data; statistical analysis; tree crown; UAV imagery; understory |