Convolutional Neural Networks accurately predict cover fractions of plant species and communities in Unmanned Aerial Vehicle imagery
| Published in | *Remote Sensing in Ecology and Conservation*, Vol. 6, No. 4, pp. 472–486 |
|---|---|
| Main Authors | |
| Format | Journal Article |
| Language | English |
| Published | Oxford: John Wiley & Sons, Inc, 01.12.2020 |
Summary: Unmanned Aerial Vehicles (UAVs) have greatly extended our possibilities to acquire high-resolution remote sensing data for assessing the spatial distribution of species composition and vegetation characteristics. Yet, current pixel- or texture-based mapping approaches do not fully exploit the information content provided by the high spatial resolution. Here, to fully harness this spatial detail, we apply deep learning techniques, that is, Convolutional Neural Networks (CNNs), on regular tiles of UAV orthoimagery (here 2–5 m) to identify the cover of target plant species and plant communities. The approach was tested with UAV-based orthomosaics and photogrammetric 3D information in three case studies: (1) mapping tree species cover in primary forests, (2) mapping plant invasions by woody species into forests and open land and (3) mapping vegetation succession in a glacier foreland. All three case studies resulted in high predictive accuracies. The accuracy increased with increasing tile size (2–5 m), reflecting the increased spatial context captured by a tile. The inclusion of 3D information derived from the photogrammetric workflow did not significantly improve the models. We conclude that CNNs are powerful in harnessing high-resolution data acquired from UAVs to map vegetation patterns. The study was based on low-cost red, green, blue (RGB) sensors, making the method accessible to a wide range of users. Combining UAVs and CNNs will provide tremendous opportunities for ecological applications.
Unmanned Aerial Vehicles (UAVs) have greatly expanded our possibilities to acquire remote sensing data for vegetation mapping. However, efficient tools are required to harness the high-resolution data they acquire. Using three case studies, we demonstrate that Deep Learning (Convolutional Neural Networks) will pave new avenues for UAV-based vegetation mapping. This potential was tested using low-cost UAVs with standard RGB sensors, and accordingly the technology will be accessible to a wide range of users.
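The abstract describes predicting per-class cover fractions from regular RGB orthoimagery tiles with a CNN. The record does not include implementation details, so the following is a minimal sketch under stated assumptions: a small PyTorch CNN with a softmax head mapping an RGB tile to cover fractions that sum to one. The architecture, tile size in pixels, three-class setup and loss are illustrative choices, not taken from the paper.

```python
# Minimal sketch (not the authors' implementation): a small CNN that maps an
# RGB orthomosaic tile to per-class cover fractions. Layer widths, tile size
# and class count are illustrative assumptions.
import torch
import torch.nn as nn

class CoverFractionCNN(nn.Module):
    def __init__(self, n_classes: int = 3, in_channels: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(64, 128, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),           # global pooling over the tile
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(128, n_classes),
            nn.Softmax(dim=1),                 # fractions over classes sum to 1
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x))

# Usage example: a batch of tiles resampled to 128 x 128 px (hypothetical size)
model = CoverFractionCNN(n_classes=3)
tiles = torch.rand(8, 3, 128, 128)             # RGB tiles scaled to [0, 1]
pred_fractions = model(tiles)                  # shape (8, 3), rows sum to 1
target_fractions = torch.full((8, 3), 1 / 3)   # placeholder reference fractions
loss = nn.functional.mse_loss(pred_fractions, target_fractions)
```

Training such a model would require reference cover fractions per tile (e.g. from delineated polygons); those labels and the loss choice are assumptions made here for illustration only.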
ISSN: 2056-3485
DOI: 10.1002/rse2.146