Fusing optical and SAR time series for LAI gap filling with multioutput Gaussian processes
Published in: Remote Sensing of Environment, Vol. 235
Main Authors:
Format: Journal Article
Language: English
Published: United States, 15.12.2019
Summary: The availability of satellite optical information is often hampered by the natural presence of clouds, which can be problematic for many applications. Persistent clouds over agricultural fields can mask key stages of crop growth, leading to unreliable yield predictions. Synthetic Aperture Radar (SAR) provides all-weather imagery that can potentially overcome this limitation, but given its high and distinct sensitivity to different surface properties, the fusion of SAR and optical data remains an open challenge. In this work, we propose the use of Multi-Output Gaussian Process (MOGP) regression, a machine learning technique that automatically learns the statistical relationships among multisensor time series, to detect vegetated areas over which the synergy between SAR and optical imagery is profitable. For this purpose, we use Sentinel-1 Radar Vegetation Index (RVI) and Sentinel-2 Leaf Area Index (LAI) time series over a study area in the northwest of the Iberian Peninsula. Through a physical interpretation of the trained MOGP models, we show their ability to provide LAI estimates even over cloudy periods using the information shared with RVI, which guarantees that the solution always remains tied to real measurements. Results demonstrate the advantage of MOGP especially for long data gaps, where optical-based methods notoriously fail. The leave-one-image-out assessment applied to the whole vegetation cover shows that MOGP predictions improve on standard GP estimates over short time gaps (R² of 74% vs 68%, RMSE of 0.4 vs 0.44 [m² m⁻²]) and especially over long time gaps (R² of 33% vs 12%, RMSE of 0.5 vs 1.09 [m² m⁻²]). A second assessment focuses on crop-specific regions, clustering pixels that fulfil specific model conditions where the synergy is profitable. Results reveal that MOGP performance is crop-type and crop-stage dependent. For long time gaps, the best R² values are obtained over maize, ranging from 0.1 (tillering) to 0.36 (development) up to 0.81 (maturity); for moderate time gaps, R² = 0.93 (maturity) is obtained. Crops such as wheat, oats, rye and barley can profit from the LAI-RVI synergy, with R² varying between 0.4 and 0.6. For beet or potatoes, MOGP provides poorer results, but alternative descriptors to RVI should be tested for these specific crops before discarding the real benefits of the synergy. In conclusion, active-passive sensor fusion with MOGP represents a novel and promising approach to crop monitoring over cloud-dominated areas.
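
The MOGP gap-filling idea summarized above can be illustrated with a short sketch. The fragment below is a minimal, hypothetical example written with the open-source GPy library and a Linear Model of Coregionalization kernel; the record does not state which implementation the authors used, and the synthetic RVI/LAI series, kernel choices and variable names are assumptions standing in for real per-pixel Sentinel-1 and Sentinel-2 time series.

```python
import numpy as np
import GPy  # pip install GPy

np.random.seed(0)

# Synthetic stand-ins for one pixel's time series (assumption: real inputs are
# Sentinel-1 RVI at a dense, weather-independent revisit and Sentinel-2 LAI
# available only on cloud-free dates).
t_rvi = np.linspace(0.0, 365.0, 60)[:, None]                                    # days of year
t_lai = np.sort(np.random.choice(365, 25, replace=False)).astype(float)[:, None]
y_rvi = 0.4 + 0.2 * np.sin(2 * np.pi * t_rvi / 365) + 0.02 * np.random.randn(*t_rvi.shape)
y_lai = 3.0 + 2.5 * np.sin(2 * np.pi * t_lai / 365) + 0.10 * np.random.randn(*t_lai.shape)

# Linear Model of Coregionalization: latent temporal kernels shared by the two
# outputs and mixed through a learned coregionalization matrix.
lcm = GPy.util.multioutput.LCM(
    input_dim=1, num_outputs=2,
    kernels_list=[GPy.kern.Matern32(1), GPy.kern.RBF(1)])

# Output 0 = RVI, output 1 = LAI (the ordering is a choice made in this sketch).
model = GPy.models.GPCoregionalizedRegression(
    X_list=[t_rvi, t_lai], Y_list=[y_rvi, y_lai], kernel=lcm)
model.optimize(messages=False)

# Gap filling: predict LAI on a daily grid, including dates with no optical data.
t_new = np.arange(0.0, 366.0)[:, None]
X_pred = np.hstack([t_new, np.ones_like(t_new)])   # 2nd column = output index (1 -> LAI)
mu, var = model.predict(X_pred, Y_metadata={'output_index': X_pred[:, 1:].astype(int)})
```

In this kind of model, the learned coregionalization matrix encodes how strongly the two outputs are correlated: where the cross-output coupling is weak, the LAI prediction falls back on the optical samples alone, which is broadly in line with the abstract's use of a physical interpretation of the trained models to identify pixels where the LAI-RVI synergy is profitable.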
ISSN: 0034-4257, 1879-0704
DOI: 10.1016/j.rse.2019.111452