Effects of Vegetation Cover on the Microwave Radiometric Sensitivity to Soil Moisture
Published in: IEEE Transactions on Geoscience and Remote Sensing, Vol. GE-21, No. 1, pp. 51-61
Main Authors: , ,
Format: Journal Article
Language: English
Published: IEEE, 01.01.1983
Summary: The reduction in sensitivity of the microwave brightness temperature to soil moisture content due to vegetation cover is analyzed using airborne observations made at 1.4 and 5 GHz. The data were acquired during six flights in 1978 over a test site near Colby, Kansas. The test site consisted of bare soil, wheat stubble, and fully mature corn fields. The results for corn indicate that the radiometric sensitivity to soil moisture, S, decreases in magnitude with increasing frequency and with increasing angle of incidence (relative to nadir). The sensitivity reduction factor, defined in terms of the radiometric sensitivities for bare-soil and canopy-covered conditions as Y = 1 - S_can / S_s, was found to be 0.65 for normal incidence at 1.4 GHz, increasing to 0.89 at 5 GHz. These results confirm previous conclusions that the presence of vegetation cover may pose a serious problem for soil moisture detection with passive microwave sensors.
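The reduction factor in the summary can be made concrete with a short sketch. The two reported Y values (0.65 at 1.4 GHz, 0.89 at 5 GHz) come from the abstract; the bare-soil sensitivity used below is a hypothetical placeholder, not a value from the paper:

```python
def reduction_factor(s_canopy: float, s_bare: float) -> float:
    """Sensitivity reduction factor Y = 1 - S_can / S_s."""
    return 1.0 - s_canopy / s_bare

def canopy_sensitivity(y: float, s_bare: float) -> float:
    """Invert Y to recover the canopy-covered sensitivity: S_can = (1 - Y) * S_s."""
    return (1.0 - y) * s_bare

# Hypothetical bare-soil sensitivity in K per percent soil moisture
# (negative: brightness temperature drops as soil gets wetter); illustrative only.
S_BARE = -2.0

# Reported reduction factors for mature corn at normal incidence (from the abstract).
for freq_ghz, y in [(1.4, 0.65), (5.0, 0.89)]:
    s_can = canopy_sensitivity(y, S_BARE)
    print(f"{freq_ghz} GHz: Y = {y:.2f}, implied S_can = {s_can:.2f} K/%")
```

At Y = 0.89, only 11 percent of the bare-soil sensitivity survives under the corn canopy, which is why the authors flag vegetation cover as a serious obstacle at the higher frequency.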
ISSN: 0196-2892, 1558-0644
DOI: 10.1109/TGRS.1983.350530