Fluid lensing and machine learning for centimeter-resolution airborne assessment of coral reefs in American Samoa
Published in | Remote Sensing of Environment, Vol. 235, p. 111475 |
Main Authors | , |
Format | Journal Article |
Language | English |
Published | New York: Elsevier Inc., 15.12.2019 |
Summary: A novel NASA remote sensing technique, airborne fluid lensing, has enabled cm-resolution multispectral 3D remote sensing of aquatic systems, without adverse refractive distortions from ocean waves. In 2013, a drone-based airborne fluid lensing campaign conducted over the coral reef of Ofu Island, American Samoa, revealed complex 3D morphological, ecological, and bathymetric diversity at the cm scale over a regional area. In this paper, we develop and validate supervised machine learning algorithm products tailored for accurate automated segmentation of coral reefs using airborne fluid lensing multispectral 3D imagery. Results show that airborne fluid lensing can significantly improve the accuracy of coral habitat mapping using remote sensing.
The machine learning algorithm is based on multidimensional naïve-Bayes maximum a posteriori (MAP) estimation. Given a user-selected training subset of 3D multispectral images, comprising ~1% of the total dataset, the algorithm separates living structure from nonliving structure and segments the coral reef into four distinct morphological classes – branching coral, mounding coral, basalt rock, and sand. The user-selected training data and the algorithm's classification results are created and verified, respectively, against sub-cm-resolution ground-truth maps, manually generated from extensive in-situ mapping, underwater gigapixel photogrammetry, and visual inspection of the 3D dataset with subject-matter experts.
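The naïve-Bayes MAP classification described above can be sketched as follows. This is a minimal illustration under assumptions, not the authors' implementation: each pixel is represented by a small feature vector (e.g. multispectral band values plus bathymetric depth), per-class Gaussian likelihoods are assumed for each feature, and class priors are taken from training-set frequencies; the class labels are placeholders.

```python
import numpy as np

# Hypothetical class codes for the four morphological classes in the paper.
CLASSES = {0: "branching coral", 1: "mounding coral", 2: "basalt rock", 3: "sand"}

def fit_nb(X, y):
    """Estimate per-class feature means, variances, and prior frequencies.

    X: (n_pixels, n_features) feature vectors; y: (n_pixels,) integer labels.
    """
    stats = {}
    for c in np.unique(y):
        Xc = X[y == c]
        # Small variance floor avoids division by zero for constant features.
        stats[c] = (Xc.mean(axis=0), Xc.var(axis=0) + 1e-9, len(Xc) / len(X))
    return stats

def predict_map(X, stats):
    """MAP decision: argmax over log prior + summed Gaussian log-likelihoods."""
    labels, scores = [], []
    for c, (mu, var, prior) in stats.items():
        ll = -0.5 * np.sum(np.log(2 * np.pi * var) + (X - mu) ** 2 / var, axis=1)
        labels.append(c)
        scores.append(np.log(prior) + ll)
    return np.asarray(labels)[np.argmax(np.stack(scores, axis=1), axis=1)]
```

The naïve-Bayes assumption treats features as conditionally independent given the class, which keeps the per-class model to one mean and variance per feature and makes training on a ~1% subset tractable.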
The algorithm generates cm-resolution 3D data products, such as living-structure and morphology distributions, for the Ofu Island coral reef ecosystem with 95% and 92% accuracy, respectively. By comparison, classification of m-resolution remote sensing imagery, representative of the effective spatial resolution of commonly used airborne and spaceborne aquatic remote sensing instruments subject to ocean wave distortion, typically produces data products with 68% accuracy. These results suggest existing methodologies may not resolve coral reef ecosystems in sufficient detail for accurate determination of percent cover of living structure and morphology breakdown.
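The resolution effect described above can be illustrated with a toy experiment: coarsening a cm-scale label map by block-majority vote (a crude stand-in for the effective m-scale pixels of wave-distorted imagery) erases coral patches smaller than a coarse pixel and biases percent-cover estimates. The synthetic map, block size, and class codes below are assumptions for demonstration only, not data from the study.

```python
import numpy as np

def block_majority(labels, k):
    """Coarsen a 2D label map by k x k majority vote per block."""
    h, w = labels.shape
    out = np.empty((h // k, w // k), dtype=labels.dtype)
    for i in range(h // k):
        for j in range(w // k):
            block = labels[i * k:(i + 1) * k, j * k:(j + 1) * k].ravel()
            vals, counts = np.unique(block, return_counts=True)
            out[i, j] = vals[np.argmax(counts)]  # most frequent class wins
    return out

def percent_cover(labels, cls):
    """Fraction of pixels assigned to class `cls`, as a percentage."""
    return (labels == cls).mean() * 100.0

# Synthetic cm-scale map: class 1 (coral) in small 3x3 patches on a sand
# background (class 0), one patch per 10x10 region -> 9% true cover.
fine = np.zeros((100, 100), dtype=int)
for i in range(0, 100, 10):
    for j in range(0, 100, 10):
        fine[i:i + 3, j:j + 3] = 1

coarse = block_majority(fine, 10)  # simulate m-scale pixels
```

Because every coral patch is a minority within its coarse pixel, the majority vote reports 0% cover at m-scale versus 9% at cm-scale, illustrating how sub-pixel structure is lost when effective resolution degrades.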
The methods presented here offer a new remote sensing approach that enables repeatable, quantitative ecosystem assessment of aquatic systems, independent of ocean wave distortion and sea state. Aquatic remote sensing imagery free from refractive distortion appears necessary for accurate, quantitative health assessment of coral reef ecosystems at the cm scale over regional areas. Accurate, automated determination of percent cover and morphology distribution at cm resolution may lead to a significantly improved understanding of reef ecosystem dynamics and responses in a rapidly changing global climate.
Highlights:
• Airborne fluid lensing creates cm-scale 3D images of coral reefs in American Samoa.
• A machine learning algorithm is developed to classify coral using these 3D images.
• Using the algorithm on cm-scale 3D data, coral cover is measured with 95% accuracy.
• The algorithm is validated with cm-resolution maps from in situ reference data.
Bibliography | ObjectType-Article-1; SourceType-Scholarly Journals-1; ObjectType-Feature-2 |
ISSN | 0034-4257; 1879-0704 |
DOI | 10.1016/j.rse.2019.111475 |