
Fluid Lensing and Machine Learning for Automated Centimeter-Resolution Airborne Assessment of Coral Reefs in American Samoa without Ocean Wave Distortion

Chirayath, V., and Instrella (2019), Fluid Lensing and Machine Learning for Automated Centimeter-Resolution Airborne Assessment of Coral Reefs in American Samoa without Ocean Wave Distortion, Remote Sensing of Environment, 235, 111475, doi:10.1016/j.rse.2019.111475.
Abstract: 

A novel NASA remote sensing technique, airborne fluid lensing, has enabled cm-resolution multispectral 3D remote sensing of aquatic systems, without adverse refractive distortions from ocean waves. In 2013, a drone-based airborne fluid lensing campaign conducted over the coral reef of Ofu Island, American Samoa, revealed complex 3D morphological, ecological, and bathymetric diversity at the cm-scale over a regional area. In this paper, we develop and validate supervised machine learning algorithm products tailored for accurate automated segmentation of coral reefs using airborne fluid lensing multispectral 3D imagery. Results show that airborne fluid lensing can significantly improve the accuracy of coral habitat mapping using remote sensing.

The machine learning algorithm is based on multidimensional naïve-Bayes maximum a posteriori (MAP) estimation. Provided a user-selected training subset of 3D multispectral images, comprising ~1% of the total dataset, the algorithm separates living structure from nonliving structure and segments the coral reef into four distinct morphological classes – branching coral, mounding coral, basalt rock, and sand. The user-selected training data and algorithm classification results are created and verified, respectively, with sub-cm-resolution ground-truth maps, manually generated from extensive in-situ mapping, underwater gigapixel photogrammetry, and visual inspection of the 3D dataset with subject matter experts.
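To make the classification step concrete, the sketch below shows a per-pixel Gaussian naive-Bayes MAP classifier over the four morphological classes named above, trained from a small labeled subset. The Gaussian likelihood, the feature layout (spectral bands plus depth), and all function names here are illustrative assumptions for readers of this abstract, not the authors' actual implementation.

# Minimal sketch of naive-Bayes MAP classification, assuming independent
# Gaussian features per class. Feature layout and class priors estimated
# from a hand-labeled ~1% training subset, as in the paper's description.
import numpy as np

CLASSES = ["branching coral", "mounding coral", "basalt rock", "sand"]

def fit_naive_bayes(X, y, n_classes):
    """Estimate per-class priors, feature means, and variances.

    X : (n_pixels, n_features) array, e.g. multispectral bands + depth.
    y : (n_pixels,) integer class labels from the labeled training subset.
    """
    priors, means, variances = [], [], []
    for c in range(n_classes):
        Xc = X[y == c]
        priors.append(len(Xc) / len(X))
        means.append(Xc.mean(axis=0))
        variances.append(Xc.var(axis=0) + 1e-9)  # guard against zero variance
    return np.array(priors), np.array(means), np.array(variances)

def predict_map(X, priors, means, variances):
    """Return the MAP class per pixel: argmax_c [log p(x|c) + log p(c)]."""
    # Log-likelihood under independent (naive) Gaussian features.
    log_like = -0.5 * (
        np.log(2 * np.pi * variances)[None, :, :]
        + (X[:, None, :] - means[None, :, :]) ** 2 / variances[None, :, :]
    ).sum(axis=2)
    log_post = log_like + np.log(priors)[None, :]
    return np.argmax(log_post, axis=1)

# Usage with synthetic stand-in data (real inputs would be fluid-lensed
# multispectral reflectance and bathymetry per pixel):
rng = np.random.default_rng(0)
X_train = rng.normal(size=(400, 4))
y_train = rng.integers(0, 4, size=400)
priors, means, variances = fit_naive_bayes(X_train, y_train, len(CLASSES))
labels = predict_map(rng.normal(size=(10, 4)), priors, means, variances)
print([CLASSES[i] for i in labels])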

The algorithm generates 3D cm-resolution data products such as living structure and morphology distribution for the Ofu Island coral reef ecosystem with 95% and 92% accuracy, respectively. By comparison, classification of m-resolution remote sensing imagery, representative of the effective spatial resolution of commonly-used airborne and spaceborne aquatic remote sensing instruments subject to ocean wave distortion, typically produces data products with 68% accuracy. These results suggest existing methodologies may not resolve coral reef ecosystems in sufficient detail for accurate determination of percent cover of living structure and morphology breakdown.

The methods presented here offer a new remote sensing approach enabling repeatable quantitative ecosystem assessment of aquatic systems, independent of ocean wave distortion and sea state. Aquatic remote sensing imagery, free from refractive distortion, appears necessary for accurate and quantitative health assessment capabilities for coral reef ecosystems at the cm-scale, over regional areas. The accurate and automated determination of percent cover and morphology distribution at cm-resolution may lead to a significantly improved understanding of reef ecosystem dynamics and responses in a rapidly-changing global climate.

PDF of Publication: 
Download from publisher's website.