Cloud detection with boundary nets

Bibliographic Details
Published in: ISPRS Journal of Photogrammetry and Remote Sensing, Vol. 186, pp. 218–231
Main Authors: Wu, Kang; Xu, Zunxiao; Lyu, Xinrong; Ren, Peng
Format: Journal Article
Language: English
Published: Elsevier B.V., 01.04.2022
Summary: In satellite optical images, clouds normally appear at different scales and with varied boundaries. To accurately capture these variable visual forms, we present a deep learning based strategy, Boundary Nets, which generates a cloud mask for a cloudy image. The Boundary Nets consist of two nets: (a) a scalable-boundary net and (b) a differentiable-boundary net. The scalable-boundary net extracts multi-scale features from a cloudy image and comprehensively characterizes clouds with variable boundary scales through a multi-scale fusion module. The multi-scale feature extraction and fusion consistently capture clouds of different sizes, producing a multi-scale cloud mask for the cloudy image. The differentiable-boundary net characterizes the difference between the multi-scale cloud mask and the ground-truth cloud mask through a residual architecture, generating a difference cloud mask that complements the boundary details of the multi-scale cloud mask. Finally, the overall cloud mask is obtained by fusing the multi-scale cloud mask and the difference cloud mask. During training, multiple key parts of the Boundary Nets receive supervision information in a distributed manner, and the losses are summed for an overall training computation. This distributed, overall supervision avoids training the two nets separately and tightly couples them into a single framework. Experimental results validate that the Boundary Nets perform well and achieve outstanding results. The code for the proposed Boundary Nets is available at https://gitee.com/kang_wu/boundary-nets.
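
To make the described structure concrete, the following is a minimal PyTorch sketch of how the two nets and the summed, distributed supervision might fit together. The backbone layers, channel widths, scale factors, and the exact way the difference mask is fused are assumptions made purely for illustration; the authors' actual implementation is in the linked repository.

# A minimal sketch of the two-net structure described in the summary.
# All module names, channel sizes, and scale choices are illustrative assumptions,
# not the authors' architecture.
import torch
import torch.nn as nn
import torch.nn.functional as F


def conv_block(in_ch, out_ch):
    """3x3 conv + BN + ReLU, used here as a generic feature extractor."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )


class ScalableBoundaryNet(nn.Module):
    """Extracts features at several scales and fuses them into a multi-scale cloud mask."""
    def __init__(self, in_ch=3, base_ch=32, scales=(1, 2, 4)):
        super().__init__()
        self.scales = scales
        self.branches = nn.ModuleList([conv_block(in_ch, base_ch) for _ in scales])
        # Fuse the concatenated multi-scale features into a single-channel mask.
        self.fuse = nn.Conv2d(base_ch * len(scales), 1, 1)

    def forward(self, x):
        h, w = x.shape[-2:]
        feats = []
        for s, branch in zip(self.scales, self.branches):
            xs = F.avg_pool2d(x, s) if s > 1 else x      # downsample the input
            f = branch(xs)                                # per-scale features
            feats.append(F.interpolate(f, size=(h, w), mode="bilinear",
                                       align_corners=False))
        return torch.sigmoid(self.fuse(torch.cat(feats, dim=1)))  # multi-scale mask


class DifferentiableBoundaryNet(nn.Module):
    """Predicts a residual 'difference mask' that refines boundary details."""
    def __init__(self, in_ch=4, base_ch=32):
        super().__init__()
        self.body = nn.Sequential(conv_block(in_ch, base_ch), nn.Conv2d(base_ch, 1, 1))

    def forward(self, image, coarse_mask):
        # Residual prediction conditioned on the image and the coarse mask.
        return torch.tanh(self.body(torch.cat([image, coarse_mask], dim=1)))


class BoundaryNets(nn.Module):
    def __init__(self):
        super().__init__()
        self.scalable = ScalableBoundaryNet()
        self.differentiable = DifferentiableBoundaryNet()

    def forward(self, x):
        coarse = self.scalable(x)                           # multi-scale cloud mask
        diff = self.differentiable(x, coarse)               # boundary-detail complement
        final = torch.clamp(coarse + diff, 1e-6, 1 - 1e-6)  # fused mask, clamped for BCE stability
        return coarse, diff, final


def distributed_loss(coarse, final, target):
    """Supervise both the intermediate and the fused mask and sum the losses,
    mirroring the 'distributed, overall supervision' described above."""
    bce = F.binary_cross_entropy
    return bce(coarse, target) + bce(final, target)

A forward pass such as model = BoundaryNets(); coarse, diff, final = model(torch.randn(2, 3, 256, 256)) returns both the intermediate and fused masks, so both can be supervised against the ground-truth mask with a single summed loss, as the summary describes.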
ISSN: 0924-2716
1872-8235
DOI: 10.1016/j.isprsjprs.2022.02.010