Cross-supervised learning for cloud detection

Bibliographic Details
Published in: GIScience & Remote Sensing, Vol. 60, No. 1
Main Authors: Wu, Kang; Xu, Zunxiao; Lyu, Xinrong; Ren, Peng
Format: Journal Article
Language: English
Published: Taylor & Francis Group, 31.12.2023

Summary: We present a new learning paradigm, cross-supervised learning, and explore its use for cloud detection. The cross-supervised learning paradigm is characterized by both supervised training and mutually supervised training, performed by two base networks. In addition to individual supervised training on labeled data, the two base networks perform mutually supervised training on unlabeled data, each using the prediction results provided by the other. Specifically, we develop In-extensive Nets for implementing the base networks. The In-extensive Nets consist of two Intensive Nets and are trained using the cross-supervised learning paradigm. Each Intensive Net leverages information from the labeled cloudy images using a focal attention guidance module (FAGM) and a regression block. The cross-supervised learning paradigm empowers the In-extensive Nets to learn from both labeled and unlabeled cloudy images, substantially reducing the number of labeled cloudy images (which require expensive manual annotation) needed for training. Experimental results verify that the In-extensive Nets perform well and have a clear advantage in situations where only a few labeled cloudy images are available for training. The implementation code for the proposed paradigm is available at https://gitee.com/kang_wu/in-extensive-nets.
ISSN: 1548-1603, 1943-7226
DOI: 10.1080/15481603.2022.2147298
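
The summary above describes the training scheme only at a high level. Purely as an illustrative sketch (not the authors' implementation, which is available at the repository linked above), the following PyTorch-style code shows one way a cross-supervised training step could combine a supervised loss on labeled images with mutual supervision from each network's pseudo-labels on unlabeled images. The toy networks, loss weights, thresholds, and tensor shapes here are assumptions for illustration and do not reproduce the Intensive Net, FAGM, or regression block.

# Hypothetical sketch of a cross-supervised training step (PyTorch).
# Two base networks; supervised loss on labeled data, mutual supervision
# on unlabeled data via each other's detached pseudo-labels.
import torch
import torch.nn as nn

def toy_segmenter() -> nn.Sequential:
    # Tiny stand-in for one base network producing binary cloud-mask logits.
    return nn.Sequential(
        nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
        nn.Conv2d(16, 1, 3, padding=1),
    )

net_a, net_b = toy_segmenter(), toy_segmenter()
opt = torch.optim.Adam(list(net_a.parameters()) + list(net_b.parameters()), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

def cross_supervised_step(x_lab, y_lab, x_unlab, unsup_weight=0.5):
    # Individual supervised training on the labeled cloudy images.
    logits_a, logits_b = net_a(x_lab), net_b(x_lab)
    sup_loss = bce(logits_a, y_lab) + bce(logits_b, y_lab)

    # Mutual supervision: each network is trained toward the other's
    # thresholded, detached prediction on the unlabeled cloudy images.
    ua, ub = net_a(x_unlab), net_b(x_unlab)
    pseudo_a = (torch.sigmoid(ua) > 0.5).float().detach()
    pseudo_b = (torch.sigmoid(ub) > 0.5).float().detach()
    mutual_loss = bce(ua, pseudo_b) + bce(ub, pseudo_a)

    loss = sup_loss + unsup_weight * mutual_loss
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

# Toy usage with random tensors standing in for labeled and unlabeled batches.
x_l = torch.randn(2, 3, 64, 64)
y_l = torch.randint(0, 2, (2, 1, 64, 64)).float()
x_u = torch.randn(4, 3, 64, 64)
print(cross_supervised_step(x_l, y_l, x_u))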