Deep Learning With Weak Supervision for Disaster Scene Description in Low-Altitude Imagery


Bibliographic Details
Published in: IEEE Transactions on Geoscience and Remote Sensing, Vol. 60, pp. 1-10
Main Authors: Presa-Reyes, Maria; Tao, Yudong; Chen, Shu-Ching; Shyu, Mei-Ling
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2022

Summary: Pictures or videos captured from a low-altitude aircraft or an unmanned aerial vehicle are a fast and cost-effective way to survey an affected scene for the quick and precise assessment of a catastrophic event's impacts and damages. Using advanced techniques, such as deep learning, it is now possible to automate the description of disaster scenes and identify features in captured images or recorded videos to gain situational awareness. However, building a large-scale, high-quality dataset with annotated disaster-related features for supervised model training is time-consuming and costly. In this article, we propose a weakly supervised approach to train a deep neural network on low-altitude imagery with highly imbalanced and noisy crowd-sourced labels. We further make use of the rich spatiotemporal data obtained from the pictures and their sequence information to enhance the model's performance during training via label propagation. Our approach achieves the highest score among all the submitted runs in the TRECVID 2020 Disaster Scene Description and Indexing (DSDI) Challenge, indicating its superior capability in retrieving disaster-related video clips compared to other proposed methods.
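The abstract's idea of using the temporal sequence of frames to clean up noisy crowd-sourced labels can be illustrated with a minimal sketch. The function below is a hypothetical, simplified form of label propagation (not the authors' actual method): each frame's noisy multi-label score vector is blended with the mean of its temporal neighbors, on the assumption that disaster-related features persist across adjacent frames. The names `propagate_labels`, `alpha`, and `window` are illustrative only.

```python
import numpy as np

def propagate_labels(frame_labels, alpha=0.5, window=1):
    """Smooth noisy per-frame multi-label scores via temporal neighbors.

    frame_labels: (num_frames, num_classes) array of noisy scores in [0, 1].
    alpha: weight kept on the frame's own (possibly noisy) labels.
    window: number of neighboring frames on each side to average over.
    """
    frame_labels = np.asarray(frame_labels, dtype=float)
    n = len(frame_labels)
    smoothed = np.empty_like(frame_labels)
    for i in range(n):
        lo, hi = max(0, i - window), min(n, i + window + 1)
        # Drop frame i itself from its neighborhood before averaging.
        neighbors = np.delete(frame_labels[lo:hi], i - lo, axis=0)
        if len(neighbors) == 0:
            smoothed[i] = frame_labels[i]
        else:
            smoothed[i] = alpha * frame_labels[i] + (1 - alpha) * neighbors.mean(axis=0)
    return smoothed
```

For example, a frame whose annotator missed a label present in both adjacent frames gets a partially recovered score (0.5 with the defaults), which can then be thresholded or used as a soft training target.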
ISSN: 0196-2892, 1558-0644
DOI: 10.1109/TGRS.2021.3129443