Forest pest monitoring and early warning using UAV remote sensing and computer vision techniques


Bibliographic Details
Published in: Scientific Reports, Vol. 15, No. 1, pp. 401 - 20
Main Authors: Li, Xiaoyu; Wang, AChuan
Format: Journal Article
Language: English
Published: London: Nature Publishing Group UK, 02.01.2025

More Information
Summary: Unmanned aerial vehicle (UAV) remote sensing has revolutionized forest pest monitoring and early warning systems. However, the susceptibility of UAV-based object detection models to adversarial attacks raises concerns about their reliability and robustness in real-world deployments. To address this challenge, we propose SC-RTDETR, a novel framework for secure and robust object detection in forest pest monitoring using UAV imagery. SC-RTDETR integrates a soft-thresholding adaptive filtering module and a cascaded group attention mechanism into the Real-time Detection Transformer (RTDETR) architecture, significantly enhancing its resilience against adversarial perturbations. Extensive experiments on a real-world pine wilt disease dataset demonstrate the superior performance of SC-RTDETR, with an improvement of 7.1% in mean Average Precision (mAP) and 6.5% in F1-score under strong adversarial attack conditions compared to state-of-the-art methods. The ablation studies and visualizations provide insights into the effectiveness of the proposed components, validating their contributions to the overall robustness and performance of SC-RTDETR. Our framework offers a promising solution for accurate and reliable forest pest monitoring in non-secure environments.
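The record does not detail the internals of the soft-thresholding adaptive filtering module; as background, soft thresholding itself is the standard shrinkage operation used to suppress small, noise-like values (e.g. adversarial perturbations on feature maps). A minimal sketch, assuming a fixed threshold tau (the paper's module presumably learns it adaptively):

```python
import numpy as np

def soft_threshold(x, tau):
    """Soft-thresholding (shrinkage): reduce each value's magnitude by tau,
    zeroing anything whose magnitude is below tau. Small perturbations are
    removed entirely while large activations are only slightly attenuated."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

features = np.array([-2.0, -0.5, 0.0, 0.3, 1.5])
denoised = soft_threshold(features, tau=1.0)
# values with |x| < 1.0 are zeroed; the rest shrink toward zero by 1.0
```

Note that tau here is a hypothetical fixed parameter chosen for illustration; an adaptive variant would derive it per channel from the feature statistics.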
ISSN: 2045-2322
DOI: 10.1038/s41598-024-84464-3