Enhancing Precision Agriculture Pest Control: A Generalized Deep Learning Approach With YOLOv8-Based Insect Detection

Bibliographic Details
Published in: IEEE Access, Vol. 12, pp. 84420-84434
Main Authors: Vilar-Andreu, Mario; Garcia, Laura; Garcia-Sanchez, Antonio-Javier; Asorey-Cacheda, Rafael; Garcia-Haro, Joan
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2024
Summary: Precision Agriculture (PA) is gaining new momentum due to its ability to accurately adjust resources to a crop's needs while maintaining or enhancing quality levels. However, crop-damaging pests compromise yields, jeopardizing the benefits of PA. Computer vision-based pest detection techniques offer promising avenues to avoid potential losses for farmers. The recent object detection framework YOLOv8 (You Only Look Once), applied to real-time insect monitoring, is an open-source, cutting-edge PA approach based on Convolutional Neural Network (CNN) models that enables precise and quick decision making in agricultural crops. Under this umbrella, traditional pest studies using YOLO or other deep-learning solutions focus on only one or a few insect species in specific crops and therefore provide an excessively narrow solution. In this paper, we propose a new way of using YOLO for pest detection from a generalist perspective by intensively testing a YOLOv8-based tool that implements a single insect category. The goal is to detect the presence of any type of insect in any type of crop in real time. A comprehensive performance evaluation is carried out using a well-known dataset. The results of the training, validation, and testing phases are then discussed, obtaining an mAP_{50} value of 0.967 for the m model and an mAP_{50-95} value of 0.632 for the l model. Finally, we also identify the requirements for elaborating a complete and useful dataset able to unleash the full potential of YOLOv8.
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2024.3413979
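
The approach described in the summary, a YOLOv8 detector trained on a single generic "insect" class and evaluated with mAP_{50} and mAP_{50-95}, can be illustrated with a minimal sketch using the Ultralytics YOLO API. The dataset configuration, file names, and hyperparameters below are illustrative assumptions, not the authors' actual setup.

from ultralytics import YOLO  # Ultralytics YOLOv8 package

# Hypothetical single-class dataset config (insects.yaml):
#   path: datasets/insects
#   train: images/train
#   val: images/val
#   names:
#     0: insect

# Start from a pretrained medium ("m") checkpoint; the paper also reports results for the "l" model
model = YOLO("yolov8m.pt")

# Train on the generic single-class insect dataset (epochs and image size are assumptions)
model.train(data="insects.yaml", epochs=100, imgsz=640)

# Validation reports mAP50 and mAP50-95, the metrics quoted in the summary
metrics = model.val()
print(f"mAP50:    {metrics.box.map50:.3f}")
print(f"mAP50-95: {metrics.box.map:.3f}")

# Real-time-style inference on a crop image (file name is hypothetical)
results = model("field_sample.jpg")
results[0].show()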