Crop pest recognition using attention-embedded lightweight network under field conditions


Bibliographic Details
Published in: Applied Entomology and Zoology, Vol. 56, No. 4, pp. 427-442
Main Authors: Chen, Junde; Chen, Weirong; Zeb, Adnan; Zhang, Defu; Nanehkaran, Yaser Ahangari
Format: Journal Article
Language: English
Published: Tokyo: Springer Japan (Springer Nature B.V.), 01.11.2021

Summary: Plant pests have a negative effect on crop yields. If the various insect pests are not identified and controlled properly, they can spread quickly and cause a significant decline in agricultural production. To overcome these challenges, convolutional neural network (CNN)-based methods have shown excellent performance, as they perform automatic feature extraction for image identification and classification. In this study, to enhance the learning capability for pest images with cluttered backgrounds, MobileNet-V2 pre-trained on ImageNet was chosen as the backbone network, and an attention mechanism together with a classification activation map (CAM) was incorporated into the architecture to learn the significant pest information in input images. Moreover, an optimized loss function and two-stage transfer learning were adopted in model training. This kind of progressive learning first lets the model discover large-scale structures and then shifts its attention to fine details step by step, improving the identification accuracy of plant pest images. The proposed procedure achieves an average accuracy of 99.14% on a publicly available dataset, and even under heterogeneous background conditions the average accuracy reaches 92.79%. Experimental results prove the efficacy of the proposed procedure, and it outperforms other state-of-the-art methods.
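
The abstract gives no implementation details, but a minimal sketch of the described pipeline (ImageNet-pretrained MobileNet-V2 backbone, an attention block feeding a global-average-pooling head compatible with CAM visualisation, and two-stage transfer learning) might look as follows in Keras. The squeeze-and-excitation style attention block, the class count, the learning rates, and the epoch counts are assumptions and not values from the paper, and plain cross-entropy stands in for the authors' optimized loss function.

# Hypothetical sketch only: the paper's exact attention block, CAM wiring,
# and loss modifications are not specified in this record.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 10          # assumption: depends on the pest dataset used
IMG_SIZE = (224, 224)

def channel_attention(x, reduction=8):
    # Squeeze-and-excitation style channel attention, a generic stand-in
    # for the attention mechanism described in the abstract.
    channels = x.shape[-1]
    s = layers.GlobalAveragePooling2D()(x)
    s = layers.Dense(channels // reduction, activation="relu")(s)
    s = layers.Dense(channels, activation="sigmoid")(s)
    s = layers.Reshape((1, 1, channels))(s)
    return layers.Multiply()([x, s])

def build_model():
    backbone = tf.keras.applications.MobileNetV2(
        input_shape=IMG_SIZE + (3,), include_top=False, weights="imagenet")
    inputs = layers.Input(shape=IMG_SIZE + (3,))
    x = tf.keras.applications.mobilenet_v2.preprocess_input(inputs)
    x = backbone(x)
    x = channel_attention(x)                   # re-weight feature channels
    x = layers.GlobalAveragePooling2D()(x)     # GAP + dense head enables CAM
    outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)
    return models.Model(inputs, outputs), backbone

model, backbone = build_model()

# Stage 1: train only the new attention/classifier head on frozen ImageNet
# features, so the model first captures coarse, large-scale structure.
backbone.trainable = False
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss="categorical_crossentropy", metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=10)

# Stage 2: unfreeze the backbone and fine-tune end-to-end at a lower
# learning rate, shifting attention to fine pest details.
backbone.trainable = True
model.compile(optimizer=tf.keras.optimizers.Adam(1e-5),
              loss="categorical_crossentropy", metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=20)

Freezing the backbone first and then fine-tuning at a smaller learning rate is one common way to realise the coarse-to-fine, two-stage transfer learning the abstract describes; the CAM itself would be computed afterwards from the final dense-layer weights and the last convolutional feature maps.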
ISSN: 0003-6862
EISSN: 1347-605X
DOI: 10.1007/s13355-021-00732-y