BiSeNeXt: a yam leaf and disease segmentation method based on an improved BiSeNetV2 in complex scenes

Bibliographic Details
Published in: Frontiers in Plant Science, Vol. 16
Main Authors: Lu, Bibo; Lu, Yanjun; Liang, Di; Yang, Jie
Format: Journal Article
Language: English
Published: Frontiers Media S.A., 05.08.2025

Summary:
Introduction: Yam is an important medicinal and edible crop, but its quality and yield are greatly affected by leaf diseases. Research on yam leaf disease segmentation remains unexplored, and challenges such as leaf overlapping, uneven lighting, and irregular disease spots in complex environments limit segmentation accuracy.
Methods: To address these challenges, this paper introduces the first yam leaf disease segmentation dataset and proposes BiSeNeXt, an enhanced method based on BiSeNetV2. First, a dynamic feature extraction block (DFEB) improves the precision of leaf and disease edge pixels and reduces lesion omission through dynamic receptive-field convolution (DRFConv) and pixel shuffle (PixelShuffle) downsampling. Second, an efficient asymmetric multi-scale attention (EAMA) module alleviates lesion adhesion by combining asymmetric convolution with a multi-scale parallel structure. Finally, a PointRefine decoder adaptively selects uncertain points in the image predictions and refines them point by point, producing accurate segmentation of leaves and spots.
Results: Experiments show that the approach achieves a 97.04% intersection over union (IoU) for leaf segmentation and an 84.75% IoU for disease segmentation. Compared with DeepLabV3+, the proposed method improves the IoU of leaf and disease segmentation by 2.22% and 5.58%, respectively, while its FLOPs and parameter count are only 11.81% and 7.81% of those of DeepLabV3+.
Discussion: The proposed method can therefore efficiently and accurately extract yam leaf spots in complex scenes, providing a solid foundation for analyzing yam leaves and diseases.
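The PointRefine step described in the Methods paragraph refines only the most uncertain locations of a coarse prediction. The sketch below illustrates one common way to pick such points, assuming a PointRend-style top-2 score margin as the uncertainty measure; the function name and the margin criterion are illustrative assumptions, not the paper's exact PointRefine implementation.

```python
import torch


def select_uncertain_points(logits: torch.Tensor, num_points: int) -> torch.Tensor:
    """Pick the most uncertain pixel locations from coarse segmentation logits.

    Uncertainty is taken here as the negative margin between the top-2 class
    scores (PointRend-style); this is an illustrative stand-in for the paper's
    PointRefine selection rule, not the authors' implementation.

    logits: (B, C, H, W) coarse per-class scores.
    Returns: (B, num_points) flat indices into H*W for each image.
    """
    b, c, h, w = logits.shape
    flat = logits.view(b, c, h * w)                      # (B, C, H*W)
    top2 = flat.topk(2, dim=1).values                    # (B, 2, H*W)
    uncertainty = -(top2[:, 0] - top2[:, 1])             # small margin => uncertain
    return uncertainty.topk(num_points, dim=1).indices   # (B, num_points)


if __name__ == "__main__":
    # Hypothetical 3-class setup: background / leaf / lesion.
    coarse = torch.randn(2, 3, 64, 64)
    idx = select_uncertain_points(coarse, num_points=128)
    print(idx.shape)  # torch.Size([2, 128])
```

In a refinement decoder of this kind, features are sampled at the selected indices and re-classified by a small point-wise head, so only ambiguous boundary pixels (e.g., leaf and lesion edges) incur extra computation.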
Bibliography:
Reviewed by: Nitin Goyal, Central University of Haryana, India; Yan Guo, Henan Academy of Agricultural Sciences, China
Edited by: Thomas Thomidis, International Hellenic University, Greece
ISSN: 1664-462X
DOI: 10.3389/fpls.2025.1602102