Leaf Segmentation and Classification with a Complicated Background Using Deep Learning

Bibliographic Details
Published in: Agronomy (Basel), Vol. 10, No. 11, p. 1721
Main Authors: Yang, Kunlong; Zhong, Weizhen; Li, Fengguo
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 01.11.2020

Summary: The segmentation and classification of leaves in plant images pose a great challenge, especially when several leaves overlap in images with a complicated background. This paper studies the segmentation and classification of leaf images with complicated backgrounds using deep learning. First, more than 2500 leaf images with complicated backgrounds are collected and manually labeled with target and background pixels. Two thousand of them are fed into a Mask Region-based Convolutional Neural Network (Mask R-CNN) to train a model for leaf segmentation. Then, a training set containing more than 1500 images of 15 species is fed into a 16-layer very deep convolutional network (VGG16) to train a model for leaf classification. The best hyperparameters for these methods are found by comparing a variety of parameter combinations. The results show that the average Misclassification Error (ME) over 80 test images using Mask R-CNN is 1.15%, and the average classification accuracy over 150 test images using VGG16 reaches 91.5%. This indicates that these methods can effectively segment and classify leaf images with complicated backgrounds, and could serve as a reference for plant phenotype analysis and automatic plant classification.
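
The summary describes a two-stage pipeline: Mask R-CNN for pixel-level leaf segmentation, followed by VGG16 for 15-way species classification, evaluated with Misclassification Error (ME) and accuracy. The sketch below shows one way such a pipeline could be assembled with torchvision; it is not the authors' implementation, and the class counts, replaced heads, and the simple pixel-wise ME definition are assumptions for illustration only.

```python
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor
from torchvision.models.detection.mask_rcnn import MaskRCNNPredictor

NUM_SPECIES = 15  # per the abstract: 15 species in the classification set

# --- Stage 1: leaf segmentation with Mask R-CNN (assumed torchvision backbone) ---
seg_model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")
# Replace the box and mask heads for a 2-class task: background vs. leaf.
in_feat = seg_model.roi_heads.box_predictor.cls_score.in_features
seg_model.roi_heads.box_predictor = FastRCNNPredictor(in_feat, num_classes=2)
in_feat_mask = seg_model.roi_heads.mask_predictor.conv5_mask.in_channels
seg_model.roi_heads.mask_predictor = MaskRCNNPredictor(in_feat_mask, 256, num_classes=2)

# --- Stage 2: species classification with VGG16 ---
cls_model = torchvision.models.vgg16(weights="DEFAULT")
# Swap the final fully connected layer to output scores for the 15 species.
cls_model.classifier[6] = torch.nn.Linear(4096, NUM_SPECIES)

def misclassification_error(pred_mask: torch.Tensor, gt_mask: torch.Tensor) -> float:
    """One common ME definition: the fraction of pixels assigned to the wrong
    class (leaf vs. background). The paper's exact formula may differ."""
    return (pred_mask.bool() != gt_mask.bool()).float().mean().item()
```

Both models would then be fine-tuned on the labeled leaf images; segmentation quality is reported per test image as ME, and classification quality as top-1 accuracy on the held-out species images.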
ISSN: 2073-4395
DOI: 10.3390/agronomy10111721