Transfer Learning for Leaf Small Dataset Using Improved ResNet50 Network with Mixed Activation Functions

Bibliographic Details
Published in: Forests, Vol. 13, No. 12, p. 2072
Main Authors: Zhang, Ruolei; Zhu, Yijun; Ge, Zhangshangjie; Mu, Hongbo; Qi, Dawei; Ni, Haiming
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 01.12.2022

Summary: Taxonomic study of leaves is one of the most effective means of correctly identifying plant species. In this paper, mixed activation functions are used to improve the ResNet50 network and further increase the accuracy of leaf recognition. First, leaf images of 15 common tree species in northern China were collected at the Urban Forestry Demonstration Base of Northeast Forestry University (45°43′–45°44′ N, 126°37′–126°38′ E; artificial forest), and a small leaf dataset was established. Seven commonly used activation functions were then selected to modify the ResNet50 network structure, and the modified networks were applied to transfer learning on this small leaf dataset. On this basis, the five best-performing activation functions were chosen for the study of mixed activation functions in deep learning. Any two of these five functions were paired, with order distinguished, giving twenty combinations in total. In each combination, the first activation function replaced the ReLU immediately following the addition operation in every residual block of ResNet50, and the second activation function replaced the ReLUs at the remaining positions. The experimental results show that, in transfer learning on the small leaf dataset with the ResNet50 deep residual network, an appropriate combination of mixed activation functions improves network performance to a certain extent. Among the combinations, ELU-Swish1 improves performance most significantly, reaching a final validation accuracy of 98.17%. Comparisons with GoogLeNet and VGG-16 further demonstrate the strong performance of the improved ELU-Swish1 ResNet50 (ES-ResNet50) architecture. Finally, tests on two other small leaf datasets, Flavia and Swedish, also confirm the improvement: ES-ResNet50 reaches validation accuracies of 99.30% and 99.39% on these datasets, respectively. These experiments show that the recognition performance of leaf transfer learning with the ES-ResNet50 network is indeed improved, which may be explained by the complementarity of the exponential gradients of the ELU and Swish1 activation functions in the negative region.
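The abstract does not include implementation details, but the residual-block modification it describes can be illustrated with a minimal PyTorch sketch. The sketch assumes torchvision's Bottleneck-based ResNet50 (torchvision >= 0.13 for the weights API) and uses nn.SiLU as Swish1; the names build_es_resnet50, mixed_forward, act_mid, and act_out are introduced here for illustration and do not appear in the paper.

import types

import torch.nn as nn
from torchvision.models import resnet50, ResNet50_Weights


def mixed_forward(self, x):
    # Re-implements torchvision's Bottleneck.forward with two separate
    # activations: act_mid at the two in-block ReLU positions and act_out
    # after the residual addition (attribute names are illustrative).
    identity = x
    out = self.act_mid(self.bn1(self.conv1(x)))
    out = self.act_mid(self.bn2(self.conv2(out)))
    out = self.bn3(self.conv3(out))
    if self.downsample is not None:
        identity = self.downsample(x)
    return self.act_out(out + identity)


def build_es_resnet50(num_classes=15):
    # Start from ImageNet-pretrained weights, as in transfer learning.
    model = resnet50(weights=ResNet50_Weights.IMAGENET1K_V1)
    for layer in (model.layer1, model.layer2, model.layer3, model.layer4):
        for block in layer:
            block.act_out = nn.ELU()   # ELU replaces the ReLU after the addition
            block.act_mid = nn.SiLU()  # Swish1 (SiLU) replaces the other in-block ReLUs
            block.forward = types.MethodType(mixed_forward, block)
    # New classification head for the 15 tree species in the small leaf dataset.
    model.fc = nn.Linear(model.fc.in_features, num_classes)
    return model

The stem ReLU before the first residual stage is left unchanged here, since the abstract describes replacements only inside the residual blocks; a typical use would be model = build_es_resnet50(num_classes=15) followed by fine-tuning on the leaf images.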
ISSN: 1999-4907
DOI: 10.3390/f13122072