Multiple classifier for concatenate-designed neural network

Bibliographic Details
Published in: Neural Computing & Applications, Vol. 34, No. 2, pp. 1359-1372
Main Authors: Chan, Ka-Hou; Im, Sio-Kei; Ke, Wei
Format: Journal Article
Language: English
Published: London: Springer London, 2022 (Springer Nature B.V.)

Summary: This article introduces a multiple classifier method to improve the performance of concatenate-designed neural networks, such as ResNet and DenseNet, with the purpose of alleviating the pressure on the final classifier. We give the design of the classifiers, which collect the features produced between the network sets, and present the constituent layers and the activation function for the classifiers, to calculate the classification score of each classifier. We use the L₂eˣ normalization method to obtain the classification score instead of the Softmax normalization. We also determine the conditions that can enhance convergence. As a result, the proposed classifiers significantly improve the accuracy in the experimental cases and show that the method not only performs better than the original models, but also converges faster. Moreover, our classifiers are general and can be applied to all classification-related concatenate-designed network models.
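To illustrate the idea described in the summary, below is a minimal PyTorch-style sketch (not the authors' code): a lightweight classifier head is attached after each network "set" (stage) of a concatenate-designed backbone, and each head produces its own classification score. The names AuxiliaryClassifier and MultiClassifierNet, the pooling-plus-linear head composition, and the reading of the L₂eˣ normalization as "exponentiate the logits, then divide by their L2 norm" are assumptions for illustration only; the paper specifies the exact constituent layers, activation function, and normalization.

```python
# Hypothetical sketch of stage-wise classifiers with L2-normalized exponential scores.
import torch
import torch.nn as nn

class AuxiliaryClassifier(nn.Module):
    """Assumed lightweight head attached after one network set (stage)."""
    def __init__(self, in_channels: int, num_classes: int):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)          # collapse spatial dimensions
        self.fc = nn.Linear(in_channels, num_classes)

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        z = self.fc(self.pool(features).flatten(1))  # raw logits
        # Assumed reading of the L2-e^x normalization: exponentiate the logits,
        # then divide by their L2 norm (Softmax would divide by the L1 sum instead).
        e = torch.exp(z)
        return e / e.norm(p=2, dim=1, keepdim=True)

class MultiClassifierNet(nn.Module):
    """Backbone stages (e.g. ResNet/DenseNet blocks) with one classifier per stage."""
    def __init__(self, stages: nn.ModuleList, stage_channels, num_classes: int):
        super().__init__()
        self.stages = stages
        self.heads = nn.ModuleList(
            AuxiliaryClassifier(c, num_classes) for c in stage_channels
        )

    def forward(self, x: torch.Tensor):
        scores = []
        for stage, head in zip(self.stages, self.heads):
            x = stage(x)                # features produced between network sets
            scores.append(head(x))      # classification score from this stage
        return scores                   # the last score corresponds to the final classifier
```

During training, each per-stage score would presumably contribute its own loss term, which is how intermediate classifiers can relieve the pressure on the final classifier; the paper gives the actual training conditions that enhance convergence.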
ISSN: 0941-0643
eISSN: 1433-3058
DOI: 10.1007/s00521-021-06462-0