HCFNN: High-order coverage function neural network for image classification
Published in | Pattern recognition Vol. 131; p. 108873 |
---|---|
Main Authors | , , , , , |
Format | Journal Article |
Language | English |
Published | Elsevier Ltd, 01.11.2022 |
Subjects | |
Summary: | •A more flexible HCF neuron model for DNNs is introduced; it constructs geometries in an n-dimensional space by changing weights and hyper-parameters and thus possesses higher variability and plasticity. Furthermore, the approximation theorem and its proof for arbitrary continuous functions are presented, demonstrating the fitting ability of the HCF neuron model. •An HCFNN architecture based on the HCF neuron is proposed; it is used to mine specific feature representations and achieve adaptive parameter learning. A novel adaptive optimization method for weights and hyper-parameters is then proposed to achieve effective network learning. The learned network model has better expression and learning ability with fewer neurons. •Experiments are conducted on nine datasets across several domains, including the two-spirals problem, natural object recognition, face recognition, and person re-ID. The results show that the proposed method has better learning performance and generalization ability than the commonly used M-P and RBF neural networks. In addition, it improves the performance of various image recognition tasks and generalizes well. |
Recent advances in deep neural networks (DNNs) have mainly focused on innovations in network architecture and loss function. In this paper, we introduce a flexible high-order coverage function (HCF) neuron model to replace the fully-connected (FC) layers. The approximation theorem and proof for the HCF are also presented to demonstrate its fitting ability. Unlike the FC layers, which cannot handle high-dimensional data well, the HCF utilizes weight coefficients and hyper-parameters to mine underlying geometries with arbitrary shapes in an n-dimensional space. To explore the power and potential of our HCF neuron model, a high-order coverage function neural network (HCFNN) is proposed, which incorporates the HCF neuron as its building block. Moreover, a novel adaptive optimization method for weights and hyper-parameters is designed to achieve effective network learning. Comprehensive experiments on nine datasets in several domains validate the effectiveness and generalizability of the HCF and HCFNN. The proposed method provides a new perspective for further developments in DNNs and lends itself to wide application in image classification. The source code is available at https://github.com/Tough2011/HCFNet.git |
---|---|
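The record describes the HCF neuron only at a high level: a learnable unit whose weights and hyper-parameters shape a covered region in n-dimensional space, generalizing the hyperplane geometry of M-P neurons and the hypersphere geometry of RBF neurons. The sketch below is a hypothetical illustration of that idea, not the authors' implementation (see the linked repository for the actual code); the exact functional form, the `HCFLayer` name, and the softplus reparameterization of the order are all assumptions.

```python
import torch
import torch.nn as nn

class HCFLayer(nn.Module):
    """Illustrative sketch of a high-order coverage function (HCF) layer.

    Assumed form (hypothetical, for illustration only):
        y_j = theta_j - sum_i w_ji * |x_i - c_ji| ** p_j
    where c are learnable centers, w learnable per-dimension weights,
    theta a learnable threshold, and p a learnable "order" that
    controls the geometry of the covered region.
    """

    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        self.centers = nn.Parameter(torch.randn(out_features, in_features))
        self.weights = nn.Parameter(torch.ones(out_features, in_features))
        self.theta = nn.Parameter(torch.zeros(out_features))
        # Raw order parameter; softplus keeps the effective order positive.
        self.raw_p = nn.Parameter(torch.zeros(out_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        p = nn.functional.softplus(self.raw_p) + 1.0  # order >= 1
        # (batch, out, in): per-dimension distance to each neuron's center
        diff = (x.unsqueeze(1) - self.centers).abs()
        dist = (self.weights.abs() * diff.pow(p.view(1, -1, 1))).sum(-1)
        return self.theta - dist  # positive inside the covered region


if __name__ == "__main__":
    layer = HCFLayer(in_features=784, out_features=10)
    logits = layer(torch.randn(32, 784))
    print(logits.shape)  # torch.Size([32, 10])
```

Because the centers, weights, and order are all ordinary `nn.Parameter`s here, standard back-propagation updates them jointly, which loosely mirrors the abstract's stated goal of adaptive optimization of both weights and hyper-parameters.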
ISSN: | 0031-3203 1873-5142 |
DOI: | 10.1016/j.patcog.2022.108873 |