Local minima found in the subparameter space can be effective for ensembles of deep convolutional neural networks

Bibliographic Details
Published in: Pattern Recognition, Vol. 109, p. 107582
Main Authors: Yang, Yongquan; Lv, Haijun; Chen, Ning; Wu, Yang; Zheng, Jiayi; Zheng, Zhongxi
Format: Journal Article
Language: English
Published: Elsevier Ltd, 01.01.2021
Summary:
Highlights:
• Local minima found in the subparameter space can be effective for ensembles of deep convolutional neural networks (CNNs).
• Finding local minima in the subparameter space makes the training stage for ensembles of deep CNNs more affordable.
• Multiple models obtained at the found local minima can be selected to achieve better ensemble results via ensemble selection.
• The selected models for ensembles of deep CNNs can be fused in the subparameter space to reduce the expense at the testing stage.

Abstract: Ensembles of deep convolutional neural networks (CNNs), which integrate multiple deep CNN models to achieve better generalization for an artificial intelligence application, now play an important role in ensemble learning due to the dominant position of deep learning. However, the usage of ensembles of deep CNNs remains limited, because the increasing complexity of deep CNN architectures and the emergence of high-dimensional data have made both the training and testing stages of such ensembles expensive. To alleviate this, we propose a new approach that finds multiple models converging to local minima in the subparameter space for ensembles of deep CNNs. The subparameter space here refers to the space constructed by a partial selection of the parameters, rather than the entire set of parameters, of a deep CNN architecture. We show that local minima found in the subparameter space of a deep CNN architecture can in fact be effective for ensembles of deep CNNs, yielding better generalization. Moreover, finding local minima in the subparameter space is more affordable at the training stage, and the multiple models at the found local minima can be selectively fused so that the ensemble generalizes better while the testing-stage expense is limited to that of a single deep CNN model. Demonstrations with MobilenetV2, Resnet50 and InceptionV4 (deep CNN architectures from lightweight to complex) on ImageNet, CIFAR-10 and CIFAR-100, respectively, lead us to believe that finding local minima in the subparameter space of a deep CNN architecture could be leveraged to broaden the usage of ensembles of deep CNNs.
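The abstract describes the approach only at a high level; implementation details are in the full text. The following is a minimal sketch of the core idea, assuming a PyTorch setup: only a chosen subset of a CNN's parameters (here, hypothetically, the classifier head of MobilenetV2) is trained to several local minima, and the resulting snapshots are fused by averaging that subset, so the testing-stage cost stays that of a single model. All helper names (select_subparams, fuse_subparams) and the choice of subset are illustrative, not taken from the paper.

    # Illustrative sketch of training and fusing models in a "subparameter
    # space" (a chosen subset of a CNN's parameters); not the paper's code.
    import copy
    import torch
    import torchvision

    def select_subparams(model, predicate):
        """Freeze all parameters except those the predicate selects."""
        names = []
        for name, p in model.named_parameters():
            if predicate(name):
                p.requires_grad_(True)
                names.append(name)
            else:
                p.requires_grad_(False)
        return names

    def fuse_subparams(base_state, snapshots, subparam_names):
        """Average the chosen subset across snapshots; keep the rest shared."""
        fused = copy.deepcopy(base_state)
        for name in subparam_names:
            fused[name] = torch.stack([s[name] for s in snapshots]).mean(dim=0)
        return fused

    model = torchvision.models.mobilenet_v2(num_classes=10)
    # Hypothetical subset choice: treat only the classifier head as the
    # subparameter space; the paper may select parameters differently.
    subparam_names = select_subparams(model, lambda n: n.startswith("classifier"))

    snapshots = []
    for restart in range(3):  # three runs -> three local minima (illustrative)
        opt = torch.optim.SGD(
            (p for p in model.parameters() if p.requires_grad), lr=0.01
        )
        # ... train the unfrozen subset to convergence on your data here ...
        snapshots.append(copy.deepcopy(model.state_dict()))

    # Because the frozen parameters are identical across runs, averaging only
    # the trained subset yields one fused network rather than several models.
    model.load_state_dict(fuse_subparams(model.state_dict(), snapshots, subparam_names))

Since the shared (frozen) parameters are the same in every snapshot, the fusion produces a single network whose testing cost equals that of one model, which is the efficiency property the abstract highlights.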
ISSN: 0031-3203, 1873-5142
DOI: 10.1016/j.patcog.2020.107582