EACP: An effective automatic channel pruning for neural networks

Bibliographic Details
Published in: Neurocomputing (Amsterdam), Vol. 526, pp. 131-142
Main Authors: Liu, Yajun; Wu, Dakui; Zhou, Wenju; Fan, Kefeng; Zhou, Zhiheng
Format: Journal Article
Language: English
Published: Elsevier B.V., 14.03.2023

Summary: The large data scale and computational resources required by Convolutional Neural Networks (CNNs) hinder their practical deployment on mobile devices. Channel pruning has become one of the most efficient methods for addressing this problem, and many existing studies have proven its practicability in the field of model compression. Current channel pruning methods mainly start from assessing the importance of channels or manually setting evaluation criteria, which requires unnecessary human intervention and lacks automation. In this paper, an effective automatic channel pruning (EACP) method for neural networks is proposed. Specifically, we adopt the k-means++ method to cluster filters with similar features hierarchically in each convolutional layer, forming an initial compact compression structure. Subsequently, we use an improved social group optimization (SGO) algorithm to iteratively search and optimize the compression of the post-clustered structure and find the optimal compressed structure. The effectiveness of the proposed approach is evaluated with three leading CNN models on two image classification datasets. On CIFAR-10, our method reduces the FLOPs of GoogLeNet by 58.10% and improves accuracy by 0.20% over the baseline.
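The per-layer clustering step that forms the initial compact structure can be sketched as below. This is an illustrative reconstruction, not the authors' implementation: the function names, the toy 16-filter kernel, and the choice of k are assumptions; only the idea of k-means++-seeded clustering of flattened filters per convolutional layer comes from the summary.

```python
import numpy as np

def kmeans_pp_init(X, k, rng):
    """k-means++ seeding: first center uniform, later centers drawn
    with probability proportional to squared distance to nearest center."""
    centers = [X[rng.integers(len(X))]]
    for _ in range(k - 1):
        d2 = np.min([((X - c) ** 2).sum(axis=1) for c in centers], axis=0)
        centers.append(X[rng.choice(len(X), p=d2 / d2.sum())])
    return np.stack(centers)

def cluster_filters(weights, k, n_iter=20, seed=0):
    """Cluster the filters of one conv layer.

    weights: array of shape (out_channels, in_channels, kh, kw).
    Returns a cluster label per output filter; filters sharing a label
    are candidates to be merged/pruned into one representative channel.
    """
    rng = np.random.default_rng(seed)
    X = weights.reshape(weights.shape[0], -1)  # one flat vector per filter
    centers = kmeans_pp_init(X, k, rng)
    for _ in range(n_iter):  # standard Lloyd iterations
        labels = ((X[:, None, :] - centers[None]) ** 2).sum(-1).argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels

# Toy usage: a hypothetical layer with 16 filters, clustered into 4 groups.
w = np.random.default_rng(1).normal(size=(16, 3, 3, 3))
labels = cluster_filters(w, k=4)
```

In EACP the per-layer cluster counts are not fixed by hand; the improved SGO search described in the summary is what selects the compressed structure, so `k` here stands in for whatever width that search assigns to the layer.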
ISSN: 0925-2312, 1872-8286
DOI: 10.1016/j.neucom.2023.01.014