Data-Aware Adaptive Pruning Model Compression Algorithm Based on a Group Attention Mechanism and Reinforcement Learning

Bibliographic Details
Published in: IEEE Access, Vol. 10, pp. 82396-82406
Main Authors: Yang, Zhi; Zhai, Yuan; Xiang, Yi; Wu, Jianquan; Shi, Jinliang; Wu, Ying
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2022
Summary: The success of convolutional neural networks (CNNs) benefits from the stacking of convolutional layers, which enlarges the model's receptive field for image data but also slows inference. To improve the inference speed of large convolutional network models without sacrificing much accuracy, a data-aware adaptive pruning algorithm is proposed. The algorithm consists of two parts: a channel pruning method based on an attention mechanism and a data-aware pruning policy based on reinforcement learning. Experimental results on the CIFAR-100 dataset show that accuracy drops by only 2.05%, 1.93%, and 5.66% after pruning the VGG19, ResNet56, and EfficientNet networks, respectively, while the speedup ratios reach 3.63x, 3.35x, and 1.14x, giving the best overall pruning trade-off among the compared methods. In addition, the generalization ability of the reconstructed models is evaluated on the ImageNet and FGVC Aircraft datasets, where the proposed algorithm again performs best, indicating that it learns data-related information during pruning, i.e., it is a data-aware algorithm.
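
The abstract does not detail either component, but the general shape of attention-guided channel pruning can be sketched. The PyTorch snippet below is an illustrative assumption, not the authors' implementation: a squeeze-and-excitation-style gate (here called ChannelAttention) scores a convolution's output channels on a calibration batch, and the lowest-scoring channels are removed. In the paper, a reinforcement-learning policy selects the per-layer pruning ratio; here it is fixed for simplicity.

```python
# Illustrative sketch of attention-guided channel pruning (PyTorch).
# ChannelAttention and prune_conv_by_attention are hypothetical names,
# not the authors' code; the paper's RL policy would choose prune_ratio
# per layer, which is hard-coded below.
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Squeeze-and-excitation-style gate that scores each channel."""
    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)          # squeeze: global average pool
        self.fc = nn.Sequential(                     # excitation: per-channel score
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        return self.fc(self.pool(x).view(b, c))      # (B, C) attention scores

def prune_conv_by_attention(conv: nn.Conv2d, scores: torch.Tensor,
                            prune_ratio: float) -> nn.Conv2d:
    """Keep the output channels with the highest mean attention score."""
    mean_scores = scores.mean(dim=0)                       # average over the batch
    n_keep = max(1, int(conv.out_channels * (1 - prune_ratio)))
    keep = torch.topk(mean_scores, n_keep).indices.sort().values
    new_conv = nn.Conv2d(conv.in_channels, n_keep, conv.kernel_size,
                         stride=conv.stride, padding=conv.padding,
                         bias=conv.bias is not None)
    with torch.no_grad():                                  # copy surviving filters
        new_conv.weight.copy_(conv.weight[keep])
        if conv.bias is not None:
            new_conv.bias.copy_(conv.bias[keep])
    return new_conv

# Usage: score channels on a calibration batch, then prune half of them.
conv = nn.Conv2d(3, 16, kernel_size=3, padding=1)
attn = ChannelAttention(16)
x = torch.randn(8, 3, 32, 32)                 # stand-in calibration data
with torch.no_grad():
    scores = attn(conv(x))
pruned = prune_conv_by_attention(conv, scores, prune_ratio=0.5)
print(pruned)                                 # Conv2d(3, 8, ...)
```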
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2022.3188119