A Neural Network Pruning Approach Based on Local and Global Statistical Distributions

Bibliographic Details
Published in: International Conference on Systems and Informatics, pp. 1-5
Main Authors: Liu, Bowen; Xie, Xuesong; Zhang, Xiaoling
Format: Conference Proceeding
Language: English
Published: IEEE, 14.12.2024
ISSN: 2689-7148
DOI: 10.1109/ICSAI65059.2024.10893846

Summary: Deep neural networks have demonstrated unique advantages in many domains, but the high memory and computational demands of deploying them make it difficult to exploit these strengths on edge computing devices. To overcome this issue, we describe a pruning method that integrates local and global statistical distributions: it uses the scale factors and running variances of batch normalization layers to identify redundant filters, then obtains a compact model by removing those filters. On the VOC2007 dataset, we pruned 68.3% of the parameters and 51.9% of the computational cost of the YOLOV5s network with only a 6.5% loss in mAP_0.5. On the VOC2012 dataset, we pruned 69.7% of the parameters and 51.8% of the computational cost with only a 5.9% loss in mAP_0.5.
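
The abstract does not spell out the exact scoring rule, but a batch-normalization-based criterion of this kind can be sketched in PyTorch. The sketch below is an assumption, not the authors' implementation: each filter is scored by |gamma| * sqrt(running variance) from its BN layer, a single quantile threshold over the pooled scores supplies the global criterion, and a hypothetical per-layer minimum keep ratio (min_keep_ratio) stands in for the local constraint; the function names bn_filter_scores and build_prune_masks are likewise illustrative.

    import torch
    import torch.nn as nn

    def bn_filter_scores(bn: nn.BatchNorm2d) -> torch.Tensor:
        # Assumed score per output channel: |gamma| * sqrt(running variance).
        # A small scale factor combined with low activation variance suggests
        # the corresponding convolution filter contributes little.
        return bn.weight.detach().abs() * bn.running_var.detach().sqrt()

    def build_prune_masks(model: nn.Module,
                          prune_ratio: float = 0.5,
                          min_keep_ratio: float = 0.1) -> dict:
        # Returns {bn_layer: boolean mask of channels to KEEP}.
        bns = [m for m in model.modules() if isinstance(m, nn.BatchNorm2d)]
        all_scores = torch.cat([bn_filter_scores(bn) for bn in bns])
        # Global criterion: one threshold over the pooled scores of all layers.
        threshold = torch.quantile(all_scores, prune_ratio)
        masks = {}
        for bn in bns:
            scores = bn_filter_scores(bn)
            keep = scores > threshold
            # Local safeguard (assumed): retain at least min_keep_ratio of each
            # layer's channels so no layer is pruned away entirely.
            min_keep = max(1, int(min_keep_ratio * scores.numel()))
            if int(keep.sum()) < min_keep:
                keep = torch.zeros_like(keep)
                keep[scores.topk(min_keep).indices] = True
            masks[bn] = keep
        return masks

Under this reading, the compact model would come from physically slicing the masked filters out of each convolution (and the matching input channels of the next layer) and then fine-tuning to recover accuracy.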