Graph Expansion in Pruned Recurrent Neural Network Layers Preserve Performance
Format | Journal Article |
Language | English |
Published | 17.03.2024 |
Summary: | The expansion property of a graph refers to its strong connectivity combined with sparseness. It has been reported that deep neural networks can be pruned to a high degree of sparsity while maintaining their performance. Such pruning is essential for performing real-time sequence learning tasks with recurrent neural networks on resource-constrained platforms. We prune recurrent networks such as RNNs and LSTMs while maintaining a large spectral gap of the underlying graphs, thereby ensuring their layerwise expansion properties. We also study the time-unfolded recurrent network graphs in terms of the properties of their bipartite layers. Experimental results on the benchmark sequential MNIST, CIFAR-10, and Google speech command data show that expander graph properties are key to preserving the classification accuracy of RNNs and LSTMs. |
DOI: | 10.48550/arxiv.2403.11100 |
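As a rough illustration of the idea summarized above (not the authors' code, which sits behind the DOI), the following Python sketch prunes a recurrent weight matrix with a random sparse bipartite mask and measures the spectral gap of the resulting connectivity graph. The function names, the per-unit fan-in `d`, and the 128x128 layer size are all hypothetical choices for this sketch.

```python
import numpy as np

def random_regular_mask(n_out, n_in, d, seed=None):
    # Binary pruning mask: each output unit keeps d randomly chosen inputs,
    # so the layer's connectivity forms a sparse bipartite graph.
    rng = np.random.default_rng(seed)
    mask = np.zeros((n_out, n_in), dtype=np.float32)
    for i in range(n_out):
        mask[i, rng.choice(n_in, size=d, replace=False)] = 1.0
    return mask

def spectral_gap(mask):
    # Gap between the two largest singular values of the biadjacency
    # matrix; a large gap is the hallmark of a bipartite expander.
    s = np.linalg.svd(mask, compute_uv=False)
    return s[0] - s[1]

# Hypothetical example: prune a 128x128 recurrent weight matrix to ~10% density.
rng = np.random.default_rng(0)
W = rng.standard_normal((128, 128)).astype(np.float32)
mask = random_regular_mask(128, 128, d=13, seed=0)
W_pruned = W * mask
print(f"density = {mask.mean():.3f}, spectral gap = {spectral_gap(mask):.2f}")
```

One could compare the spectral gap of such a random mask against a magnitude-based pruning mask at the same density; the abstract's claim is that masks with good expansion (large gap) better preserve RNN/LSTM accuracy.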