Graph Expansion in Pruned Recurrent Neural Network Layers Preserve Performance


Bibliographic Details
Published in: arXiv.org
Main Authors: Kalra, Suryam Arnav; Biswas, Arindam; Mitra, Pabitra; Basu, Biswajit
Format: Paper
Language: English
Published: Ithaca: Cornell University Library, arXiv.org, 17.03.2024

Summary: The expansion property of a graph refers to its strong connectivity combined with sparseness. It has been reported that deep neural networks can be pruned to a high degree of sparsity while maintaining their performance. Such pruning is essential for performing real-time sequence learning tasks with recurrent neural networks on resource-constrained platforms. We prune recurrent networks such as RNNs and LSTMs while maintaining a large spectral gap of the underlying graphs, ensuring their layerwise expansion properties. We also study the time-unfolded recurrent network graphs in terms of the properties of their bipartite layers. Experimental results for the benchmark sequential MNIST, CIFAR-10, and Google speech command data show that expander graph properties are key to preserving the classification accuracy of RNNs and LSTMs.
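The summary describes pruning recurrent weight matrices so that the surviving connections form a sparse bipartite graph with a large spectral gap. A minimal sketch of that idea is shown below; it is not the authors' implementation, and the helper names (`random_regular_mask`, `spectral_gap`) and the choice of degree `d` are illustrative assumptions. Random regular bipartite graphs are known to be good expanders with high probability, which is the property the mask aims for.

```python
import numpy as np

def random_regular_mask(n_out, n_in, d, rng):
    """Binary mask in which each output unit keeps exactly d random
    input connections, giving a sparse bipartite graph (hypothetical
    stand-in for the paper's expander-preserving pruning)."""
    mask = np.zeros((n_out, n_in), dtype=bool)
    for i in range(n_out):
        mask[i, rng.choice(n_in, size=d, replace=False)] = True
    return mask

def spectral_gap(mask):
    """Relative gap between the two largest singular values of the
    biadjacency matrix; a large gap indicates good expansion."""
    s = np.linalg.svd(mask.astype(float), compute_uv=False)
    return (s[0] - s[1]) / s[0]

rng = np.random.default_rng(0)
mask = random_regular_mask(128, 128, d=8, rng=rng)   # keeps 8/128 weights per row
W = rng.standard_normal((128, 128)) * mask           # pruned recurrent weight matrix
print(f"sparsity: {1 - mask.mean():.2%}, spectral gap: {spectral_gap(mask):.3f}")
```

In a pruning loop, the mask would be applied to the recurrent (hidden-to-hidden) weights after each update so that the expander structure of the layer graph is maintained throughout training.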
ISSN:2331-8422