Energy Consumption Analysis of Pruned Semantic Segmentation Networks on an Embedded GPU

Bibliographic Details
Published in: arXiv.org
Main Authors: Tessier, Hugo; Gripon, Vincent; Léonardon, Mathieu; Arzel, Matthieu; Bertrand, David; Hannagan, Thomas
Format: Paper, Journal Article
Language: English
Published: Ithaca: Cornell University Library, arXiv.org, 13.06.2022

More Information
Summary: Deep neural networks are the state of the art in many computer vision tasks. Their deployment in the context of autonomous vehicles is of particular interest, since the energy constraints of such platforms prohibit the use of the very large networks that typically reach the best performance. A common method to reduce the complexity of these architectures without sacrificing accuracy is pruning, in which the least important portions of the network are eliminated. There is a large literature on the subject, but interestingly, few works have measured the actual impact of pruning on energy consumption. In this work, we measure it in the specific context of semantic segmentation for autonomous driving, using the Cityscapes dataset. To this end, we analyze the impact of recently proposed structured pruning methods when the trained architectures are deployed on a Jetson Xavier embedded GPU.
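
The record does not include the authors' code. As a rough illustration of the pipeline the summary describes, the sketch below applies PyTorch's built-in ln_structured channel pruning (a generic stand-in for the structured pruning methods evaluated in the paper, not the authors' method) to a segmentation model and samples board power from a Jetson power-monitor sysfs node during inference. The model choice (DeepLabV3), the POWER_SENSOR_PATH, the pruning ratio, and the input resolution are assumptions made for illustration only.

```python
# Illustrative sketch only -- not the code used in the paper.
# Generic structured (channel-wise) pruning plus power sampling on a Jetson board.
import time

import torch
import torch.nn as nn
import torch.nn.utils.prune as prune
import torchvision

# Hypothetical sysfs node of the Jetson's INA3221 power monitor (milliwatts);
# the exact path differs across Jetson models and JetPack versions.
POWER_SENSOR_PATH = "/sys/bus/i2c/devices/1-0040/iio:device0/in_power0_input"

def read_power_mw() -> float:
    """Read instantaneous board power in milliwatts from sysfs."""
    with open(POWER_SENSOR_PATH) as f:
        return float(f.read().strip())

# Any segmentation network would do; DeepLabV3 with 19 Cityscapes classes is
# used here purely as a placeholder.
model = torchvision.models.segmentation.deeplabv3_resnet50(
    weights=None, num_classes=19
).cuda().eval()

# Structured pruning: zero out 30% of the output channels of every conv layer,
# selected by smallest L2 norm.
for module in model.modules():
    if isinstance(module, nn.Conv2d):
        prune.ln_structured(module, name="weight", amount=0.3, n=2, dim=0)
        prune.remove(module, "weight")  # bake the mask into the weights

# Run inference on Cityscapes-like inputs while sampling power.
x = torch.randn(1, 3, 512, 1024).cuda()  # half-resolution Cityscapes frame
power_samples = []
with torch.no_grad():
    start = time.time()
    for _ in range(50):
        model(x)
        torch.cuda.synchronize()
        power_samples.append(read_power_mw())
    elapsed = time.time() - start

mean_mw = sum(power_samples) / len(power_samples)
print(f"mean power: {mean_mw:.0f} mW, "
      f"energy ~ {mean_mw * elapsed / 1000:.1f} J over {elapsed:.1f} s")
```

Note that mask-based pruning in PyTorch only zeroes channels without shrinking the tensors, so it does not by itself reduce computation; realizing actual latency or energy savings requires removing the pruned channels from the deployed network, which is precisely why measuring energy on the target hardware matters.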
Source Type: Working Papers
Object Type: Working Paper / Pre-Print
ISSN: 2331-8422
DOI: 10.48550/arxiv.2206.06255