A survey of techniques for optimizing transformer inference

Bibliographic Details
Published in: Journal of Systems Architecture, Vol. 144, no. C, p. 102990
Main Authors: Chitty-Venkata, Krishna Teja; Mittal, Sparsh; Emani, Murali; Vishwanath, Venkatram; Somani, Arun K.
Format: Journal Article
Language: English
Published: Netherlands, Elsevier B.V., 01.11.2023

Summary: Recent years have seen a phenomenal rise in the performance and applications of transformer neural networks. The family of transformer networks, including Bidirectional Encoder Representations from Transformers (BERT), the Generative Pretrained Transformer (GPT) and the Vision Transformer (ViT), has shown its effectiveness across the Natural Language Processing (NLP) and Computer Vision (CV) domains. Transformer-based networks such as ChatGPT have touched the lives of the general public. However, the quest for high predictive performance has led to an exponential increase in transformers' memory and compute footprint. Researchers have proposed techniques to optimize transformer inference at all levels of abstraction. This paper presents a comprehensive survey of techniques for optimizing the inference phase of transformer networks. At the algorithmic level, we survey techniques such as knowledge distillation, pruning, quantization, neural architecture search and lightweight network design. We further review hardware-level optimization techniques and the design of novel hardware accelerators for transformers. We summarize quantitative results on the number of parameters/FLOPs and the accuracy of several models/techniques to showcase the tradeoffs they exercise. We also outline future directions in this rapidly evolving field of research. We believe that this survey will educate both novice and seasoned researchers and also spark a plethora of research efforts in this field.
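To make one of the surveyed techniques concrete, the following is a minimal sketch of symmetric per-tensor post-training int8 quantization, a standard approach to shrinking transformer weights; the code is illustrative only and is not taken from the surveyed paper or any particular library.

```python
import numpy as np

def quantize_int8(w: np.ndarray):
    """Map float32 weights to int8 with a single symmetric scale factor."""
    scale = np.abs(w).max() / 127.0          # largest magnitude maps to +/-127
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float32 weights from the int8 codes."""
    return q.astype(np.float32) * scale

# Toy example: quantize a random weight matrix and measure the error.
rng = np.random.default_rng(0)
w = rng.standard_normal((4, 4)).astype(np.float32)
q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)
max_err = float(np.abs(w - w_hat).max())     # bounded by scale / 2 (rounding)
```

In practice, per-channel scales, asymmetric zero points, and quantization-aware training reduce the accuracy loss further; the survey covers these variants at the algorithmic level.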
Bibliography: USDOE; DEAC02-06CH11357
ISSN: 1383-7621, 1873-6165
DOI: 10.1016/j.sysarc.2023.102990