PARALLEL DECODING USING TRANSFORMER MODELS


Bibliographic Details
Main Authors: Shazeer, Noam M.; Stern, Mitchell Thomas; Uszkoreit, Jakob D.
Format: Patent
Language: English
Published: 12.03.2020


Summary: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for performing parallel generation of output from an autoregressive sequence-to-sequence model. In one aspect, a blockwise parallel decoding method takes advantage of the fact that some architectures can score sequences in sublinear time. By generating predictions for multiple time steps at once and then backing off to the longest prefix validated by the scoring model, the methods can substantially improve the speed of greedy decoding without compromising performance.
Bibliography: Application Number: US201916682611
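The decoding scheme in the summary can be sketched in miniature. The sketch below is a toy illustration, not the patent's implementation: `greedy_next` and `propose_block` are hypothetical stand-ins for the scoring model's greedy step and the multi-step proposal model, and the speedup itself (scoring all block positions in one model call) is not modeled.

```python
def greedy_next(prefix):
    # Toy "scoring model": a deterministic function of the prefix,
    # standing in for an argmax over model logits.
    return (sum(prefix) + len(prefix)) % 7

def propose_block(prefix, k):
    # Toy proposal model: predicts the next k tokens in one shot.
    # Here it agrees with greedy decoding on the first two positions
    # and then diverges, to exercise the back-off logic.
    block, p = [], list(prefix)
    for i in range(k):
        tok = greedy_next(p) if i < 2 else 0
        block.append(tok)
        p.append(tok)
    return block

def blockwise_decode(prefix, k, max_len):
    out = list(prefix)
    while len(out) < max_len:
        block = propose_block(out, k)
        p = list(out)
        for tok in block:
            correct = greedy_next(p)
            p.append(correct)   # always keep the verified token
            if correct != tok:
                break           # back off at the first mismatch
        out = p
    return out[:max_len]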