PARALLEL DECODING USING TRANSFORMER MODELS
Main Authors |
---|---
Format | Patent
Language | English
Published | 12.03.2020
Subjects |
Summary | Methods, systems, and apparatus, including computer programs encoded on computer storage media, for performing parallel generation of output from an autoregressive sequence-to-sequence model. In one aspect, a blockwise parallel decoding method takes advantage of the fact that some architectures can score sequences in sublinear time. By generating predictions for multiple time steps at once and then backing off to the longest prefix validated by the scoring model, the methods can substantially improve the speed of greedy decoding without compromising performance.
Bibliography | Application Number: US201916682611
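
The summary above describes the blockwise parallel decoding scheme only at a high level. The Python sketch below illustrates the predict/verify/accept loop it implies, under stated assumptions: the callables `propose` and `score_next`, the `block_size` and `eos` parameters, and the toy counting "model" are hypothetical illustrations, not part of the patent record. The sequential verification loop here stands in for the single batched scoring pass that a Transformer-style model would use to validate all block positions in parallel.

```python
# Minimal sketch of blockwise parallel (greedy) decoding.
# Assumed, hypothetical interfaces (not from the patent text):
#   propose(prefix, k)  -> list of k greedily guessed future tokens
#   score_next(prefix)  -> the single token the base model would choose next
# In practice both would come from the same model, and the k verification
# calls would be batched into one parallel scoring pass.

def blockwise_parallel_decode(prefix, propose, score_next,
                              block_size, max_len, eos=None):
    """Greedy decoding that keeps the longest verified prefix of each block."""
    output = list(prefix)
    while len(output) < max_len:
        # Predict: guess the next `block_size` tokens in one shot.
        block = propose(output, block_size)
        # Verify: accept guesses only while they match the base model's greedy choice.
        accepted = 0
        for token in block:
            if score_next(output) != token:
                break  # back off at the first mismatch
            output.append(token)
            accepted += 1
            if token == eos or len(output) >= max_len:
                return output
        # Accept: if every guess was rejected, still emit one verified token
        # so decoding always advances by at least one position per iteration.
        if accepted == 0:
            output.append(score_next(output))
            if output[-1] == eos:
                return output
    return output


if __name__ == "__main__":
    # Toy demonstration with a deterministic "model" that counts upward.
    def score_next(prefix):
        return len(prefix) + 1  # base model's greedy token

    def propose(prefix, k):
        # Imperfect proposer: occasionally guesses wrong, forcing a back-off.
        return [len(prefix) + i + (1 if (len(prefix) + i) % 7 == 0 else 0)
                for i in range(1, k + 1)]

    print(blockwise_parallel_decode([], propose, score_next,
                                    block_size=4, max_len=20))
    # -> [1, 2, ..., 20], produced in fewer iterations than token-by-token decoding
```

In the toy run, blocks whose guesses all verify advance the output by several tokens per iteration, while mismatched blocks fall back to a single base-model token, which is where the speedup over strict token-by-token greedy decoding comes from.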