Code Prediction by Feeding Trees to Transformers

Bibliographic Details
Published in: Proceedings / International Conference on Software Engineering, pp. 150-162
Main Authors: Kim, Seohyun; Zhao, Jinman; Tian, Yuchi; Chandra, Satish
Format: Conference Proceeding
Language: English
Published: IEEE, 01.05.2021

Summary: Code prediction, more specifically autocomplete, has become an essential feature in modern IDEs. Autocomplete is more effective when the desired next token is at (or close to) the top of the list of potential completions offered by the IDE at the cursor position. This is where the strength of the underlying machine learning system that produces a ranked order of potential completions comes into play. We advance the state of the art in the accuracy of code prediction (next-token prediction) used in autocomplete systems. Our work uses Transformers as the base neural architecture. We show that by making the Transformer architecture aware of the syntactic structure of code, we increase the margin by which a Transformer-based system outperforms previous systems; with this, it outperforms the accuracy of several state-of-the-art next-token prediction systems by margins ranging from 14% to 18%. We present in the paper several ways of communicating the code structure to the Transformer, which is fundamentally built for processing sequence data. We provide a comprehensive experimental evaluation of our proposal, along with alternative design choices, on a standard Python dataset as well as on a Facebook-internal Python corpus. Our code and data preparation pipeline will be made available as open source.
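
The abstract does not spell out how tree structure reaches a sequence model. As one illustration only (a minimal sketch under assumptions, not the authors' released pipeline; the function name preorder_tokens and the token scheme are invented here), a program's AST can be linearized by a pre-order traversal, so that internal node types and leaf values form the flat token sequence a standard Transformer consumes:

    # Illustrative sketch: linearize a Python AST into a pre-order
    # token sequence. Internal nodes contribute their node type
    # (e.g. 'Call'); leaves contribute their value ('foo', '1').
    # This is one plausible encoding, not the paper's exact scheme.
    import ast

    def preorder_tokens(source: str) -> list[str]:
        tokens = []

        def visit(node: ast.AST) -> None:
            tokens.append(type(node).__name__)   # internal node type
            for _, value in ast.iter_fields(node):
                if isinstance(value, ast.AST):
                    visit(value)
                elif isinstance(value, list):
                    for item in value:
                        if isinstance(item, ast.AST):
                            visit(item)
                elif value is not None:
                    tokens.append(str(value))    # leaf value

        visit(ast.parse(source))
        return tokens

    print(preorder_tokens("x = foo(1)"))
    # ['Module', 'Assign', 'Name', 'x', 'Store', 'Call',
    #  'Name', 'foo', 'Load', 'Constant', '1']

Because the traversal order is deterministic, syntactic context (e.g. that 'foo' is the callee of a Call inside an Assign) is recoverable from the sequence itself, which is how a sequence-only architecture can be made structure-aware.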
ISBN: 1665402962, 9781665402965
ISSN: 1558-1225
DOI: 10.1109/ICSE43902.2021.00026