MolGPT: Molecular Generation Using a Transformer-Decoder Model
Published in: Journal of Chemical Information and Modeling, Vol. 62, No. 9, pp. 2064–2076
Format: Journal Article
Language: English
Published: American Chemical Society, United States, 09.05.2022
Summary: The application of deep learning techniques to the generation of molecules, termed inverse molecular design, has been gaining enormous traction in drug design. The representation of molecules in SMILES notation as a string of characters enables the use of state-of-the-art natural language processing models, such as Transformers, for molecular design in general. Inspired by generative pre-training (GPT) models that have been shown to be successful in generating meaningful text, we train a transformer-decoder on the next-token prediction task using masked self-attention for the generation of druglike molecules in this study. We show that our model, MolGPT, performs on par with other previously proposed modern machine learning frameworks for molecular generation in terms of generating valid, unique, and novel molecules. Furthermore, we demonstrate that the model can be trained conditionally to control multiple properties of the generated molecules. We also show that the model can be used to generate molecules with desired scaffolds as well as desired molecular properties by conditioning the generation on scaffold SMILES strings and property values. Using saliency maps, we highlight the interpretability of the model's generative process.
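The core mechanism the abstract describes, next-token prediction over SMILES strings with masked (causal) self-attention, can be illustrated with a minimal NumPy sketch. This is not the authors' code: the vocabulary, embedding size, and random weights below are illustrative assumptions, and a real MolGPT-style model stacks many such attention layers with learned parameters.

```python
import numpy as np

# Toy character vocabulary and an example SMILES string (acetic acid).
# MolGPT's actual tokenizer and model dimensions are not shown here.
vocab = {ch: i for i, ch in enumerate("C()=O1cn")}
smiles = "CC(=O)O"
tokens = np.array([vocab[ch] for ch in smiles])

# Next-token prediction: the input is the sequence shifted left,
# the target is the sequence shifted right by one position.
x, y = tokens[:-1], tokens[1:]

d = 8  # illustrative embedding size
rng = np.random.default_rng(0)
E = rng.normal(size=(len(vocab), d))            # token embeddings
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))

h = E[x]                                        # (T, d)
Q, K, V = h @ Wq, h @ Wk, h @ Wv
scores = Q @ K.T / np.sqrt(d)                   # (T, T) attention scores

# Causal mask: position t may attend only to positions <= t,
# so generation never peeks at future tokens.
T = len(x)
future = np.triu(np.ones((T, T), dtype=bool), k=1)
scores[future] = -np.inf

# Softmax over the unmasked positions in each row.
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)
out = weights @ V                               # (T, d) causal mixtures
```

In a full model, `out` would feed further layers and a final projection to vocabulary logits, trained so that position t predicts `y[t]`; sampling from those logits one token at a time yields new SMILES strings.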
ISSN: 1549-9596; eISSN: 1549-960X
DOI: 10.1021/acs.jcim.1c00600