Concept Bottleneck Language Models for Protein Design


Bibliographic Details
Published in: arXiv.org
Main Authors: Ismail, Aya Abdelsalam; Oikarinen, Tuomas; Wang, Amy; Adebayo, Julius; Stanton, Samuel; Taylor, Joren; Kleinhenz, Joseph; Goodman, Allen; Corrada Bravo, Héctor; Cho, Kyunghyun; Frey, Nathan C.
Format: Paper
Language: English
Published: Ithaca: Cornell University Library, arXiv.org, 09.11.2024

More Information
Summary: We introduce Concept Bottleneck Protein Language Models (CB-pLM), a generative masked language model with a layer where each neuron corresponds to an interpretable concept. Our architecture offers three key benefits: i) Control: We can intervene on concept values to precisely control the properties of generated proteins, achieving a 3 times larger change in desired concept values compared to baselines. ii) Interpretability: A linear mapping between concept values and predicted tokens allows transparent analysis of the model's decision-making process. iii) Debugging: This transparency facilitates easy debugging of trained models. Our models achieve pre-training perplexity and downstream task performance comparable to traditional masked protein language models, demonstrating that interpretability does not compromise performance. While adaptable to any language model, we focus on masked protein language models due to their importance in drug discovery and the ability to validate our model's capabilities through real-world experiments and expert knowledge. We scale our CB-pLM from 24 million to 3 billion parameters, making them the largest Concept Bottleneck Models trained and the first capable of generative language modeling.
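The control and interpretability properties described in the summary both follow from the bottleneck's linearity: each concept's contribution to the token logits is its value times one row of the decoder matrix. The following minimal numpy sketch (with hypothetical toy weights and dimensions, not the paper's actual architecture or training) illustrates intervening on a single concept neuron and reading off its exact effect on the output:

```python
import numpy as np

rng = np.random.default_rng(0)
d_hidden, n_concepts, vocab = 16, 4, 8

# Hypothetical weights: the encoder projects hidden states to concept values;
# the decoder is the linear concept-to-token map described in the abstract.
W_enc = rng.normal(size=(d_hidden, n_concepts))
W_dec = rng.normal(size=(n_concepts, vocab))

def forward(h, interventions=None):
    """Concept-bottleneck forward pass with optional concept interventions."""
    c = h @ W_enc                      # concept values: one neuron per concept
    if interventions:
        for idx, value in interventions.items():
            c[..., idx] = value        # clamp a concept to a desired value
    logits = c @ W_dec                 # linear, hence interpretable, token map
    return c, logits

h = rng.normal(size=(d_hidden,))
c0, logits0 = forward(h)
c1, logits1 = forward(h, interventions={0: 3.0})

# Because the decoder is linear, the logit shift caused by the intervention
# is exactly (new value - old value) times the decoder row for concept 0.
shift = logits1 - logits0
print(np.allclose(shift, (3.0 - c0[0]) * W_dec[0]))
```

This is the sense in which the model's decision process is transparent: attribution of any token logit to any concept requires no approximation, and steering a property reduces to setting the corresponding concept neuron.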
ISSN:2331-8422