CellMemory: hierarchical interpretation of out-of-distribution cells using bottlenecked transformer

Bibliographic Details
Published in: Genome Biology, Vol. 26, No. 1, pp. 178-37
Main Authors: Wang, Qifei; Zhu, He; Hu, Yiwen; Chen, Yanjie; Wang, Yuwei; Li, Guochao; Li, Yun; Chen, Jinfeng; Zhang, Xuegong; Zou, James; Kellis, Manolis; Li, Yue; Liu, Dianbo; Jiang, Lan
Format: Journal Article
Language: English
Published: England: BioMed Central (BMC), 23.06.2025
More Information
Summary: Machine learning methods, especially Transformer architectures, have been widely employed in single-cell omics studies. However, interpretability and accurate representation of out-of-distribution (OOD) cells remain challenging. Inspired by the global workspace theory in cognitive neuroscience, we introduce CellMemory, a bottlenecked Transformer with improved generalizability designed for the hierarchical interpretation of OOD cells. Without pre-training, CellMemory outperforms existing single-cell foundation models and accurately deciphers spatial transcriptomics at high resolution. Leveraging its robust representations, we further elucidate malignant cells and their founder cells across patients, providing reliable characterizations of the cellular changes caused by the disease.
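
The summary describes a bottlenecked Transformer inspired by global workspace theory, in which a small shared memory mediates information exchange among input tokens. As a rough illustrative sketch only (not the CellMemory implementation), the PyTorch snippet below shows one way such a write/read bottleneck over learned memory slots could look; the class name, slot count, and all hyperparameters are assumptions chosen for the example.

# Illustrative sketch only: a minimal "bottlenecked" attention block in PyTorch,
# loosely following the global-workspace idea described in the abstract
# (a small set of learned memory slots mediates information flow between tokens).
# All names and hyperparameters here are assumptions, not the authors' code.
import torch
import torch.nn as nn


class BottleneckedAttentionBlock(nn.Module):
    def __init__(self, d_model: int = 64, n_slots: int = 8, n_heads: int = 4):
        super().__init__()
        # Learned memory slots acting as the shared "workspace" bottleneck.
        self.slots = nn.Parameter(torch.randn(n_slots, d_model) * 0.02)
        # Write phase: slots attend to the input tokens (e.g., gene embeddings).
        self.write_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Read phase: tokens attend back to the updated slots.
        self.read_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm_tokens = nn.LayerNorm(d_model)
        self.norm_slots = nn.LayerNorm(d_model)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: (batch, n_tokens, d_model)
        batch = tokens.size(0)
        slots = self.slots.unsqueeze(0).expand(batch, -1, -1)
        # Write: compress the token set into the small slot bottleneck.
        slots, _ = self.write_attn(self.norm_slots(slots), tokens, tokens)
        # Read: broadcast the workspace content back to every token.
        update, _ = self.read_attn(self.norm_tokens(tokens), slots, slots)
        return tokens + update  # residual connection


if __name__ == "__main__":
    x = torch.randn(2, 100, 64)  # 2 cells, 100 gene tokens, 64-dim embeddings
    block = BottleneckedAttentionBlock()
    print(block(x).shape)  # torch.Size([2, 100, 64])

The small number of slots forces the layer to compress each cell's token set before broadcasting it back, which is the bottleneck property the summary highlights.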
ISSN: 1474-7596, 1474-760X
DOI: 10.1186/s13059-025-03638-y