Hierarchical graph representations in digital pathology
Published in: Medical Image Analysis, Vol. 75, p. 102264
Format: Journal Article
Language: English
Published: Elsevier B.V., Netherlands, 01.01.2022
Summary:
- Hierarchical Cell-to-Tissue (HACT) representation: A novel multi-level hierarchical entity-graph representation of a histology image to model the hierarchical composition of the tissue by encoding comprehensible histological entities (cells and tissue regions) as well as the intra- and inter-entity-level interactions.
- HACT-Net: A hierarchical graph neural network that operates on the hierarchical entity-graph representation to map tissue structure to tissue functionality.
- BReAst Carcinoma Subtyping (BRACS) dataset: Introduction of the BRACS dataset, a large cohort of Haematoxylin & Eosin-stained breast tumor regions-of-interest.
- Domain expert comparison: Benchmarking of the proposed methodology against three expert pathologists on the BRACS test set.
- Quantitative evaluation: Experiments on the BRACS dataset and the public BACH dataset to demonstrate the efficacy of the proposed methodology in breast cancer subtyping compared to state-of-the-art computer-aided diagnostic approaches.
- Qualitative evaluation: Demonstration of salient regions in the histopathology image during inference with HACT-Net.
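The HACT representation summarized above pairs a low-level cell graph with a high-level tissue-region graph, linked by an assignment of each cell to its enclosing region. As a rough illustration only (plain Python with hypothetical names, not the authors' released code), such a two-level structure might be sketched as:

```python
import math
from dataclasses import dataclass

@dataclass
class HACTGraph:
    """Toy two-level hierarchical entity graph: a cell graph, a tissue
    graph, and an assignment mapping each cell to its enclosing region."""
    cell_feats: list        # per-cell feature vectors
    cell_edges: list        # (i, j) cell-graph edges
    tissue_feats: list      # per-region feature vectors
    tissue_edges: list      # (a, b) tissue-graph edges
    cell_to_tissue: list    # cell_to_tissue[i] = region index of cell i

def knn_edges(points, k=2):
    """Symmetric k-nearest-neighbour edges over 2-D nuclei centroids."""
    edges = set()
    for i, p in enumerate(points):
        by_dist = sorted((math.dist(p, q), j)
                         for j, q in enumerate(points) if j != i)
        for _, j in by_dist[:k]:
            edges.add((min(i, j), max(i, j)))
    return sorted(edges)

# Four cells in two spatial clusters, mirrored by two tissue regions.
centroids = [(0.0, 0.0), (1.0, 0.0), (10.0, 0.0), (11.0, 0.0)]
hact = HACTGraph(
    cell_feats=[[0.1], [0.2], [0.9], [0.8]],
    cell_edges=knn_edges(centroids, k=1),
    tissue_feats=[[0.15], [0.85]],
    tissue_edges=[(0, 1)],
    cell_to_tissue=[0, 0, 1, 1],
)
```

In the paper, the cell and tissue entities come from nuclei detection and region segmentation; here the features and assignments are fabricated purely to show the shape of the data structure.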
Cancer diagnosis, prognosis, and therapy response predictions from tissue specimens highly depend on the phenotype and topological distribution of the constituting histological entities. Thus, adequate tissue representations for encoding histological entities are imperative for computer-aided cancer patient care. To this end, several approaches have leveraged cell-graphs, capturing the cell microenvironment, to depict the tissue. These allow for utilizing graph theory and machine learning to map the tissue representation to tissue functionality, and to quantify their relationship. Though cellular information is crucial, it alone is insufficient to comprehensively characterize complex tissue structure. We herein treat the tissue as a hierarchical composition of multiple types of histological entities from fine to coarse level, capturing multivariate tissue information at multiple levels. We propose a novel multi-level hierarchical entity-graph representation of tissue specimens to model the hierarchical compositions that encode histological entities as well as their intra- and inter-entity-level interactions. Subsequently, a hierarchical graph neural network is proposed to operate on the hierarchical entity-graph and map the tissue structure to tissue functionality. Specifically, for input histology images, we utilize well-defined cells and tissue regions to build HierArchical Cell-to-Tissue (HACT) graph representations, and devise HACT-Net, a message-passing graph neural network, to classify the HACT representations. As part of this work, we introduce the BReAst Carcinoma Subtyping (BRACS) dataset, a large cohort of Haematoxylin & Eosin-stained breast tumor regions-of-interest, to evaluate and benchmark our proposed methodology against pathologists and state-of-the-art computer-aided diagnostic approaches.
Through comparative assessment and ablation studies, our proposed method is demonstrated to yield superior classification results compared to alternative methods as well as individual pathologists. The code, data, and models can be accessed at https://github.com/histocartography/hact-net.
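The abstract describes HACT-Net as a message-passing network over the two-level graph: cell nodes exchange messages along the cell graph, their updated representations are pooled into the enclosing tissue nodes, and the tissue graph then propagates further. A minimal scalar-feature sketch of that scheme in plain Python (mean aggregation, hypothetical helper names; not the released implementation) could look like:

```python
def propagate(feats, edges):
    """One round of mean-aggregation message passing on an undirected graph."""
    n = len(feats)
    neigh = [[] for _ in range(n)]
    for i, j in edges:
        neigh[i].append(j)
        neigh[j].append(i)
    out = []
    for i in range(n):
        msgs = [feats[j] for j in neigh[i]] + [feats[i]]  # include self-loop
        out.append(sum(msgs) / len(msgs))
    return out

def cell_to_tissue_pool(cell_feats, assignment, n_regions):
    """Average each region's member-cell features into its tissue node."""
    sums, counts = [0.0] * n_regions, [0] * n_regions
    for i, r in enumerate(assignment):
        sums[r] += cell_feats[i]
        counts[r] += 1
    return [s / c if c else 0.0 for s, c in zip(sums, counts)]

# Cell-level pass, then pooling into two tissue regions; a further
# propagate() round over the tissue edges would complete one HACT layer.
cell_h = propagate([1.0, 3.0, 5.0, 7.0], [(0, 1), (2, 3)])
tissue_h = cell_to_tissue_pool(cell_h, [0, 0, 1, 1], 2)
```

In the actual model, the aggregation and update steps are learned neural functions and the final tissue-level readout feeds a classifier; the sketch only shows the order of operations across hierarchy levels.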
ISSN: 1361-8415
eISSN: 1361-8423
DOI: 10.1016/j.media.2021.102264