Transformer-based personalized attention mechanism for medical images with clinical records

Bibliographic Details
Published in: Journal of Pathology Informatics, Vol. 14, p. 100185
Main Authors: Takagi, Yusuke; Hashimoto, Noriaki; Masuda, Hiroki; Miyoshi, Hiroaki; Ohshima, Koichi; Hontani, Hidekata; Takeuchi, Ichiro
Format: Journal Article
Language: English
Published: United States, Elsevier Inc., 01.01.2023
Summary: In medical image diagnosis, identifying the attention region, i.e., the region of interest for which the diagnosis is made, is an important task. Various methods have been developed to automatically identify target regions from given medical images. However, in actual medical practice, the diagnosis is made based on both the images and various clinical records. Consequently, pathologists examine medical images with prior knowledge of the patients, and the attention regions may change depending on the clinical records. In this study, we propose a method, called the Personalized Attention Mechanism (PersAM) method, by which the attention regions in medical images are adaptively determined according to the clinical records. The primary idea underlying the PersAM method is the encoding of the relationships between medical images and clinical records using a variant of the Transformer architecture. To demonstrate the effectiveness of the PersAM method, we applied it to a large-scale digital pathology problem of identifying the subtypes of 842 malignant lymphoma patients based on their gigapixel whole-slide images and clinical records.
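The abstract only outlines the architecture, so the following is a minimal, hedged sketch of one way a record-conditioned ("personalized") attention over whole-slide-image patches could be wired up in PyTorch. The class name, dimensions, projections, and the single-query cross-attention design are all illustrative assumptions, not the authors' PersAM implementation; the only idea taken from the abstract is that attention over image regions is conditioned on the clinical record.

```python
import torch
import torch.nn as nn

class PersonalizedAttentionSketch(nn.Module):
    """Illustrative sketch (not the authors' code): a clinical-record
    embedding queries image-patch embeddings, so the attention map over
    patches depends on the patient's clinical record."""

    def __init__(self, img_dim=512, rec_dim=64, embed_dim=256,
                 num_heads=4, num_classes=3):
        super().__init__()
        self.patch_proj = nn.Linear(img_dim, embed_dim)   # project tile features
        self.record_proj = nn.Linear(rec_dim, embed_dim)  # project clinical record
        self.attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        self.classifier = nn.Linear(embed_dim, num_classes)

    def forward(self, patch_feats, record_feats):
        # patch_feats:  (B, N, img_dim) features of N tiles from a whole-slide image
        # record_feats: (B, rec_dim) encoded clinical record for the same patient
        patches = self.patch_proj(patch_feats)               # (B, N, embed_dim)
        query = self.record_proj(record_feats).unsqueeze(1)  # (B, 1, embed_dim)
        # Record-conditioned attention over patches; the weights are the
        # "personalized" attention regions.
        pooled, attn_weights = self.attn(query, patches, patches)
        logits = self.classifier(pooled.squeeze(1))          # subtype prediction
        return logits, attn_weights.squeeze(1)               # (B, C), (B, N)

# Example usage with random stand-in features:
model = PersonalizedAttentionSketch()
patches = torch.randn(2, 100, 512)   # 2 slides, 100 tiles each
records = torch.randn(2, 64)         # 2 encoded clinical records
logits, attn = model(patches, records)
```

The returned per-patch attention weights could then be mapped back onto the slide to visualize which regions the record-conditioned model attended to, which is the kind of output the abstract describes.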
ISSN: 2153-3539, 2229-5089
DOI: 10.1016/j.jpi.2022.100185