Propagation graph fusion for multi-modal medical content-based retrieval

Bibliographic Details
Published in: 2014 13th International Conference on Control Automation Robotics & Vision (ICARCV), pp. 849-854
Main Authors: Sidong Liu, Siqi Liu, Sonia Pujol, Ron Kikinis, Dagan Feng, Weidong Cai
Format: Conference Proceeding
Language: English
Published: IEEE, 01.12.2014
Summary: Medical content-based retrieval (MCBR) plays an important role in computer-aided diagnosis and clinical decision support. Multi-modal imaging data have been increasingly used in MCBR, as they can provide more insight into disease and complement the deficiencies of single-modal data. However, fusing data from different modalities is very challenging, since the modalities have different physical underpinnings and large variations in value range. In this study, we propose a novel Propagation Graph Fusion (PGF) framework for multi-modal medical data retrieval. PGF models the subjects' relationships within each modality using directed propagation graphs, and then fuses these graphs into a single graph by summing their edge weights. The proposed PGF method reduces the large inter-modality and inter-subject variations, and can be solved efficiently with the PageRank algorithm. We test the proposed method on a public medical database of 331 subjects, using features extracted from two imaging modalities, PET and MRI. The preliminary results show that PGF enhances multi-modal retrieval and modestly outperforms state-of-the-art single-modal and multi-modal retrieval methods.
DOI:10.1109/ICARCV.2014.7064415
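Below is a minimal sketch of the fusion-and-ranking idea described in the summary: build one directed graph per modality, sum the edge weights to fuse them, and rank subjects on the fused graph with PageRank. The similarity function (Gaussian k-nearest-neighbour weights), the parameter names (n_neighbors, sigma, damping), and the toy random features are illustrative assumptions, not details taken from the paper; actual retrieval would also bias the ranking toward a query subject, which is omitted here.

```python
import numpy as np

def propagation_graph(features, n_neighbors=10, sigma=1.0):
    """Build a directed propagation graph for one modality.

    Each subject points to its strongest neighbours in feature space;
    edge weights are Gaussian similarities (an assumed choice, the
    paper's exact weighting is not given in the abstract).
    """
    n = features.shape[0]
    # pairwise Euclidean distances between subjects
    dist = np.linalg.norm(features[:, None, :] - features[None, :, :], axis=-1)
    sim = np.exp(-(dist ** 2) / (2 * sigma ** 2))
    np.fill_diagonal(sim, 0.0)
    graph = np.zeros((n, n))
    for i in range(n):
        nn = np.argsort(-sim[i])[:n_neighbors]  # top neighbours of subject i
        graph[i, nn] = sim[i, nn]               # directed edges i -> neighbours
    return graph

def fuse_graphs(graphs):
    """Fuse per-modality graphs into one graph by summing edge weights."""
    return np.sum(graphs, axis=0)

def pagerank(graph, damping=0.85, tol=1e-8, max_iter=200):
    """Rank nodes of the fused graph with power-iteration PageRank."""
    n = graph.shape[0]
    out = graph.sum(axis=1, keepdims=True)
    out[out == 0] = 1.0
    p = graph / out                             # row-stochastic transition matrix
    r = np.full(n, 1.0 / n)
    for _ in range(max_iter):
        r_new = (1 - damping) / n + damping * (p.T @ r)
        if np.abs(r_new - r).sum() < tol:
            break
        r = r_new
    return r

# Toy usage: two modalities (e.g. PET- and MRI-derived features) for 331 subjects.
rng = np.random.default_rng(0)
pet = rng.normal(size=(331, 64))
mri = rng.normal(size=(331, 128))
fused = fuse_graphs([propagation_graph(pet), propagation_graph(mri)])
scores = pagerank(fused)
ranking = np.argsort(-scores)                   # subjects ordered by fused relevance
```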