Medical Image Synthesis for Data Augmentation and Anonymization Using Generative Adversarial Networks

Bibliographic Details
Published in: Simulation and Synthesis in Medical Imaging, Vol. 11037, pp. 1-11
Main Authors: Shin, Hoo-Chang; Tenenholtz, Neil A.; Rogers, Jameson K.; Schwarz, Christopher G.; Senjem, Matthew L.; Gunter, Jeffrey L.; Andriole, Katherine P.; Michalski, Mark
Format: Book Chapter
Language: English
Published: Switzerland: Springer International Publishing AG, 01.01.2018
Series: Lecture Notes in Computer Science

Summary: Data diversity is critical to success when training deep learning models. Medical imaging data sets are often imbalanced, as pathologic findings are generally rare, which introduces significant challenges when training deep learning models. In this work, we propose a method to generate synthetic abnormal MRI images with brain tumors by training a generative adversarial network using two publicly available data sets of brain MRI. We demonstrate two unique benefits that the synthetic images provide. First, we illustrate improved performance on tumor segmentation by leveraging the synthetic images as a form of data augmentation. Second, we demonstrate the value of generative models as an anonymization tool, achieving comparable tumor segmentation results when trained on the synthetic data versus when trained on real subject data. Together, these results offer a potential solution to two of the largest challenges facing machine learning in medical imaging: the small incidence of pathological findings, and the restrictions around sharing of patient data.
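The augmentation strategy the summary describes, enlarging a segmentation training pool with GAN-generated image/mask pairs, can be sketched as follows. This is a minimal illustrative sketch, not the authors' code: the function name, the `(image, mask)` pair representation, and the mixing ratio are all assumptions made here for illustration.

```python
import random

def augment_with_synthetic(real_pairs, synthetic_pairs, synth_ratio=0.5, seed=0):
    """Mix real (image, mask) pairs with GAN-synthesized pairs.

    real_pairs / synthetic_pairs: lists of (image, segmentation_mask) items.
    synth_ratio: fraction of the real set's size to add from the synthetic
    pool. All names and the ratio are illustrative assumptions.
    """
    rng = random.Random(seed)
    n_synth = int(len(real_pairs) * synth_ratio)
    # Sample synthetic pairs without replacement, clamped to the pool size.
    sampled = rng.sample(synthetic_pairs, min(n_synth, len(synthetic_pairs)))
    mixed = list(real_pairs) + sampled
    rng.shuffle(mixed)  # avoid ordering effects during training
    return mixed

# Toy usage: strings stand in for MRI volumes and tumor masks.
real = [(f"real_img_{i}", f"real_mask_{i}") for i in range(4)]
synth = [(f"synth_img_{i}", f"synth_mask_{i}") for i in range(4)]
train_set = augment_with_synthetic(real, synth, synth_ratio=0.5)
print(len(train_set))  # 4 real + 2 synthetic = 6
```

The anonymization use case described in the summary follows the same shape, except the real pairs are withheld entirely and the model is trained on the synthetic pool alone.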
ISBN: 3030005356; 9783030005351
ISSN: 0302-9743; 1611-3349
DOI: 10.1007/978-3-030-00536-8_1