Unsupervised Anomaly Detection with Generative Adversarial Networks to Guide Marker Discovery

Bibliographic Details
Published in Information Processing in Medical Imaging, Vol. 10265, pp. 146-157
Main Author Schlegl, Thomas
Format Book Chapter
Language English
Published Switzerland: Springer International Publishing AG, 2017
Series Lecture Notes in Computer Science
Summary: Obtaining models that capture imaging markers relevant for disease progression and treatment monitoring is challenging. Models are typically based on large amounts of data with annotated examples of known markers aiming at automating detection. High annotation effort and the limitation to a vocabulary of known markers limit the power of such approaches. Here, we perform unsupervised learning to identify anomalies in imaging data as candidates for markers. We propose AnoGAN, a deep convolutional generative adversarial network to learn a manifold of normal anatomical variability, accompanying a novel anomaly scoring scheme based on the mapping from image space to a latent space. Applied to new data, the model labels anomalies, and scores image patches indicating their fit into the learned distribution. Results on optical coherence tomography images of the retina demonstrate that the approach correctly identifies anomalous images, such as images containing retinal fluid or hyperreflective foci.
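The sketch below is not taken from the chapter; it is a minimal illustration of the scoring idea outlined in the summary, assuming a GAN already trained on normal data. A query image is mapped back to the latent space by gradient descent on z, and a pixel residual is combined with a discriminator-feature loss into an anomaly score. The names generator, feature_extractor, latent_dim, and lam, as well as all hyperparameter values, are assumptions made for illustration, not the chapter's actual implementation.

import torch

def anomaly_score(query, generator, feature_extractor,
                  latent_dim=100, n_steps=500, lr=0.05, lam=0.1):
    # Hypothetical sketch of latent-space mapping for anomaly scoring:
    # optimize a latent code z so that generator(z) reproduces `query`.
    z = torch.randn(1, latent_dim, requires_grad=True)
    optimizer = torch.optim.Adam([z], lr=lr)
    for _ in range(n_steps):
        optimizer.zero_grad()
        generated = generator(z)                    # candidate image from the "normal" manifold
        residual = (query - generated).abs().sum()  # pixel-wise residual loss
        feat_real = feature_extractor(query)        # assumed hook returning discriminator features
        feat_fake = feature_extractor(generated)
        discrimination = (feat_real - feat_fake).abs().sum()  # feature-matching loss
        loss = (1 - lam) * residual + lam * discrimination    # weighted combination
        loss.backward()
        optimizer.step()
    return loss.item()  # larger values suggest a poorer fit to the learned distribution

In this sketch, images that lie on the learned manifold of normal variability can be reconstructed well and receive low scores, while anomalous regions leave a large residual; input shapes and the feature hook depend entirely on how the generator and discriminator are defined.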
Bibliography:T. Schlegl—This work has received funding from IBM, FWF (I2714-B31), OeNB (15356, 15929), the Austrian Federal Ministry of Science, Research and Economy (CDL OPTIMA).
ISBN:9783319590493
3319590499
ISSN:0302-9743
1611-3349
DOI:10.1007/978-3-319-59050-9_12