Reconstructing interpretable features in computational super-resolution microscopy via regularized latent search


Bibliographic Details
Published in: Biological Imaging (Cambridge, England), Vol. 4, pp. e8–20
Main Authors: Gheisari, Marzieh; Genovesio, Auguste
Format: Journal Article
Language: English
Published: England: Cambridge University Press, 2024

More Information
Summary: Supervised deep learning approaches can artificially increase the resolution of microscopy images by learning a mapping between two image resolutions or modalities. However, such methods often require a large set of hard-to-obtain low-resolution/high-resolution image pairs and produce synthetic images with only a moderate increase in resolution. Conversely, recent methods based on generative adversarial network (GAN) latent search offer a drastic increase in resolution without the need for paired images, but they provide limited reconstruction of the interpretable features of the high-resolution (HR) image. Here, we propose a robust super-resolution (SR) method based on regularized latent search (RLS) that offers an actionable balance between fidelity to the ground truth (GT) and realism of the recovered image given a distribution prior. The latter allows the analysis of a low-resolution (LR) image to be split into a computational SR task performed by deep learning, followed by a quantification task performed by a handcrafted algorithm based on interpretable biological features. This two-step process holds potential for applications such as diagnostics on mobile devices, where the main aim is not to recover the HR details of a specific sample but rather to obtain HR images that preserve explainable and quantifiable differences between conditions.
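The idea of latent search with a distribution prior can be sketched in miniature: a fixed random linear map stands in for the GAN generator, a pixel-averaging operator simulates the LR measurement, and gradient descent on the latent code minimizes a fidelity term plus a Gaussian-prior regularizer. All names, dimensions, and the quadratic prior below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Toy sketch of regularized latent search (RLS), not the paper's code.
# G is a stand-in "generator"; A is a 4x pixel-averaging operator that
# simulates the low-resolution (LR) measurement of the HR image G @ z.
rng = np.random.default_rng(0)
hr_dim, lr_dim, latent_dim = 16, 4, 8
G = rng.standard_normal((hr_dim, latent_dim))              # toy "generator"
A = np.kron(np.eye(lr_dim), np.full((1, hr_dim // lr_dim), lr_dim / hr_dim))

z_true = rng.standard_normal(latent_dim)
y = A @ (G @ z_true)                                       # observed LR image

lam, step = 0.1, 0.02                                      # prior weight, step size

def loss(z):
    # Fidelity to the LR observation plus a Gaussian-prior regularizer:
    # the prior term keeps z in a high-density region of the latent
    # distribution, trading a little fidelity for realism of G @ z.
    return np.sum((A @ (G @ z) - y) ** 2) + lam * np.sum(z ** 2)

z = np.zeros(latent_dim)
losses = [loss(z)]
for _ in range(500):
    grad = 2 * G.T @ (A.T @ (A @ (G @ z) - y)) + 2 * lam * z
    z -= step * grad
    losses.append(loss(z))

hr_estimate = G @ z                                        # super-resolved image
print(losses[0], "->", losses[-1])
```

In the paper's setting the linear map is replaced by a pretrained GAN generator and the prior term constrains the latent code to the generator's training distribution, which is what keeps the reconstructed features interpretable.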
ISSN: 2633-903X
DOI: 10.1017/S2633903X24000084