US-GAN: Ultrasound Image-Specific Feature Decomposition for Fine Texture Transfer

Bibliographic Details
Published in: IEEE Access, Vol. 12, pp. 72860–72870
Main Authors: Kim, Seongho; Song, Byung Cheol
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2024
More Information
Summary: Ultrasound images acquired with different measuring devices may exhibit different styles, and each style may be specialized for diagnosing specific diseases. Accordingly, ultrasound image-to-image translation (US I2I) has become an essential research field. However, directly applying conventional I2I techniques to US I2I is difficult because they deform content and fail to accurately translate fine textures. To solve these problems, this paper proposes a novel feature decomposition scheme specialized for US I2I. The proposed feature decomposition explicitly separates texture and content information in latent space. The fine textures of the US image are then translated effectively by translating only the texture features, while changes to the original content are minimized by reusing the content features. In addition to the feature decomposition scheme, we present a contrastive loss designed for content preservation. Specifically, the contrastive loss maximizes the content-preservation effect because it performs preferential query selection, which allows regions containing organ structures to be chosen as queries (i.e., anchors). The proposed US image-specific learning scheme yields qualitatively superior results, and the effectiveness of each component is verified experimentally with various quantitative metrics.
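
The summary only sketches the decomposition at a high level. Below is a minimal PyTorch sketch of one plausible reading: a shared encoder with separate content and texture heads, a translator acting on the texture code alone, and AdaIN-style modulation of the reused content features. All module names, dimensions, and the modulation mechanism are illustrative assumptions, not the authors' published architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DecompositionTranslator(nn.Module):
    """Sketch of latent feature decomposition for I2I: content features
    are reused unchanged; only the texture code passes through the
    translation branch. (Hypothetical architecture, not US-GAN itself.)"""

    def __init__(self, in_ch=1, feat_ch=64, tex_dim=8):
        super().__init__()
        # Shared encoder producing a spatial feature map.
        self.encoder = nn.Sequential(
            nn.Conv2d(in_ch, feat_ch, 7, padding=3), nn.ReLU(inplace=True),
            nn.Conv2d(feat_ch, feat_ch, 3, stride=2, padding=1), nn.ReLU(inplace=True),
        )
        # Content head: spatial map preserving organ structure.
        self.content_head = nn.Conv2d(feat_ch, feat_ch, 3, padding=1)
        # Texture head: global vector summarizing speckle/texture statistics.
        self.texture_head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(feat_ch, tex_dim)
        )
        # Translator acts on the texture code only; content is untouched.
        self.texture_translator = nn.Sequential(
            nn.Linear(tex_dim, tex_dim), nn.ReLU(inplace=True),
            nn.Linear(tex_dim, tex_dim),
        )
        # Texture code modulates the decoder via per-channel scale/shift.
        self.affine = nn.Linear(tex_dim, 2 * feat_ch)
        self.decoder = nn.Sequential(
            nn.Upsample(scale_factor=2, mode="nearest"),
            nn.Conv2d(feat_ch, feat_ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(feat_ch, in_ch, 7, padding=3), nn.Tanh(),
        )

    def forward(self, x):
        h = self.encoder(x)
        content = self.content_head(h)           # reused as-is
        texture = self.texture_head(h)           # translated
        tex_t = self.texture_translator(texture)
        scale, shift = self.affine(tex_t).chunk(2, dim=1)
        # Instance-normalize content, then re-style with translated texture.
        normed = F.instance_norm(content)
        styled = normed * (1 + scale[:, :, None, None]) + shift[:, :, None, None]
        return self.decoder(styled)
```

Separating a global texture code from a spatial content map in this way makes "translate only the texture features" a literal operation on one tensor, which is the property the summary attributes to the proposed decomposition.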
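Likewise, the content-preserving contrastive loss with preferential query selection could resemble a PatchNCE-style objective in which query locations are chosen by a structure score rather than sampled at random. The sketch below uses source-feature energy as a stand-in for "regions containing organ structures"; that proxy criterion, the function name, and all parameters are assumptions, not the paper's actual loss.

```python
import torch
import torch.nn.functional as F

def structure_aware_patchnce(feat_src, feat_out, num_queries=64, tau=0.07):
    """PatchNCE-style contrastive loss with preferential query selection.
    feat_src, feat_out: (B, C, H, W) features of the source and translated
    images from a shared feature extractor. Locations with the highest
    source-feature energy serve as queries (a hypothetical structure proxy).
    """
    B, C, H, W = feat_src.shape
    src = feat_src.flatten(2).transpose(1, 2)      # (B, H*W, C)
    out = feat_out.flatten(2).transpose(1, 2)      # (B, H*W, C)

    # Score each location by feature energy; take the top-k as queries.
    scores = src.norm(dim=2)                       # (B, H*W)
    idx = scores.topk(num_queries, dim=1).indices  # (B, K)
    gather_idx = idx.unsqueeze(2).expand(-1, -1, C)

    q = F.normalize(out.gather(1, gather_idx), dim=2)  # queries (anchors)
    k = F.normalize(src.gather(1, gather_idx), dim=2)  # keys

    # Positive: same location in the source; negatives: the other queries.
    logits = torch.bmm(q, k.transpose(1, 2)) / tau     # (B, K, K)
    labels = torch.arange(num_queries, device=q.device).expand(B, -1)
    return F.cross_entropy(logits.reshape(-1, num_queries), labels.reshape(-1))
```

Selecting anchors where structure is strongest concentrates the contrastive signal on anatomically meaningful regions, which is the content-preservation rationale the summary describes.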
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2024.3404071