Machine learning denoising of high-resolution X-ray nanotomography data


Bibliographic Details
Published in: Journal of Synchrotron Radiation, Vol. 29, Part 1, pp. 230-238
Main Authors: Flenner, Silja; Bruns, Stefan; Longo, Elena; Parnell, Andrew J.; Stockhausen, Kilian E.; Müller, Martin; Greving, Imke
Format: Journal Article
Language: English
Published: International Union of Crystallography, 01.01.2022

Summary: A high-performance denoising filter based on machine learning for high-resolution synchrotron nanotomography data is analyzed and evaluated. High-resolution X-ray nanotomography is a quantitative tool for investigating specimens from a wide range of research areas. However, the quality of the reconstructed tomogram is often obscured by noise and therefore not suitable for automatic segmentation. Filtering methods are often required for a detailed quantitative analysis. However, most filters induce blurring in the reconstructed tomograms. Here, machine learning (ML) techniques offer a powerful alternative to conventional filtering methods. In this article, we verify that a self-supervised denoising ML technique can be used in a very efficient way for eliminating noise from nanotomography data. The technique presented is applied to high-resolution nanotomography data and compared to conventional filters, such as a median filter and a nonlocal means filter, optimized for tomographic data sets. The ML approach proves to be a very powerful tool that outperforms conventional filters by eliminating noise without blurring relevant structural features, thus enabling efficient quantitative analysis in different scientific fields.
ISSN: 0909-0495
eISSN: 1600-5775
DOI: 10.1107/S1600577521011139
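The summary contrasts the ML denoiser with conventional filters such as a median filter and a nonlocal means filter. For reference only, the sketch below shows how such baseline filters are typically applied to a single reconstructed tomogram slice in Python; the file name and parameter values are illustrative assumptions, not taken from the article, and the self-supervised ML denoiser itself is not reproduced here.

```python
# Minimal sketch (not the authors' code): conventional baseline filters
# named in the abstract, applied to one noisy reconstructed slice.
import numpy as np
from scipy.ndimage import median_filter
from skimage.restoration import denoise_nl_means, estimate_sigma

# Hypothetical input: a 2D float array holding one reconstructed slice.
slice_noisy = np.load("tomogram_slice.npy").astype(np.float32)

# Median filter: cheap and effective against shot noise, but tends to
# blur fine structural features in high-resolution data.
slice_median = median_filter(slice_noisy, size=3)

# Nonlocal means: estimate the noise level, then average similar patches
# across the slice. Parameter choices here are illustrative.
sigma = estimate_sigma(slice_noisy)
slice_nlm = denoise_nl_means(
    slice_noisy,
    h=0.8 * sigma,        # filtering strength, tied to the noise estimate
    sigma=sigma,
    patch_size=5,
    patch_distance=6,
    fast_mode=True,
)
```

In practice these parameters would be tuned per data set, and the filtered slices would then be compared against the self-supervised ML result, as done in the article.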