A direct comparison of high-speed methods for the numerical Abel transform

Bibliographic Details
Published in: Review of Scientific Instruments, Vol. 90, No. 6, pp. 065115–65123
Main Authors: Hickstein, Daniel D.; Gibson, Stephen T.; Yurchak, Roman; Das, Dhrubajyoti D.; Ryazanov, Mikhail
Format: Journal Article
Language: English
Published: United States: American Institute of Physics, 01.06.2019
Summary: The Abel transform is a mathematical operation that transforms a cylindrically symmetric three-dimensional (3D) object into its two-dimensional (2D) projection. The inverse Abel transform reconstructs the 3D object from the 2D projection. Abel transforms have wide application across numerous fields of science, especially chemical physics, astronomy, and the study of laser-plasma plumes. Consequently, many numerical methods for the Abel transform have been developed, which makes it challenging to select the ideal method for a specific application. In this work, eight published transform methods have been incorporated into a single, open-source Python software package (PyAbel) to provide a direct comparison of the capabilities, advantages, and relative computational efficiency of each transform method. Most of the tested methods provide similar, high-quality results. However, the computational efficiency varies across several orders of magnitude. By optimizing the algorithms, we find that some transform methods are sufficiently fast to transform 1-megapixel images at more than 100 frames per second on a desktop personal computer. In addition, we demonstrate the transform of gigapixel images.
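For context, the forward Abel transform of a radial profile f(r) is F(y) = 2 ∫_y^∞ f(r) r dr / √(r² − y²). The sketch below is a minimal NumPy illustration of this definition, not one of the eight optimized PyAbel methods; it assumes the substitution r² = y² + s², which removes the integrable singularity at r = y, and checks the result against the known analytic transform of a Gaussian.

```python
import numpy as np

def forward_abel(func, y, s_max=10.0, n=4000):
    """Minimal forward Abel transform of a radial profile func(r):
        F(y) = 2 * int_y^inf func(r) r dr / sqrt(r^2 - y^2).
    Substituting r^2 = y^2 + s^2 turns this into the smooth integral
        F(y) = 2 * int_0^inf func(sqrt(y^2 + s^2)) ds,
    which plain trapezoidal quadrature handles well.
    """
    s = np.linspace(0.0, s_max, n)                   # integration grid in s
    y = np.atleast_1d(np.asarray(y, dtype=float))
    r = np.sqrt(y[:, None] ** 2 + s[None, :] ** 2)   # r on a (len(y), n) grid
    vals = func(r)
    # trapezooidal rule along the s axis, written out for NumPy-version safety
    return ((vals[:, 1:] + vals[:, :-1]) * np.diff(s)).sum(axis=1)

# Gaussian test: the forward Abel transform of exp(-r^2) is sqrt(pi)*exp(-y^2)
y = np.linspace(0.0, 3.0, 50)
F = forward_abel(lambda r: np.exp(-r ** 2), y)
F_exact = np.sqrt(np.pi) * np.exp(-y ** 2)
```

Note the factor of 2 in the definition cancels against the 1/2 of the trapezoidal rule, which is why it does not appear explicitly in the return line. A direct quadrature like this scales poorly with image size, which is precisely why the specialized methods compared in the paper exist.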
ISSN: 0034-6748, 1089-7623
DOI: 10.1063/1.5092635