Two Filters for Acquiring the Profiles from Images Obtained from Weak-Light Background, Fluorescence Microscope, Transmission Electron Microscope, and Near-Infrared Camera

Bibliographic Details
Published in: Sensors (Basel, Switzerland), Vol. 23, No. 13, p. 6207
Main Authors: Huang, Yinghui; Yang, Ruoxi; Geng, Xin; Li, Zongan; Wu, Ye
Format: Journal Article
Language: English
Published: Switzerland: MDPI AG, 06.07.2023

Summary: Extracting the profiles of images is critical because it provides a simplified description and draws attention to particular areas of an image. In our work, we designed two filters, based on the exponential and hypotenuse functions, for profile extraction. Their ability to extract profiles from images obtained under weak-light conditions and from fluorescence microscopes, transmission electron microscopes, and near-infrared cameras is demonstrated. They can also be used to extract nested structures in images. Furthermore, their performance on images degraded by Gaussian noise is evaluated: we used Gaussian white noise with a mean value of 0.9 to create very noisy images, and the filters remain effective at extracting the edge morphology of these noisy images. For a comparative study, we processed the same noisy images with several well-known filters, including a Gabor-wavelet-based filter, a watershed-based filter, and a matched filter; their performance in profile extraction on extensively noisy images is either merely comparable to ours or ineffective. Our filters show potential for use in pattern recognition and object tracking.
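The abstract does not give the explicit form of the two filters, so the following is a minimal sketch of the experimental setup it describes, under two assumptions: that the "hypotenuse" filter combines directional derivatives with the Euclidean norm (np.hypot), and that the "exponential" filter applies an exponential weighting to the gradient magnitude. The noise variance and the gain parameter k are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy import ndimage
from skimage import data, util
from skimage.filters import gabor

# Test image degraded with Gaussian white noise. The paper reports a
# noise mean of 0.9; the variance here is an assumed value.
image = util.img_as_float(data.camera())
noisy = util.random_noise(image, mode="gaussian", mean=0.9, var=0.05)

# Directional derivatives (Sobel approximation).
gx = ndimage.sobel(noisy, axis=0)
gy = ndimage.sobel(noisy, axis=1)

# Assumed "hypotenuse" filter: Euclidean gradient magnitude.
profile_hypot = np.hypot(gx, gy)

# Assumed "exponential" filter: exponential weighting of the gradient
# magnitude (k is a hypothetical gain parameter).
k = 2.0
profile_exp = 1.0 - np.exp(-k * profile_hypot)

# One of the paper's comparison baselines: a Gabor-wavelet filter
# (frequency chosen arbitrarily for illustration).
gabor_real, gabor_imag = gabor(noisy, frequency=0.2)
```

Thresholding profile_exp (for example, at its Otsu level) would yield a binary profile for visual comparison; the watershed baseline could be added analogously via skimage.segmentation.watershed.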
ISSN: 1424-8220
DOI: 10.3390/s23136207