Restoration of dehaze and defog image using novel cross entropy-based deep learning neural network
| Published in | Multimedia Tools and Applications, Vol. 83, No. 20, pp. 58573-58606 |
|---|---|
| Main Authors | |
| Format | Journal Article |
| Language | English |
| Published | New York: Springer US (Springer Nature B.V.), 23.12.2023 |
| Subjects | |
Summary: Computer vision applications require high-quality images that carry a large amount of information. Images captured in poor weather conditions such as fog or haze suffer from reduced visibility, which interferes with downstream processing. Fog/haze removal is a technique that can improve image quality and restore the true details of objects. Current image dehazing methods, which focus primarily on enhancing the overall contrast of the image, fail to deliver quality results because the light-source distribution is not specified or the cost functions are not suited to the mathematical constraints. This paper proposes an efficient restoration approach based on a Cross Entropy-Based Deep Learning Neural Network (CE-DLNN). Initially, the input images are pre-processed using the Intensity-based Dynamic Fuzzy Histogram Equalization (IDFHE) method. The Dark Channel Prior (DCP) of the enhanced image is then estimated, along with the corresponding transmission map. Next, the important features are extracted and applied as input to the CE-DLNN, which classifies the images as clear or hazy/foggy. From the classified output, the hazy/foggy images are passed to Modified Structure Transference Filtering (MST) to reconstruct the dehazed/defogged images. Finally, the effectiveness of the proposed system is demonstrated. The proposed method achieved a classification accuracy of 97.08% and attained better results than existing methods.
| ISSN | 1380-7501; 1573-7721 |
|---|---|
| DOI | 10.1007/s11042-023-17835-z |
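
The summary above mentions estimating the Dark Channel Prior (DCP) of the enhanced image together with the corresponding transmission map. The record gives no implementation details, so the following is only a minimal sketch of the standard DCP formulation (a per-patch minimum over the colour channels, with the transmission computed as t = 1 - omega * dark_channel(I / A)); the function names, patch size, and omega value are illustrative assumptions, not the paper's settings.

```python
# Minimal sketch of the standard dark channel prior / transmission map
# estimation; NOT the paper's exact procedure or parameter choices.
import numpy as np
import cv2

def dark_channel(img, patch=15):
    """Per-pixel minimum over the RGB channels, then a minimum filter over a local patch."""
    min_rgb = np.min(img, axis=2)
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (patch, patch))
    return cv2.erode(min_rgb, kernel)

def estimate_atmospheric_light(img, dark, top_fraction=0.001):
    """Average the image pixels corresponding to the brightest dark-channel values."""
    n = max(1, int(dark.size * top_fraction))
    idx = np.argsort(dark.ravel())[-n:]
    return img.reshape(-1, 3)[idx].mean(axis=0)

def transmission_map(img, A, omega=0.95, patch=15):
    """t(x) = 1 - omega * dark_channel(I(x) / A)."""
    normalized = img / np.maximum(A, 1e-6)
    return 1.0 - omega * dark_channel(normalized, patch)

# Usage: img is a float32 RGB image scaled to [0, 1].
# img = cv2.cvtColor(cv2.imread("hazy.png"), cv2.COLOR_BGR2RGB).astype(np.float32) / 255.0
# dark = dark_channel(img)
# A = estimate_atmospheric_light(img, dark)
# t = transmission_map(img, A)
```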
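
The CE-DLNN stage is driven by a cross-entropy objective to separate clear images from hazy/foggy ones, but the record does not describe the network architecture or the extracted features. The sketch below is a hypothetical minimal classifier (the HazeClassifier layers, input size, and optimiser settings are all assumptions) shown only to illustrate how a cross-entropy loss trains such a binary haze/clear classification, not the authors' model.

```python
# Hypothetical sketch only: a small CNN trained with cross-entropy loss to
# label images as clear (0) or hazy/foggy (1). The actual CE-DLNN design
# and its input features are not described in this record.
import torch
import torch.nn as nn

class HazeClassifier(nn.Module):
    def __init__(self, num_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

model = HazeClassifier()
criterion = nn.CrossEntropyLoss()              # the cross-entropy objective
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# One illustrative training step on a dummy batch.
images = torch.randn(8, 3, 64, 64)             # batch of RGB images
labels = torch.randint(0, 2, (8,))             # 0 = clear, 1 = hazy/foggy
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
```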