Channel-separation-based Network for Object Detection under Foggy Conditions

Bibliographic Details
Published in: 2023 IEEE 18th Conference on Industrial Electronics and Applications (ICIEA), pp. 1004-1010
Main Authors: Liang, Xiaolin; Lai, Tingqin; Luo, Shaobo; Li, Zhengguo; Sun, Shihai
Format: Conference Proceeding
Language: English
Published: IEEE, 18.08.2023

Summary: Vision plays a key role in enhancing environmental awareness in many applications. However, under adverse weather conditions (particularly fog), it is difficult to locate objects in the captured low-quality images. Most existing methods attempt to restore high-quality images from the low-quality ones, which increases system complexity and discards latent information in the images. In this study, a channel-separation-based detection network is proposed to preserve this latent information. In particular, a fog filter performs pruning during image processing so that the latent information of the images is retained. By replacing the deep feature extraction layer with a plug-and-play block (MBConvBlock) and using a new CSPBottleNeck combined with CrossConv in the feature pyramid network, the model overcomes the disadvantages of convolutional neural networks with fixed and global receptive fields and focuses on the most crucial object features. The model was trained jointly, end-to-end, on hybrid data, enhancing its generalization ability. The results indicate that the model achieves outstanding performance under both normal and real-world foggy conditions, with mAPs of 74.40% and 42.10%, respectively.
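
The record does not give the paper's exact layer configurations, so the following is a minimal PyTorch sketch of the two generic building blocks the abstract names: an MBConv-style inverted-residual block and a CSP bottleneck whose inner units are cross convolutions (a k x k convolution factorized into 1 x k followed by k x 1). All hyperparameters here (expansion ratio, kernel sizes, squeeze-excite ratio, activation) are assumptions, not the authors' settings.

# Sketch of MBConv and CSPBottleneck-with-CrossConv blocks.
# Hyperparameters are assumed; the paper's exact configuration is not in this record.
import torch
import torch.nn as nn

class MBConvBlock(nn.Module):
    """Inverted-residual block (MobileNetV2/EfficientNet style):
    1x1 expand -> depthwise 3x3 -> squeeze-excite -> 1x1 project."""
    def __init__(self, c_in, c_out, expand=4, se_ratio=0.25, stride=1):
        super().__init__()
        c_mid = c_in * expand
        self.use_res = stride == 1 and c_in == c_out
        self.expand = nn.Sequential(
            nn.Conv2d(c_in, c_mid, 1, bias=False),
            nn.BatchNorm2d(c_mid), nn.SiLU())
        self.dwconv = nn.Sequential(  # depthwise conv: one filter per channel
            nn.Conv2d(c_mid, c_mid, 3, stride, 1, groups=c_mid, bias=False),
            nn.BatchNorm2d(c_mid), nn.SiLU())
        c_se = max(1, int(c_in * se_ratio))
        self.se = nn.Sequential(  # squeeze-and-excitation channel gate
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(c_mid, c_se, 1), nn.SiLU(),
            nn.Conv2d(c_se, c_mid, 1), nn.Sigmoid())
        self.project = nn.Sequential(
            nn.Conv2d(c_mid, c_out, 1, bias=False),
            nn.BatchNorm2d(c_out))

    def forward(self, x):
        y = self.dwconv(self.expand(x))
        y = y * self.se(y)
        y = self.project(y)
        return x + y if self.use_res else y

class CrossConv(nn.Module):
    """Factorized k x k conv: a 1 x k conv followed by a k x 1 conv."""
    def __init__(self, c_in, c_out, k=3):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(c_in, c_out, (1, k), 1, (0, k // 2), bias=False),
            nn.BatchNorm2d(c_out), nn.SiLU(),
            nn.Conv2d(c_out, c_out, (k, 1), 1, (k // 2, 0), bias=False),
            nn.BatchNorm2d(c_out), nn.SiLU())

    def forward(self, x):
        return self.conv(x)

class CSPBottleneck(nn.Module):
    """Cross-stage-partial bottleneck: split channels into two branches,
    run one through n CrossConv units, concatenate, fuse with a 1x1 conv."""
    def __init__(self, c_in, c_out, n=1):
        super().__init__()
        c_hid = c_out // 2
        self.split1 = nn.Conv2d(c_in, c_hid, 1, bias=False)
        self.split2 = nn.Conv2d(c_in, c_hid, 1, bias=False)
        self.blocks = nn.Sequential(*(CrossConv(c_hid, c_hid) for _ in range(n)))
        self.fuse = nn.Sequential(
            nn.Conv2d(2 * c_hid, c_out, 1, bias=False),
            nn.BatchNorm2d(c_out), nn.SiLU())

    def forward(self, x):
        return self.fuse(torch.cat((self.blocks(self.split1(x)), self.split2(x)), 1))

# Quick shape check on a feature map of an assumed size.
x = torch.randn(1, 64, 80, 80)
print(MBConvBlock(64, 64)(x).shape)     # torch.Size([1, 64, 80, 80])
print(CSPBottleneck(64, 128)(x).shape)  # torch.Size([1, 128, 80, 80])

In this reading, the depthwise stage of MBConv enlarges the effective receptive field at low cost, and the CSP split routes only half of the channels through the heavier CrossConv path, which is consistent with the abstract's stated aim of focusing computation on the most crucial object features.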
ISSN:2158-2297
DOI:10.1109/ICIEA58696.2023.10241489