Multi-Scale Cost Attention and Adaptive Fusion Stereo Matching Network

Bibliographic Details
Published in: Electronics (Basel), Vol. 12, No. 7, p. 1594
Main Authors: Liu, Zhenguo; Li, Zhao; Ao, Wengang; Zhang, Shaoshuang; Liu, Wenlong; He, Yizhi
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 01.04.2023

Summary: In convolution-based stereo matching, 2D convolution is less computationally expensive and faster than 3D convolution. However, the initial cost volume produced by a 2D-convolution correlation layer carries less information than one built with 3D convolution, so illumination-affected regions of the disparity map are less robust, which reduces accuracy. To address this lack of information in the 2D-convolution cost volume, this paper proposes a multi-scale adaptive cost attention and adaptive fusion stereo matching network (MCAFNet) based on AANet+. First, the extracted features are used to compute the initial cost volume, which is fed into a multi-scale adaptive cost attention module to generate attention weights; these weights are combined with the initial cost volume to suppress irrelevant information and enrich the volume. Second, the cost aggregation stage is improved: a multi-scale adaptive fusion module raises the efficiency of cross-scale cost aggregation. On the Scene Flow dataset, the end-point error (EPE) is reduced to 0.66; the error matching rates on the KITTI 2012 and KITTI 2015 datasets are 1.60% and 2.22%, respectively.
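The summary only sketches the architecture, but its three core ideas (a correlation-based initial cost volume, attention reweighting of that volume, and cross-scale fusion during aggregation) can be illustrated in a minimal PyTorch sketch. Everything below is an assumption, not the authors' MCAFNet code: the names (correlation_cost_volume, CostAttention, AdaptiveFusion), the layer choices, and the simplification that every scale shares one disparity-channel count are all hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def correlation_cost_volume(feat_left, feat_right, max_disp):
    # 2D-convolution-style (correlation) cost volume: for each candidate
    # disparity d, correlate left features with right features shifted by d.
    # Output shape: [B, max_disp, H, W] -- one channel per disparity.
    b, c, h, w = feat_left.shape
    cost = feat_left.new_zeros(b, max_disp, h, w)
    for d in range(max_disp):
        if d == 0:
            cost[:, d] = (feat_left * feat_right).mean(dim=1)
        else:
            cost[:, d, :, d:] = (feat_left[:, :, :, d:] *
                                 feat_right[:, :, :, :-d]).mean(dim=1)
    return cost

class CostAttention(nn.Module):
    # Hypothetical stand-in for the paper's multi-scale adaptive cost
    # attention module: predict per-element weights from the initial cost
    # volume, then reweight it to suppress irrelevant responses.
    def __init__(self, max_disp):
        super().__init__()
        self.attn = nn.Sequential(
            nn.Conv2d(max_disp, max_disp, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(max_disp, max_disp, 3, padding=1),
            nn.Sigmoid(),
        )
    def forward(self, cost):
        return cost * self.attn(cost)  # attention-weighted cost volume

class AdaptiveFusion(nn.Module):
    # Hypothetical cross-scale fusion: upsample coarser cost volumes to the
    # finest resolution and blend all scales with learned softmax weights.
    def __init__(self, num_scales):
        super().__init__()
        self.logits = nn.Parameter(torch.zeros(num_scales))
    def forward(self, costs):
        target = costs[0].shape[-2:]
        up = [F.interpolate(c, size=target, mode="bilinear",
                            align_corners=False) for c in costs]
        w = torch.softmax(self.logits, dim=0)
        return sum(wi * ci for wi, ci in zip(w, up))

# Toy usage: two scales derived from one cost volume for illustration.
feat_l, feat_r = torch.randn(1, 32, 64, 128), torch.randn(1, 32, 64, 128)
cost = CostAttention(48)(correlation_cost_volume(feat_l, feat_r, 48))
fused = AdaptiveFusion(num_scales=2)([cost, F.avg_pool2d(cost, 2)])
```

Note that real multi-scale pipelines typically scale the disparity range with resolution; the fixed 48-channel assumption here only keeps the sketch short and runnable.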
ISSN: 2079-9292
DOI: 10.3390/electronics12071594