Perceptual losses for self-supervised depth estimation
| Published in | Journal of Physics: Conference Series, Vol. 1952, No. 2, pp. 22040-22047 |
|---|---|
| Main Authors | , , |
| Format | Journal Article |
| Language | English |
| Published | Bristol: IOP Publishing, 01.06.2021 |
Summary: Convolutional neural networks have shown excellent results in stereo and monocular disparity estimation. Most existing methods convert the image depth prediction problem into an image reconstruction problem and compute the depth of each pixel from the disparity between the generated left and right images. However, in the reconstruction task the loss is still computed at the pixel level when the reconstructed image is compared with the original, which significantly degrades the depth estimate under illumination changes and occlusion. It is therefore important, when computing the image reconstruction loss, to compare higher-level features extracted from the reconstructed image and the original image. In this paper, building on existing methods, we modify the loss function and introduce a perceptual loss: a feedforward neural network extracts features that are used to further evaluate the reconstructed image, making the reconstruction loss of the baseline more accurate and improving the accuracy and robustness of the depth prediction model. To assess the improvement, we performed extensive experiments on the KITTI driving dataset with the improved model, and the experimental metrics outperform those of the original baseline model.
ISSN: 1742-6588, 1742-6596
DOI: 10.1088/1742-6596/1952/2/022040
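The abstract describes replacing a purely pixel-level reconstruction loss with a perceptual (feature-reconstruction) loss computed by a fixed feedforward feature extractor. Below is a minimal sketch of such a loss, assuming a PyTorch/torchvision setup with a frozen VGG16 backbone; the choice of layer, the L1 comparison, and the weighting in the usage note are illustrative assumptions, not the paper's exact configuration.

```python
# Minimal perceptual-loss sketch (assumes PyTorch and torchvision >= 0.13).
# The VGG16 layer cutoff and L1 feature comparison are illustrative choices,
# not necessarily those used in the paper.
import torch
import torch.nn as nn
from torchvision import models


class PerceptualLoss(nn.Module):
    """Compare images in the feature space of a frozen VGG16 network
    rather than in raw pixel space."""

    def __init__(self, feature_layer: int = 16):
        super().__init__()
        # Keep the convolutional trunk up to an intermediate layer (index 16 ~ relu3_3).
        vgg = models.vgg16(weights=models.VGG16_Weights.DEFAULT).features[:feature_layer]
        for p in vgg.parameters():
            p.requires_grad = False  # the feature extractor stays fixed during training
        self.vgg = vgg.eval()
        self.criterion = nn.L1Loss()

    def forward(self, reconstructed: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        # Both inputs are (N, 3, H, W) images with identical normalization.
        return self.criterion(self.vgg(reconstructed), self.vgg(target))


# Hypothetical usage inside a self-supervised training loop, where the
# reconstructed view comes from warping with the predicted disparity:
# total_loss = photometric_loss + lambda_p * PerceptualLoss()(reconstructed_left, left_image)
```

In this kind of pipeline the perceptual term is typically added to, rather than substituted for, the existing photometric reconstruction loss, so that feature-level agreement compensates for pixel-level mismatches caused by illumination changes and occlusion.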