Efficient and High-Quality Monocular Depth Estimation via Gated Multi-Scale Network
Published in: IEEE Access, Vol. 8, pp. 7709-7718
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2020
Summary: The key issue in monocular depth estimation is how to better construct the depth image and improve the quality of the depth map. At present, most deep-learning-based monocular depth estimation methods process images at low resolution, which leads to loss of detail and blurred boundaries. Moreover, deep networks with large numbers of parameters incur high computational complexity, which makes it difficult to apply high-resolution (HR) images to depth estimation. In this work, model accuracy and runtime are the two key factors considered. To improve depth-map quality and reduce the network's running time, we introduce super-resolution techniques as the up-sampling method, generating high-quality depth images at a faster rate for the depth estimation network. A novel approach is proposed for collecting high-level features captured under different receptive fields. The gated multi-scale decoder effectively filters information through its gated module. By combining the gated module with super-resolution of depth images, our method reduces memory consumption while improving reconstruction quality. Experimental results on the challenging NYU Depth v2 dataset demonstrate that both contributions provide significant performance gains over the state of the art in self-supervised depth estimation.
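The two mechanisms the abstract describes, gating to fuse multi-scale features and super-resolution-style up-sampling, can be sketched roughly as follows. This is a minimal NumPy illustration, not the authors' implementation: the element-wise sigmoid gate stands in for a learned 1x1 gating convolution, and sub-pixel (pixel-shuffle) rearrangement is one common super-resolution up-sampling choice assumed here for concreteness.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_fusion(low, high):
    """Fuse two feature maps of equal shape (C, H, W) via an element-wise gate.

    Illustrative stand-in: the gate here is computed directly from the sum of
    the inputs; in a real network it would come from a learned convolution.
    The gate decides, per element, how much high-level vs. low-level
    information passes through -- the "filtering" role of a gated module.
    """
    gate = sigmoid(low + high)
    return gate * high + (1.0 - gate) * low

def pixel_shuffle(x, r):
    """Sub-pixel up-sampling: rearrange (C*r*r, H, W) into (C, H*r, W*r).

    This trades channels for spatial resolution, so the network can produce
    a higher-resolution depth map without expensive deconvolution layers.
    """
    c_rr, h, w = x.shape
    c = c_rr // (r * r)
    x = x.reshape(c, r, r, h, w)        # split channels into (c, i, j)
    x = x.transpose(0, 3, 1, 4, 2)      # interleave: (c, h, i, w, j)
    return x.reshape(c, h * r, w * r)

# Example: 4 channels of 2x2 features become 1 channel of 4x4 output.
feats = np.arange(16, dtype=float).reshape(4, 2, 2)
up = pixel_shuffle(feats, 2)
print(up.shape)  # (1, 4, 4)
```

In this sketch the decoder would apply `gated_fusion` at each scale to merge encoder skip features with decoder features, then `pixel_shuffle` to grow spatial resolution toward the final high-resolution depth map.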
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2020.2964733