Single-view 3D reconstruction: A survey of deep learning methods
Published in: Computers & Graphics, Vol. 94, pp. 164–190
Main Authors: , ,
Format: Journal Article
Language: English
Published: Oxford: Elsevier Ltd, 01.02.2021
Summary:
• Deep learning substantially improves the reconstruction of 3D shapes from images.
• Voxel grids are the most widely used representation, but not the most efficient.
• The choice of 3D representation is crucial to the success of single-view reconstruction.
• Implicit surfaces are gaining traction in single-view object reconstruction.
The field of single-view 3D shape reconstruction and generation using deep learning techniques has seen rapid growth in the past five years. As the field reaches a stage of maturity, a plethora of methods have been continuously proposed with the aim of pushing the state of research further. This article surveys the literature by classifying these methods according to the shape representation they use as an output. Specifically, it covers each method’s main contributions, degree of supervision, training paradigm, and relation to the whole body of literature. Additionally, this survey discusses common 3D datasets, loss functions, and evaluation metrics used in the field. Finally, it provides a thorough analysis of and reflections on the current state of research, along with a summary of open problems and possible future directions. This work is an effort to introduce the field of data-driven single-view 3D reconstruction to interested researchers, while being comprehensive enough to act as a reference for those already doing research in the field.
ISSN: 0097-8493, 1873-7684
DOI: 10.1016/j.cag.2020.12.004