Motion and disparity vectors early determination for texture video in 3D-HEVC
Published in | Multimedia Tools and Applications, Vol. 79, no. 7-8, pp. 4297-4314
---|---
Main Authors | , ,
Format | Journal Article
Language | English
Published | New York: Springer US, 01.02.2020 (Springer Nature B.V)
Summary: | 3D-HEVC is the state-of-the-art video coding standard for 3D video and an extension of the High Efficiency Video Coding (HEVC) standard. Besides the original HEVC coding tools, 3D-HEVC adopts several advanced coding tools, such as disparity vector (DV) derivation, inter-view prediction, and inter-component prediction. However, these advanced tools also lead to extremely high encoding complexity, so 3D-HEVC cannot readily be applied in real-time multimedia systems. In this paper, we propose a motion and disparity vector early determination algorithm to reduce the computational complexity of 3D-HEVC. First, based on statistical analyses, the spatial and temporal motion vector (MV) candidates are adaptively reduced for prediction units (PUs) coded with the Merge mode. Then, for PUs coded with the Inter mode, the combination of spatial and temporal candidates is used to determine the final MV early. Finally, an adaptive optimization algorithm selects the valid inter-view DV candidates. Moreover, if the difference between candidate vectors falls within a conditional range, the current PU is encoded with the Merge mode to skip unnecessary coding processes. Experimental results show that, for texture view encoding, the proposed algorithm achieves an average encoding time saving of 33.03% with an average BD-rate increase of 0.47%. |
---|---|
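The summary's final early-termination step can be sketched as follows. This is a minimal illustration of the general idea only, not the paper's actual method: the candidate representation, the spread measure, and the threshold value are all assumptions introduced here for clarity.

```python
# Hedged sketch of an early Merge-mode decision of the kind the abstract
# describes: if all candidate motion/disparity vectors for a PU are nearly
# identical, the Merge candidate is likely a good predictor, so the costly
# Inter-mode motion search can be skipped. Candidates are (x, y) vector
# tuples; the threshold is an illustrative assumption, not a paper value.

def vector_spread(candidates):
    """Maximum component-wise spread among the candidate vectors."""
    xs = [v[0] for v in candidates]
    ys = [v[1] for v in candidates]
    return max(max(xs) - min(xs), max(ys) - min(ys))

def early_merge_decision(candidates, threshold=1):
    """Return True if the PU can be coded with Merge mode only,
    skipping the full Inter-mode search."""
    if len(candidates) < 2:
        # A single candidate gives no agreement evidence; run the full search.
        return False
    return vector_spread(candidates) <= threshold
```

For example, candidates `[(4, 2), (4, 2), (5, 2)]` agree to within one sample, so the sketch would select Merge mode, while widely spread candidates such as `[(0, 0), (8, 3)]` would fall through to the normal mode decision.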
ISSN: | 1380-7501 1573-7721 |
DOI: | 10.1007/s11042-018-6830-7 |