Bagged Tree Based Frame-Wise Beforehand Prediction Approach for HEVC Intra-Coding Unit Partitioning

Bibliographic Details
Published in: Electronics (Basel), Vol. 9, No. 9, p. 1523
Main Authors: Li, Yixiao; Li, Lixiang; Fang, Yuan; Peng, Haipeng; Yang, Yixian
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 01.09.2020
Summary: High Efficiency Video Coding (HEVC) achieves about 50% bit-rate savings compared with its predecessor, the H.264 standard, but its encoding complexity increases dramatically. Because HEVC introduces more flexible partition structures and more optional prediction directions, it relies on a brute-force search for the optimal partitioning result, which is much more time-consuming. This paper therefore proposes a bagged trees based fast approach (BTFA) focused on the coding unit (CU) size decision for HEVC intra-coding. First, several key features of a target CU are extracted and fed to three-output classifiers. Then, to avoid feature-extraction and prediction overhead, the approach is designed frame-wise, and the prediction procedure runs in parallel with the encoding process. Using an adaptive threshold determination algorithm, the approach achieves 42.04% time saving with a negligible 0.92% Bit-Distortion (BD)-rate loss. Furthermore, to compute the optimal thresholds that balance BD-rate loss against complexity reduction, a neural-network-based mathematical fitting is added to BTFA, yielding the advanced bagged trees based fast approach (ABTFA). Experimental results show that ABTFA achieves 47.87% time saving with only 0.96% BD-rate loss, outperforming other state-of-the-art approaches.
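The abstract describes a three-output bagged-trees classifier that decides, per CU, whether to split, not to split, or to fall back to the normal rate-distortion search when uncertain, with thresholds tuned to trade BD-rate loss against time saving. The sketch below is a minimal illustration of that decision structure using scikit-learn; it is not the authors' code, and the features, training labels, and fixed thresholds are assumptions made only for demonstration.

```python
# Minimal sketch of a bagged-trees CU-split classifier with a three-way output.
# Features, labels, and thresholds are placeholders, not the paper's actual ones.
import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier

# Hypothetical per-CU features (e.g., luma variance, mean gradient magnitude,
# neighbouring CU depths); labels come from whether full RDO split the CU.
X_train = np.random.rand(1000, 4)
y_train = np.random.randint(0, 2, 1000)  # 1 = split by full RDO, 0 = not split

# Bagged decision trees, in the spirit of the BTFA classifier.
model = BaggingClassifier(DecisionTreeClassifier(max_depth=8), n_estimators=50)
model.fit(X_train, y_train)

def cu_decision(features, t_low=0.2, t_high=0.8):
    """Three-output decision: skip the split check, force the split, or leave
    the CU to the encoder's full RDO search. t_low/t_high stand in for the
    adaptive thresholds described in the abstract (fixed here for simplicity)."""
    p_split = model.predict_proba(features.reshape(1, -1))[0, 1]
    if p_split >= t_high:
        return "split"       # descend to smaller CUs without testing this size
    if p_split <= t_low:
        return "no-split"    # encode at this size, skip deeper partitioning
    return "undecided"       # fall back to the normal RDO comparison

print(cu_decision(np.random.rand(4)))
```

Because the classifier runs on per-CU features that can be computed before encoding, such a predictor could be evaluated frame-wise and in parallel with the encoder, which is how the paper avoids adding prediction latency to the encoding loop.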
ISSN: 2079-9292
DOI: 10.3390/electronics9091523