CNN-based ternary tree partition approach for VVC intra-QTMT coding
Published in | Signal, Image and Video Processing, Vol. 18, No. 4, pp. 3587-3594 |
Format | Journal Article |
Language | English |
Published | London: Springer London, 01.06.2024 (Springer Nature B.V.) |
Summary: | In July 2020, the Joint Video Experts Team published the versatile video coding (VVC) standard. The VVC encoder improves coding efficiency over its predecessor, the high-efficiency video coding (HEVC) encoder, thanks to improved coding modules and newly proposed techniques such as the block partitioning structure called quadtree with nested multi-type tree (QTMT). However, QTMT significantly increases encoding time, mainly at the rate-distortion optimization (RDO) stage, which incurs an enormous computational cost. Instead of the RDO-based QTMT partition process, a deep-QTMT partition approach based on a fast convolutional neural network for ternary tree prediction (CNN-TT) is proposed to predict the best intra-QTMT decision tree and thereby reduce encoding time. A database containing CU-based TT partition depths across several video contents is first established. Then, a CNN-TT model is developed over the three levels provided by the TT structure to determine the QTMT partition early at 32 × 32. Different threshold values are fixed for each level according to the CNN-TT predicted probabilities to reach a balance between encoding complexity and coding efficiency. The experimental results show that the deep-QTMT partition approach saves between 23% and 58% of encoder time on average with acceptable RD performance. |
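The threshold mechanism the summary describes — accepting the CNN's split decision only when its predicted probability is confident enough, and falling back to full RDO otherwise — can be sketched as follows. This is a minimal illustrative sketch, not the authors' code: the function name, the dictionary interface, and the per-level threshold values are all assumptions for illustration only.

```python
# Hypothetical sketch of confidence-thresholded early TT-split decisions.
# Per-level thresholds are illustrative placeholders, not values from the paper.
THRESHOLDS = {1: 0.9, 2: 0.8, 3: 0.7}

def decide_tt_split(probabilities):
    """Map {level: P(split)} from a CNN for a 32x32 CU to per-level decisions:
    'split' / 'no_split' when the model is confident (skipping RDO),
    or 'rdo' to fall back to the full rate-distortion search."""
    decisions = {}
    for level, p in probabilities.items():
        t = THRESHOLDS[level]
        if p >= t:
            decisions[level] = "split"      # confident split: skip RDO check
        elif p <= 1.0 - t:
            decisions[level] = "no_split"   # confident non-split: skip RDO check
        else:
            decisions[level] = "rdo"        # uncertain: run RDO as usual
    return decisions
```

Tighter thresholds skip RDO less often (less time saved, smaller RD loss); looser ones save more time at a higher risk of suboptimal partitions — the balance the paper tunes per TT level.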
ISSN | 1863-1703, 1863-1711 |
DOI | 10.1007/s11760-024-03023-5 |