Dynamic estimator selection for double‐bit‐range estimation in VVC CABAC entropy coding


Bibliographic Details
Published in IET Image Processing, Vol. 18, No. 3, pp. 722-730
Main Authors: Im, Sio-Kei; Chan, Ka-Hou
Format: Journal Article
Language: English
Published: Wiley, 01.02.2024

Summary: CABAC is the only entropy coding method used in Versatile Video Coding (VVC). It relies on a multiple-estimator approach that produces more accurate predictions by combining different probability estimates, but this makes CABAC coding more complex and more demanding in bit-range accuracy than other approaches. There is therefore room to improve performance from the perspective of bit allocation and architecture design. In this paper, a selection method is proposed to dynamically determine which estimator should perform the current entropy coding. Taking advantage of the double-bit-range architecture, the bits contained in the different estimators are also rearranged based on Most Probable Symbol (MPS) determination and Least Probable Symbol (LPS) considerations. Experiments reported in the work show that the proposed method reduces coding time and yields a slight gain in Peak Signal-to-Noise Ratio (PSNR) while also saving bitrate in terms of Rate-Distortion (RD) performance.
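To make the multiple-estimator idea concrete, the following is a minimal sketch of a VVC-style context model that keeps two probability estimators adapted at different rates and combines them into one estimate, with the MPS derived from the combined probability. This is an illustrative approximation under stated assumptions, not the paper's proposed selection method: the class and function names, the 15-bit fixed-point precision, and the adaptation shifts (4 and 7) are assumptions chosen for clarity.

```python
# Illustrative sketch (assumption): a two-estimator CABAC-style context.
# Each context holds a fast- and a slow-adapting estimate of P(bin == 1);
# the coder uses their average, from which the MPS is determined.

PROB_BITS = 15          # fixed-point precision (assumed)
ONE = 1 << PROB_BITS    # fixed-point representation of probability 1.0

def update(p: int, binval: int, shift: int) -> int:
    """Exponential-decay update of a single estimator toward the observed bin."""
    if binval:
        return p + ((ONE - p) >> shift)
    return p - (p >> shift)

class TwoRateContext:
    def __init__(self, fast_shift: int = 4, slow_shift: int = 7):
        self.p_fast = ONE >> 1   # both estimators start at p = 0.5
        self.p_slow = ONE >> 1
        self.fast_shift = fast_shift
        self.slow_shift = slow_shift

    def prob_one(self) -> int:
        # Combined estimate: rounded average of the two hypotheses.
        return (self.p_fast + self.p_slow + 1) >> 1

    def code_bin(self, binval: int) -> int:
        """Return the current MPS, then adapt both estimators to the bin."""
        mps = 1 if self.prob_one() >= (ONE >> 1) else 0
        self.p_fast = update(self.p_fast, binval, self.fast_shift)
        self.p_slow = update(self.p_slow, binval, self.slow_shift)
        return mps
```

The fast estimator tracks local statistics quickly while the slow one provides stability; a dynamic selection scheme such as the one described in the abstract would choose between (or reweight) such estimators per bin instead of always averaging them.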
ISSN:1751-9659
1751-9667
DOI:10.1049/ipr2.12980