Hybrid Model-Based / Data-Driven Graph Transform for Image Coding
| Published in | Proceedings - International Conference on Image Processing, pp. 3667-3671 |
|---|---|
| Main Authors | |
| Format | Conference Proceeding |
| Language | English |
| Published | IEEE, 16.10.2022 |
| Subjects | |
| ISSN | 2381-8549 |
| DOI | 10.1109/ICIP46576.2022.9897653 |
Summary: Transform coding to sparsify signal representations remains crucial in an image compression pipeline. While the Karhunen-Loève transform (KLT) computed from an empirical covariance matrix $\bar{\mathbf{C}}$ is theoretically optimal for a stationary process, in practice, collecting sufficient statistics from a non-stationary image to reliably estimate $\bar{\mathbf{C}}$ can be difficult. In this paper, to encode an intra-prediction residual block, we pursue a hybrid model-based / data-driven approach: the first K eigenvectors of a transform matrix are derived from a statistical model, e.g., the asymmetric discrete sine transform (ADST), for stability, while the remaining N − K are computed from $\bar{\mathbf{C}}$ for data adaptivity. The transform computation is posed as a graph learning problem, where we seek a graph Laplacian matrix minimizing a graphical lasso objective inside a convex cone sharing the first K eigenvectors in a Hilbert space of real symmetric matrices. We efficiently solve the problem via augmented Lagrangian relaxation and proximal gradient (PG). Using open-source WebP as a baseline image codec, experimental results show that our hybrid graph transform achieves better coding performance than the discrete cosine transform (DCT), ADST, and KLT, and better stability than KLT.
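The hybrid construction described in the summary can be illustrated with a simplified sketch. The snippet below fixes the first K basis vectors to ADST (DST-VII-type) vectors and fills the remaining N − K with eigenvectors of the empirical covariance restricted to the orthogonal complement. Note this eigen-completion is a stand-in for intuition only: the paper itself obtains the data-driven part by learning a graph Laplacian via a graphical lasso objective solved with augmented Lagrangian relaxation and proximal gradient, which this sketch does not implement. The function names `adst_matrix` and `hybrid_transform` are illustrative, not from the paper.

```python
import numpy as np

def adst_matrix(N):
    """Orthonormal ADST (DST-VII-type) analysis matrix; rows are basis vectors.
    Entry (i, j) = sqrt(4/(2N+1)) * sin(pi*(2i+1)*(j+1)/(2N+1))."""
    i = np.arange(N).reshape(-1, 1)   # basis-vector (frequency) index
    j = np.arange(N).reshape(1, -1)   # sample index
    return np.sqrt(4.0 / (2 * N + 1)) * np.sin(
        np.pi * (2 * i + 1) * (j + 1) / (2 * N + 1))

def hybrid_transform(C_bar, K):
    """Hybrid model-based / data-driven transform (simplified sketch).
    First K columns: ADST vectors (model-based, stable).
    Remaining N-K columns: eigenvectors of C_bar restricted to the
    orthogonal complement of the ADST part (data-driven, adaptive).
    NOTE: the paper instead learns these via GLASSO-based graph learning."""
    N = C_bar.shape[0]
    U = adst_matrix(N)[:K].T                  # N x K model-based part
    # Orthonormal basis B of the complement of span(U), via QR of [U | I].
    Q, _ = np.linalg.qr(np.hstack([U, np.eye(N)]))
    B = Q[:, K:]                              # N x (N-K)
    # Diagonalize the covariance restricted to the complement subspace.
    w, E = np.linalg.eigh(B.T @ C_bar @ B)    # ascending eigenvalues
    V = B @ E[:, ::-1]                        # descending-energy order
    return np.hstack([U, V])                  # N x N orthonormal transform
```

By construction the result is orthonormal, its first K columns are exactly the ADST vectors regardless of the data, and the trailing (N − K) × (N − K) block of the transformed covariance is diagonal.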