Parametric and nonparametric residual vector quantization optimizations for ANN search

Bibliographic Details
Published in: Neurocomputing (Amsterdam), Vol. 217, pp. 92-102
Main Authors: Guo, Dan; Li, Chuanqing; Wu, Lv
Format: Journal Article
Language: English
Published: Elsevier B.V., 12.12.2016
Summary: For approximate nearest neighbor (ANN) search in many vision-based applications, vector quantization (VQ) is an efficient compact encoding technique. A representative VQ approach is product quantization (PQ), which quantizes subspaces separately via a Cartesian product and achieves high accuracy, but its space decomposition still introduces quantization distortion. This paper presents two optimized solutions based on residual vector quantization (RVQ). Unlike PQ, RVQ approximates the quantization error with multi-stage quantizers instead of decomposing the space. To further optimize the codebooks and the space decomposition, we seek a more discriminative space projection, expressed as an orthonormal matrix R. The nonparametric solution alternately optimizes R and the stage codebooks by Singular Value Decomposition (SVD) over multiple iterations. The parametric solution assumes the data follow a Gaussian distribution and uses Eigenvalue Allocation to obtain each stage matrix {Rl} (1 ≤ l ≤ L) in a single pass, where L is the number of RVQ stages. Compared with various optimized PQ-based methods, our methods are markedly better at reducing quantization error.
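The RVQ scheme the summary describes can be sketched as follows. This is a minimal illustration, not the authors' implementation: the helper names (`train_rvq`, `encode_rvq`, `decode_rvq`, `procrustes_rotation`) are hypothetical, the stage codebooks are trained here with plain k-means on successive residuals, and the SVD step shows only the generic orthogonal-Procrustes update for an orthonormal matrix R, one piece of the alternating optimization.

```python
import numpy as np

def train_rvq(X, num_stages=3, K=16, iters=10, seed=0):
    """Train stage codebooks by running k-means on successive residuals.

    Each stage quantizes what the previous stages failed to explain,
    which is the core idea of residual vector quantization (RVQ).
    """
    rng = np.random.default_rng(seed)
    residual = X.astype(float).copy()
    codebooks = []
    for _ in range(num_stages):
        # initialize centroids from random residual vectors
        C = residual[rng.choice(len(residual), K, replace=False)].copy()
        for _ in range(iters):
            dists = ((residual[:, None, :] - C[None, :, :]) ** 2).sum(-1)
            assign = dists.argmin(axis=1)
            for k in range(K):
                members = residual[assign == k]
                if len(members):
                    C[k] = members.mean(axis=0)
        codebooks.append(C)
        residual = residual - C[assign]  # pass residuals to the next stage
    return codebooks

def encode_rvq(x, codebooks):
    """Greedily pick the nearest codeword at each stage; one index per stage."""
    codes, r = [], x.astype(float).copy()
    for C in codebooks:
        k = int(((C - r) ** 2).sum(axis=1).argmin())
        codes.append(k)
        r = r - C[k]
    return codes

def decode_rvq(codes, codebooks):
    """Reconstruct by summing the selected codewords across stages."""
    return sum(C[k] for k, C in zip(codes, codebooks))

def procrustes_rotation(X, X_hat):
    """One SVD step of an alternating scheme: the orthonormal R minimizing
    ||X R - X_hat||_F (orthogonal Procrustes)."""
    U, _, Vt = np.linalg.svd(X.T @ X_hat)
    return U @ Vt
```

Unlike PQ, which stores one sub-code per subspace, this encoder stores one code per stage and each stage refines the whole vector, which is why RVQ can keep shrinking the residual error as L grows.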
ISSN: 0925-2312, 1872-8286
DOI: 10.1016/j.neucom.2016.04.061