Robust Sparse Hashing
Published in: 2012 19th IEEE International Conference on Image Processing, pp. 2417-2420
Main Authors: , ,
Format: Conference Proceeding
Language: English
Published: IEEE, 01.09.2012
Summary: We study Nearest Neighbors (NN) retrieval by introducing a new approach: Robust Sparse Hashing (RSH). Our approach is inspired by the success of dictionary learning for sparse coding; the key innovation is to use learned sparse codes as hashcodes for speeding up NN retrieval. Sparse coding, however, suffers from a major drawback: when data are noisy or uncertain, an exact hashcode match for a query point seldom happens, breaking NN retrieval. We tackle this difficulty with a novel dictionary learning and sparse coding framework, RSH, which learns dictionaries on robustified counterparts of the uncertain data points. The algorithm is applied to NN retrieval for Scale Invariant Feature Transform (SIFT) descriptors. The results demonstrate that RSH is noise tolerant while showing promising NN performance compared with the state of the art.
ISBN: 1467325341, 9781467325349
ISSN: 1522-4880
DOI: 10.1109/ICIP.2012.6467385
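
The summary's core idea of using learned sparse codes as hashcodes can be illustrated with a minimal sketch. This is not the paper's RSH algorithm: the dictionary below is random rather than learned, the sparse solver is a plain Orthogonal Matching Pursuit, and there is no robustification of the data; the function names, sparsity level, and bucketing scheme are illustrative assumptions.

```python
# Sketch: hash each vector by the support (set of active atoms) of its sparse code,
# then retrieve NN candidates only from the bucket sharing that hashcode.
# Not the authors' RSH: random dictionary, plain OMP, no robustified learning.
import numpy as np

def omp(D, x, k):
    """Greedy Orthogonal Matching Pursuit: select k atoms of D that best explain x."""
    residual = x.copy()
    support = []
    for _ in range(k):
        # Atom most correlated with the current residual.
        j = int(np.argmax(np.abs(D.T @ residual)))
        if j not in support:
            support.append(j)
        # Re-fit x on the selected atoms and update the residual.
        coeffs, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        residual = x - D[:, support] @ coeffs
    return tuple(sorted(support))  # the support set serves as the hashcode

rng = np.random.default_rng(0)
d, K, k, n = 32, 64, 3, 2000           # dim, atoms, sparsity, database size (assumed)
D = rng.standard_normal((d, K))
D /= np.linalg.norm(D, axis=0)         # unit-norm dictionary atoms

database = rng.standard_normal((n, d))
buckets = {}                           # hashcode -> list of database indices
for i, x in enumerate(database):
    buckets.setdefault(omp(D, x, k), []).append(i)

# Query: hash it the same way, then rank only the candidates in its bucket.
query = database[42] + 0.01 * rng.standard_normal(d)   # a slightly perturbed point
code = omp(D, query, k)
candidates = buckets.get(code, [])
if candidates:
    best = min(candidates, key=lambda i: np.linalg.norm(database[i] - query))
    print("hashcode:", code, "-> nearest candidate in bucket:", best)
else:
    # With noisier queries the support changes and the bucket is empty --
    # the exact-match failure mode that motivates RSH's robustified dictionaries.
    print("hashcode:", code, "-> empty bucket (exact-match failure)")
```

As the noise level grows, the recovered support increasingly diverges from that of the clean point and the exact hashcode match fails, which is the drawback the abstract describes and which RSH addresses by learning dictionaries on robustified counterparts of the uncertain data.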