Robust Sparse Hashing

Bibliographic Details
Published in: 2012 19th IEEE International Conference on Image Processing, pp. 2417 - 2420
Main Authors: Cherian, A., Morellas, V., Papanikolopoulos, N.
Format: Conference Proceeding
Language: English
Published: IEEE, 01.09.2012
Summary: We study Nearest Neighbors (NN) retrieval by introducing a new approach: Robust Sparse Hashing (RSH). Our approach is inspired by the success of dictionary learning for sparse coding; the key innovation is to use the learned sparse codes as hashcodes for speeding up NN retrieval. Sparse coding, however, suffers from a major drawback: when the data are noisy or uncertain, an exact hashcode match for a query point seldom occurs, breaking NN retrieval. We tackle this difficulty with RSH, a dictionary learning and sparse coding framework that learns dictionaries on robustified counterparts of the uncertain data points. The algorithm is applied to NN retrieval for Scale Invariant Feature Transform (SIFT) descriptors. The results demonstrate that RSH is noise tolerant and, at the same time, achieves promising NN performance relative to the state of the art.
ISBN: 1467325341, 9781467325349
ISSN: 1522-4880
DOI: 10.1109/ICIP.2012.6467385
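
The central mechanism described in the summary, sparse-coding a descriptor against a learned dictionary and using the resulting sparse code as a hashcode that must match exactly at query time, can be sketched as below. This is a minimal, illustrative sketch, not the authors' RSH implementation: the random dictionary and Gaussian descriptors are toy stand-ins, orthogonal matching pursuit is only one common sparse coder, and the robustification step that distinguishes RSH is not shown.

# Toy sketch (hypothetical, not the RSH code): sparse-code supports as hash keys.
import numpy as np
from collections import defaultdict
from sklearn.linear_model import orthogonal_mp

rng = np.random.default_rng(0)
d, n_atoms, n_points, k = 128, 256, 1000, 4   # d = 128 to mimic SIFT descriptors

# Random stand-ins for a learned dictionary and a database of descriptors.
D = rng.standard_normal((d, n_atoms))
D /= np.linalg.norm(D, axis=0)                 # unit-norm atoms
X = rng.standard_normal((d, n_points))

# Sparse-code every database point and bucket it by the support of its code.
codes = orthogonal_mp(D, X, n_nonzero_coefs=k)        # shape: (n_atoms, n_points)
buckets = defaultdict(list)
for i in range(n_points):
    key = tuple(np.flatnonzero(codes[:, i]))          # hashcode = indices of active atoms
    buckets[key].append(i)

# Query with a noisy copy of database point 0: retrieval needs an exact key match.
q = X[:, 0] + 0.05 * rng.standard_normal(d)
q_code = orthogonal_mp(D, q, n_nonzero_coefs=k)
q_key = tuple(np.flatnonzero(q_code))
print("candidates sharing the hashcode:", buckets.get(q_key, []))

With even a small amount of noise added to the query, the recovered support often differs from the database point's support, so the exact-match bucket lookup comes back empty; this is precisely the fragility the summary describes and that RSH's robustified dictionaries are meant to mitigate.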