Deep learning to ternary hash codes by continuation

Bibliographic Details
Published in: Electronics Letters, Vol. 57, No. 24, pp. 925-926
Main Authors: Chen, Mingrui; Li, Weiyu; Lu, Weizhi
Format: Journal Article
Language: English
Published: Stevenage: John Wiley & Sons, Inc. (Wiley), 01.11.2021
Summary: Recently, it has been observed that {0,±1}-ternary codes, which are simply generated from deep features by hard thresholding, tend to outperform {−1,1}-binary codes in image retrieval. To obtain better ternary codes, the authors propose, for the first time, to jointly learn the features with the codes by appending a smoothed function to the network. During training, the function evolves into a non-smoothed ternary function via a continuation method and then generates the ternary codes. This approach circumvents the difficulty of directly training discrete functions and reduces the quantization error of the ternary codes. Experiments show that the proposed joint learning indeed produces better ternary codes and outperforms existing methods.
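The summary describes two ingredients: hard-thresholding real-valued features into {0,±1} codes, and a smoothed surrogate of that ternary function that is gradually sharpened during training (continuation). The sketch below illustrates both ideas in a minimal form; the threshold value, the tanh-based surrogate, and the sharpness schedule are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def ternary_hard_threshold(features, t=0.5):
    # Map real features to {0, +1, -1} by hard thresholding.
    # The threshold t = 0.5 is an illustrative choice, not the paper's value.
    codes = np.zeros_like(features)
    codes[features > t] = 1.0
    codes[features < -t] = -1.0
    return codes

def smoothed_ternary(x, beta=1.0, t=0.5):
    # A smooth, differentiable surrogate of the ternary function, built from
    # two shifted tanh steps (an assumed form). As beta grows during training
    # (the continuation step), it approaches the hard threshold above.
    return 0.5 * (np.tanh(beta * (x - t)) + np.tanh(beta * (x + t)))

# Away from the threshold boundaries, sharpening beta drives the smooth
# surrogate toward the discrete ternary codes.
x = np.array([-2.0, -1.0, -0.2, 0.0, 0.2, 1.0, 2.0])
for beta in (1.0, 10.0, 100.0):
    gap = np.max(np.abs(smoothed_ternary(x, beta=beta) - ternary_hard_threshold(x)))
    print(f"beta={beta:6.1f}  max gap to hard codes: {gap:.4f}")
```

Training the network with the smooth surrogate keeps gradients usable, while the annealing of beta is what lets the final activation output genuine ternary codes with low quantization error.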
ISSN: 0013-5194, 1350-911X
DOI: 10.1049/ell2.12317