Exploiting deep textures for image retrieval

Bibliographic Details
Published in: International Journal of Machine Learning and Cybernetics, Vol. 14, No. 2, pp. 483-494
Main Authors: Liu, Guang-Hai; Yang, Jing-Yu
Format: Journal Article
Language: English
Published: Berlin/Heidelberg: Springer Berlin Heidelberg (Springer Nature B.V.), 01.02.2023
Summary: Deep features and texture features each have advantages in image representation. However, exploiting deep textures for image retrieval is challenging because it is difficult to enhance the compatibility of texture features and deep features. To address this problem, we propose a novel image-retrieval method named the deep texture feature histogram (DTFH). The main highlights are: (1) We propose a novel method for identifying effective, limited-effectiveness, or non-valid feature maps via ranking based on Haralick's statistics, which can help understand image content and identify objects, as these statistics have clear physical significance. (2) We use Gabor filtering to mimic the human orientation-selection mechanism, which allows deep texture features to contain a good representation of orientation, thereby enhancing discriminative power. (3) We combine the advantages of classical texture features and deep features to provide a compact representation. This offers a new yet simple way to exploit deep features via traditional texture features. Comparative experiments demonstrate that deep texture features provide highly competitive image-retrieval performance in terms of mean average precision (mAP), and provide new insights into the joint exploitation of traditional texture features and deep features.
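
The summary outlines two technical steps: ranking feature maps with Haralick statistics computed from a grey-level co-occurrence matrix, and applying Gabor filtering to capture orientation. The sketch below illustrates that general idea in Python with scikit-image; it is not the published DTFH implementation, and the choice of statistic ("contrast"), the number of retained maps (top_k), the Gabor frequency, and the mean-magnitude pooling are assumptions made purely for illustration.

```python
# Illustrative sketch only (not the authors' DTFH method):
# score CNN feature maps with a Haralick statistic, keep the
# highest-ranked maps, then encode orientation with a small
# Gabor filter bank and pool the responses into a descriptor.
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from skimage.filters import gabor


def haralick_score(fmap, levels=32, prop="contrast"):
    """Score one feature map with a single Haralick statistic."""
    # Quantize the float-valued feature map into integer grey levels.
    fmap = fmap - fmap.min()
    fmap = (fmap / (fmap.max() + 1e-8) * (levels - 1)).astype(np.uint8)
    glcm = graycomatrix(fmap, distances=[1], angles=[0, np.pi / 2],
                        levels=levels, symmetric=True, normed=True)
    return graycoprops(glcm, prop).mean()


def deep_texture_descriptor(feature_maps, top_k=8, n_orientations=4):
    """Rank feature maps, keep the top-k, and pool Gabor responses."""
    scores = np.array([haralick_score(fm) for fm in feature_maps])
    keep = np.argsort(scores)[::-1][:top_k]       # most "effective" maps first
    descriptor = []
    for idx in keep:
        for i in range(n_orientations):
            theta = i * np.pi / n_orientations    # orientation selectivity
            real, imag = gabor(feature_maps[idx], frequency=0.25, theta=theta)
            descriptor.append(np.sqrt(real**2 + imag**2).mean())  # mean magnitude
    return np.asarray(descriptor, dtype=np.float32)


if __name__ == "__main__":
    # Stand-in for feature maps taken from a convolutional layer.
    rng = np.random.default_rng(0)
    maps = rng.random((16, 56, 56)).astype(np.float32)
    print(deep_texture_descriptor(maps).shape)    # (top_k * n_orientations,)
```

In this toy version the descriptor length is top_k * n_orientations; the actual DTFH representation and its histogram construction are described in the paper itself.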
ISSN: 1868-8071, 1868-808X
DOI: 10.1007/s13042-022-01645-0