Image classification via convolutional sparse coding
Published in: The Visual Computer, Vol. 39, No. 5, pp. 1731-1744
Format: Journal Article
Language: English
Published: Berlin/Heidelberg: Springer Berlin Heidelberg, 01.05.2023 (Springer Nature B.V.)
Summary: The Convolutional Sparse Coding (CSC) model has recently attracted considerable attention in the signal and image processing communities. Traditional sparse coding methods assume that all input samples are independent, which makes them poorly suited to tasks where samples are dependent; in such cases, CSC models are a good choice. In this paper, we propose a novel CSC-based classification model that combines the local block coordinate descent (LoBCoD) algorithm with a classification strategy. In the training phase, the convolutional dictionary atoms (filters) of each class are learned from all training samples of that class. In the test phase, the label of a query sample is determined by the reconstruction error obtained with the filters of each class. Experimental results on five benchmark databases, with varying numbers of training samples, clearly demonstrate the superiority of our method over many state-of-the-art classification methods. Moreover, we show that our method depends less on the number of training samples and therefore performs better than other methods on small databases with few samples. For instance, compared to conventional SRC, our method improves the recognition rate by 26.27%, 18.32%, 11.35%, 13.5%, and 19.3% on the five databases at the smallest number of training samples per class.
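The test-phase decision rule described in the summary (assign the query to the class whose filters reconstruct it with the smallest error) can be illustrated with a short sketch. The Python code below is not the authors' implementation: the `csc_encode` helper is a hypothetical stand-in that runs a few ISTA-style convolutional sparse coding iterations rather than LoBCoD, and the per-class filters are random placeholders where learned dictionaries would go.

```python
# Minimal sketch of minimum-reconstruction-error classification with per-class
# convolutional dictionaries. Assumes filters were learned beforehand (e.g. with
# LoBCoD); here they are random placeholders for illustration only.
import numpy as np
from scipy.signal import fftconvolve

def csc_encode(image, filters, lam=0.1, n_iter=50, step=0.1):
    """Crude ISTA-style convolutional sparse coding (hypothetical helper).

    image   : 2-D array (H, W)
    filters : 3-D array (K, h, w) of class-specific dictionary atoms
    Returns feature maps Z of shape (K, H, W) using 'same' convolutions.
    """
    K = filters.shape[0]
    Z = np.zeros((K,) + image.shape)
    for _ in range(n_iter):
        # Current reconstruction: sum_k d_k * z_k
        recon = sum(fftconvolve(Z[k], filters[k], mode="same") for k in range(K))
        resid = image - recon
        for k in range(K):
            # Gradient step: correlate the residual with the (flipped) filter
            grad = fftconvolve(resid, filters[k][::-1, ::-1], mode="same")
            Z[k] = Z[k] + step * grad
        # Soft-thresholding: proximal operator of the l1 penalty
        Z = np.sign(Z) * np.maximum(np.abs(Z) - step * lam, 0.0)
    return Z

def classify(image, class_filters):
    """Label = class whose filters reconstruct the query with the smallest error."""
    errors = {}
    for label, filters in class_filters.items():
        Z = csc_encode(image, filters)
        recon = sum(fftconvolve(Z[k], filters[k], mode="same")
                    for k in range(filters.shape[0]))
        errors[label] = np.linalg.norm(image - recon)
    return min(errors, key=errors.get)

if __name__ == "__main__":
    # Toy usage with random data; real filters would come from the training phase.
    rng = np.random.default_rng(0)
    class_filters = {c: 0.1 * rng.standard_normal((4, 7, 7)) for c in range(3)}
    query = rng.standard_normal((32, 32))
    print("predicted class:", classify(query, class_filters))
```

The sketch keeps only the structure of the decision rule: one convolutional dictionary per class, a sparse coding step against each dictionary, and an argmin over per-class reconstruction errors.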
ISSN: 0178-2789, 1432-2315
DOI: 10.1007/s00371-022-02441-1