L-GEM based co-training for CBIR with relevance feedback

Bibliographic Details
Published in: 2008 International Conference on Wavelet Analysis and Pattern Recognition, Vol. 2, pp. 873-879
Main Authors: Tao Zhu, Wing Ng, Lee, J., Bin-Bin Sun, Jun Wang, Yeung, D.S.
Format: Conference Proceeding
Language: English
Published: IEEE, 01.08.2008
ISBN: 9781424422388, 1424422388
ISSN: 2158-5695
DOI: 10.1109/ICWAPR.2008.4635899

Summary: Relevance feedback has been studied for several years and has become an effective method for capturing a user's concepts to improve the performance of content-based image retrieval (CBIR). In contrast to the fully labeled training datasets of supervised learning, semi-supervised learning and active learning deal with training datasets in which only a small portion of the samples is labeled. This setting is more realistic, because one can easily find thousands of unlabeled images on the Internet, and how to make use of such unlabeled resources is an important research topic. Co-training expands the set of labeled samples in semi-supervised learning by swapping training samples between two classifiers. In this work, we propose to apply the localized generalization error model (L-GEM) to co-training. Two radial basis function neural networks (RBFNNs) trained on different feature splits are adopted in the co-training, and the unlabeled samples with the lowest L-GEM values are added to the training set in the next iteration. In the CBIR system, the positive images with the lowest L-GEM values are output as the most confident retrieval results, while the images with the highest L-GEM values are presented to the user for labeling: the higher the L-GEM value of a sample, the less confident the classifier is in predicting its image class. Experimental results show that the proposed method effectively improves image retrieval results.
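The selection loop described in the summary can be sketched as follows. This is only an illustrative sketch, not the paper's implementation: a nearest-centroid classifier stands in for each RBFNN, the hypothetical `lgem_score` function approximates confidence by perturbation sensitivity rather than computing the actual L-GEM bound, and the true labels of pseudo-labeled samples stand in for labels a user would verify. All names and parameters here are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: one feature vector split into two views (e.g., colour and
# texture descriptors), one view per co-trained learner.
n, d = 200, 4
X = rng.normal(size=(n, 2 * d))
y = (X[:, 0] + X[:, d] > 0).astype(int)   # synthetic binary relevance labels
views = [X[:, :d], X[:, d:]]              # feature split for the two learners

# Small initial labeled pool containing both classes; the rest is unlabeled.
idx0 = [i for i in range(n) if y[i] == 0][:5]
idx1 = [i for i in range(n) if y[i] == 1][:5]
labeled = idx0 + idx1
unlabeled = [i for i in range(n) if i not in labeled]

def fit_centroids(Xv, idx):
    """Class centroids on the labeled subset of one view
    (a stand-in for training an RBFNN on that feature split)."""
    return {c: Xv[[i for i in idx if y[i] == c]].mean(axis=0) for c in (0, 1)}

def predict(cent, x):
    """Assign the class of the nearest centroid."""
    return min(cent, key=lambda c: np.linalg.norm(x - cent[c]))

def lgem_score(cent, x, eps=0.1, k=20):
    """Hypothetical stand-in for L-GEM: fraction of small random input
    perturbations that flip the prediction (higher = less confident)."""
    base = predict(cent, x)
    flips = sum(predict(cent, x + rng.normal(scale=eps, size=x.shape)) != base
                for _ in range(k))
    return flips / k

for _ in range(5):                        # co-training iterations
    cents = [fit_centroids(v, labeled) for v in views]
    # Score each unlabeled sample by the worse of the two learners' confidences.
    scores = sorted((max(lgem_score(cents[0], views[0][i]),
                         lgem_score(cents[1], views[1][i])), i)
                    for i in unlabeled)
    confident = [i for _, i in scores[:10]]   # lowest L-GEM: add to training set
    query = [i for _, i in scores[-3:]]       # highest L-GEM: ask the user to label
    labeled = labeled + confident
    unlabeled = [i for i in unlabeled if i not in confident]
```

In a real system the `confident` samples would carry the classifiers' pseudo-labels and the `query` samples would be shown to the user as the relevance-feedback round; here the true labels simulate both.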