Fine-Grained Visual Comparisons with Local Learning
Published in | 2014 IEEE Conference on Computer Vision and Pattern Recognition, pp. 192 - 199 |
Main Authors | Yu, Aron; Grauman, Kristen |
Format | Conference Proceeding; Journal Article |
Language | English |
Published | IEEE, 01.06.2014 |
Subjects | |
Online Access | Get full text |
Summary: | Given two images, we want to predict which exhibits a particular visual attribute more than the other, even when the two images are quite similar. Existing relative attribute methods rely on global ranking functions; yet rarely will the visual cues relevant to a comparison be constant for all data, nor will humans' perception of the attribute necessarily permit a global ordering. To address these issues, we propose a local learning approach for fine-grained visual comparisons. Given a novel pair of images, we learn a local ranking model on the fly, using only analogous training comparisons. We show how to identify these analogous pairs using learned metrics. With results on three challenging datasets, including a large newly curated dataset for fine-grained comparisons, our method outperforms state-of-the-art methods for relative attribute prediction. |
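The summary describes learning a local ranking model on the fly from only the training comparisons most analogous to the test pair. The sketch below illustrates that idea under stated assumptions: image features are precomputed vectors, plain Euclidean distance between pair difference descriptors stands in for the paper's learned metric, and a linear SVM on pairwise differences stands in for the ranking function. The function name `local_compare` and the parameters `k` and `C` are illustrative choices, not from the paper.

```python
# Minimal sketch of on-the-fly local ranking for one attribute comparison.
# Assumptions (not from the paper): Euclidean distance replaces the learned
# pair metric, and a LinearSVC on difference vectors replaces the ranker.
import numpy as np
from sklearn.svm import LinearSVC

def local_compare(x1, x2, train_pairs, k=50, C=1.0):
    """Predict whether image feature x1 shows the attribute more than x2.

    train_pairs: sequence of (feat_more, feat_less) tuples, where the first
    image was judged to show more of the attribute than the second.
    """
    # Describe each pair by the absolute difference of its features, then
    # keep the k training pairs closest to the test pair's descriptor.
    test_desc = np.abs(x1 - x2)
    dists = [np.linalg.norm(np.abs(a - b) - test_desc) for a, b in train_pairs]
    neighbors = [train_pairs[i] for i in np.argsort(dists)[:k]]

    # Fit a local ranking function on the analogous pairs only:
    # +1 for the (more - less) difference, -1 for the reversed ordering.
    X, y = [], []
    for a, b in neighbors:
        X.append(a - b); y.append(+1)
        X.append(b - a); y.append(-1)
    ranker = LinearSVC(C=C).fit(np.array(X), np.array(y))

    # A positive score means x1 is predicted to show more of the attribute.
    return ranker.decision_function((x1 - x2).reshape(1, -1))[0] > 0
```

A call such as `local_compare(feat_a, feat_b, pairs_for_attribute)`, with equal-length NumPy feature vectors, returns True when the first image is predicted to exhibit the attribute more strongly.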
ISSN: | 1063-6919, 2575-7075 |
DOI: | 10.1109/CVPR.2014.32 |