Learning More Universal Representations for Transfer-Learning

Bibliographic Details
Published in: IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 42, No. 9, pp. 2212-2224
Main Authors: Tamaazousti, Youssef; Le Borgne, Herve; Hudelot, Celine; Seddik, Mohamed-El-Amine; Tamaazousti, Mohamed
Format: Journal Article
Language: English
Published: United States: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.09.2020

Summary: A representation is said to be universal if it encodes any element of the visual world (e.g., objects, scenes) in any configuration (e.g., scale, context). While pure universal representations cannot be expected, the goal in the literature is to improve the universality level, starting from a representation that already has a certain level. To improve that level, one can diversify the source task, but this requires a large amount of additional annotated data, which is costly in terms of manual work and possibly expertise. We formalize such a diversification process and propose two methods to improve the universality of CNN representations while limiting the need for additional annotated data. The first relies on human categorization knowledge and the second on re-training using fine-tuning. We also propose a new aggregating metric to evaluate universality in a transfer-learning scheme, which covers more aspects than previous works. Based on this metric, we demonstrate the benefit of our methods on 10 target problems involving classification across a variety of visual domains.
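
As a rough illustration of the transfer-learning setting the summary refers to, the sketch below fine-tunes an ImageNet-pretrained CNN on a small target classification problem. It is a generic sketch, not the authors' method or evaluation protocol: the dataset path target_data/train, the class count, and the training hyper-parameters are illustrative assumptions, and the torchvision weights API shown requires torchvision 0.13 or later.

# Minimal transfer-learning sketch (illustrative, not the paper's exact pipeline):
# take a CNN pre-trained on a source task, replace its classifier head,
# and fine-tune it on a target problem.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

NUM_TARGET_CLASSES = 10          # hypothetical target problem
device = "cuda" if torch.cuda.is_available() else "cpu"

# Source representation: ImageNet-pretrained ResNet-50.
backbone = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)

# Replace the source classifier with a head for the target task.
backbone.fc = nn.Linear(backbone.fc.in_features, NUM_TARGET_CLASSES)
backbone = backbone.to(device)

# Standard ImageNet preprocessing so the representation sees familiar statistics.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Hypothetical target dataset laid out as ImageFolder (one directory per class).
train_set = datasets.ImageFolder("target_data/train", transform=preprocess)
loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

# Fine-tune the whole network with a small learning rate.
optimizer = torch.optim.SGD(backbone.parameters(), lr=1e-3, momentum=0.9)
criterion = nn.CrossEntropyLoss()

backbone.train()
for epoch in range(5):           # short schedule, illustrative only
    for images, labels in loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(backbone(images), labels)
        loss.backward()
        optimizer.step()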
Bibliography:ObjectType-Article-1
SourceType-Scholarly Journals-1
ObjectType-Feature-2
content type line 23
ISSN: 0162-8828, 1939-3539, 2160-9292
DOI: 10.1109/TPAMI.2019.2913857