VLP: A Survey on Vision-language Pre-training

Bibliographic Details
Published in: International Journal of Automation and Computing, Vol. 20, No. 1, pp. 38-56
Main Authors: Chen, Fei-Long; Zhang, Du-Zhen; Han, Ming-Lun; Chen, Xiu-Yi; Shi, Jing; Xu, Shuang; Xu, Bo
Format: Journal Article
Language: English
Published: Berlin/Heidelberg: Springer Berlin Heidelberg (Springer Nature B.V.), 01.02.2023
ISSN: 2731-538X; 1476-8186; 2153-182X; 2731-5398; 1751-8520; 2153-1838
DOI: 10.1007/s11633-022-1369-5

Summary: In the past few years, the emergence of pre-training models has brought uni-modal fields such as computer vision (CV) and natural language processing (NLP) into a new era. Substantial work has shown that such models benefit downstream uni-modal tasks and avoid training a new model from scratch. So can such pre-trained models be applied to multi-modal tasks? Researchers have explored this problem and made significant progress. This paper surveys recent advances and new frontiers in vision-language pre-training (VLP), including image-text and video-text pre-training. To give readers a better overall grasp of VLP, we first review its recent advances in five aspects: feature extraction, model architecture, pre-training objectives, pre-training datasets, and downstream tasks. Then, we summarize the specific VLP models in detail. Finally, we discuss the new frontiers in VLP. To the best of our knowledge, this is the first survey focused on VLP. We hope that this survey can shed light on future research in the VLP field.
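
The summary lists pre-training objectives among the five aspects the survey reviews. As a minimal, hypothetical sketch (not taken from the paper or this record), the following Python/PyTorch snippet illustrates one widely used objective of that kind, a symmetric image-text contrastive (InfoNCE) loss; the function name, the placeholder encoder outputs, and the temperature value are assumptions made purely for illustration.

import torch
import torch.nn.functional as F

def image_text_contrastive_loss(image_emb, text_emb, temperature=0.07):
    # Hypothetical example: image_emb and text_emb are (batch, dim) outputs of
    # some vision and text encoders; matching image-text pairs share a row index.
    image_emb = F.normalize(image_emb, dim=-1)   # L2-normalize so dot product = cosine similarity
    text_emb = F.normalize(text_emb, dim=-1)
    logits = image_emb @ text_emb.t() / temperature          # pairwise similarity matrix
    targets = torch.arange(logits.size(0), device=logits.device)  # positives lie on the diagonal
    loss_i2t = F.cross_entropy(logits, targets)      # image-to-text direction
    loss_t2i = F.cross_entropy(logits.t(), targets)  # text-to-image direction
    return (loss_i2t + loss_t2i) / 2

# Example usage with random tensors standing in for encoder outputs:
# loss = image_text_contrastive_loss(torch.randn(8, 512), torch.randn(8, 512))

The symmetric two-direction form is what lets a single objective serve both image-to-text and text-to-image retrieval, which are among the downstream tasks such surveys typically cover.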