Continual Learning of Natural Language Processing Tasks: A Survey

Bibliographic Details
Published in: arXiv.org
Main Authors: Ke, Zixuan; Liu, Bing
Format: Paper
Language: English
Published: Ithaca: Cornell University Library, arXiv.org, 11.05.2023
Summary: Continual learning (CL) is a learning paradigm that emulates the human capability of continually learning and accumulating knowledge without forgetting previously learned knowledge, and of transferring that knowledge to help learn new tasks better. This survey presents a comprehensive review and analysis of recent progress on CL in NLP, which differs significantly from CL in computer vision and machine learning. It covers (1) all CL settings, with a taxonomy of existing techniques; (2) catastrophic forgetting (CF) prevention; (3) knowledge transfer (KT), which is particularly important for NLP tasks; and (4) some theory and the hidden challenge of inter-task class separation (ICS). Items (1), (3), and (4) have not been covered in existing surveys. Finally, a list of future directions is discussed.
ISSN: 2331-8422