Margin Contrastive Learning with Learnable-Vector for Continual Learning
Published in | 2023 IEEE/CVF International Conference on Computer Vision Workshops (ICCVW), pp. 3562-3568 |
---|---|
Format | Conference Proceeding |
Language | English |
Published | IEEE, 02.10.2023 |
Summary | In continual learning, a serious problem known as "catastrophic forgetting" arises: previously acquired knowledge is forgotten when a new task is learned. Various methods have been proposed to address this problem. Among them, replay methods, which store a portion of past training data and replay it when training later tasks, have shown excellent performance. This paper proposes a new online continual learning method that extends the conventional Supervised Contrastive Replay (SCR) by adding a learnable representative vector for each class and a margin to the similarity computation. The method aims to mitigate the catastrophic forgetting caused by class imbalance through these learnable class vectors and the added margin. Experiments on multiple image classification datasets confirm that the proposed method outperforms conventional methods. |
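The summary describes augmenting SCR's supervised contrastive loss with a learnable representative vector per class and a margin in the similarity computation. As a rough illustration only (the function name, shapes, and exact margin placement are assumptions, not the paper's formulation), a NumPy sketch that joins class vectors to the contrast set and subtracts a CosFace-style margin from positive-pair cosine similarities might look like:

```python
import numpy as np

def margin_sc_loss(features, labels, class_vectors, margin=0.2, tau=0.1):
    """Hedged sketch of a supervised contrastive loss where each class
    contributes a learnable representative vector as an extra positive,
    and a margin is subtracted from positive-pair similarities.
    All names and shapes here are illustrative assumptions."""
    # L2-normalize embeddings and class vectors so dot products are cosines
    feats = features / np.linalg.norm(features, axis=1, keepdims=True)
    protos = class_vectors / np.linalg.norm(class_vectors, axis=1, keepdims=True)

    # Contrast set: batch embeddings plus one learnable vector per class
    all_vecs = np.vstack([feats, protos])
    all_labels = np.concatenate([labels, np.arange(protos.shape[0])])

    n = feats.shape[0]
    losses = []
    for i in range(n):
        sims = all_vecs @ feats[i]           # cosine similarity to the anchor
        sims[i] = -np.inf                    # exclude the anchor itself
        pos = (all_labels == labels[i])
        pos[i] = False
        sims_m = sims.copy()
        sims_m[pos] -= margin                # penalize positives by the margin
        logits = sims_m / tau
        log_prob = logits - np.log(np.exp(logits).sum())  # exp(-inf) -> 0
        losses.append(-np.mean(log_prob[pos]))
    return float(np.mean(losses))
```

Subtracting the margin only from positive similarities tightens the decision boundary: the anchor must be more similar to its class (and its class vector) than to negatives by at least the margin before the loss flattens out, which is one plausible reading of "adding a margin to the calculation of similarity" in the abstract.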
ISSN | 2473-9944 |
DOI | 10.1109/ICCVW60793.2023.00383 |