SID: Incremental learning for anchor-free object detection via Selective and Inter-related Distillation

Bibliographic Details
Published in: Computer Vision and Image Understanding, Vol. 210, p. 103229
Main Authors: Peng, Can; Zhao, Kun; Maksoud, Sam; Li, Meng; Lovell, Brian C.
Format: Journal Article
Language: English
Published: Elsevier Inc., 01.09.2021
ISSN: 1077-3142, 1090-235X
DOI: 10.1016/j.cviu.2021.103229

Summary: Incremental learning requires a model to continually learn new tasks from streaming data. However, traditional fine-tuning of a well-trained deep neural network on a new task dramatically degrades performance on the old task, a problem known as catastrophic forgetting. In this paper, we address this issue in the context of anchor-free object detection, a recent trend in computer vision valued for being simple, fast, and flexible. Simply adapting current incremental learning strategies fails on these anchor-free detectors because it does not account for their specific model structures. To deal with the challenges of incremental learning on anchor-free object detectors, we propose a novel incremental learning paradigm called Selective and Inter-related Distillation (SID). In addition, a novel evaluation metric is proposed to better assess the performance of detectors under incremental learning conditions. By selectively distilling at the proper locations and further transferring additional instance-relation knowledge, our method demonstrates significant advantages on the benchmark datasets PASCAL VOC and COCO.

Highlights:
• We explore incremental detection on anchor-free fully convolutional object detectors.
• A selective and inter-related distillation strategy is proposed.
• A new evaluation metric is proposed to better evaluate incremental detection results.
• We demonstrate the superior performance of our method on benchmark datasets.
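The abstract names two ingredients: distillation applied only at selected feature locations, and transfer of instance-relation knowledge between the old (teacher) and new (student) detector. The paper's exact SID losses are defined in the full text; the NumPy sketch below is only a rough illustration of these two ideas. All function names, tensor shapes, and the binary selection mask are assumptions made for illustration, not the authors' implementation.

```python
import numpy as np

def selective_distillation_loss(student_feat, teacher_feat, mask):
    """L2 distillation between feature maps, restricted to selected
    spatial locations (mask == 1). Shapes: features (H, W, C), mask (H, W)."""
    diff = (student_feat - teacher_feat) ** 2
    selected = diff * mask[..., None]          # broadcast mask over channels
    n_selected = max(mask.sum(), 1.0)          # avoid division by zero
    return selected.sum() / n_selected

def relation_distillation_loss(student_inst, teacher_inst):
    """Distill pairwise instance relations: compare the Euclidean
    distance matrices of student and teacher instance embeddings (N, D)."""
    def pdist(x):
        d = x[:, None, :] - x[None, :, :]
        return np.sqrt((d ** 2).sum(axis=-1))
    return np.abs(pdist(student_inst) - pdist(teacher_inst)).mean()
```

The key design point mirrored here is that the first loss is gated by a per-location mask, so distillation can be confined to the detector heads or regions where the teacher's knowledge is reliable, while the second loss matches relations between instances rather than raw features.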