Rebalancing Multi-Label Class-Incremental Learning

Bibliographic Details
Published in arXiv.org
Main Authors Du, Kaile; Zhou, Yifan; Lyu, Fan; Li, Yuyang; Xie, Junzhou; Shen, Yixi; Hu, Fuyuan; Liu, Guangcan
Format Paper
Language English
Published Ithaca: Cornell University Library, arXiv.org, 22.08.2024
Abstract Multi-label class-incremental learning (MLCIL) is essential for real-world multi-label applications, allowing models to learn new labels continuously while retaining previously learned knowledge. However, recent MLCIL approaches achieve only suboptimal performance because they overlook the positive-negative imbalance problem, which manifests at both the label and loss levels as a consequence of the task-level partial-label issue. The imbalance at the label level arises from the substantial absence of negative labels, while the imbalance at the loss level stems from the asymmetric contributions of the positive and negative loss parts to the optimization. To address this, we propose a Rebalance framework for both the Loss and Label levels (RebLL), which integrates two key modules: asymmetric knowledge distillation (AKD) and online relabeling (OR). AKD rebalances at the loss level by emphasizing negative-label learning in the classification loss and down-weighting the contribution of overconfident predictions in the distillation loss. OR performs label-level rebalancing: it restores the original class distribution in memory by relabeling the missing classes online. Comprehensive experiments on the PASCAL VOC and MS-COCO datasets demonstrate that this rebalancing strategy significantly improves performance, achieving new state-of-the-art results even with a vanilla CNN backbone.
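The abstract describes three mechanisms: a classification loss that keeps negative-label learning in play, a distillation term that down-weights overconfident predictions, and online relabeling of missing classes in memory. The sketch below illustrates those three ideas in plain Python; the function names, the `p ** gamma_neg` modulation, and all thresholds are assumptions for illustration, not the paper's actual formulation.

```python
import math

EPS = 1e-8  # numerical guard for log()

def asymmetric_bce(p, y, gamma_neg=4.0):
    """Illustrative asymmetric BCE for a single label.

    p: predicted probability of the positive class; y: ground truth (0/1).
    Positives use plain BCE; the negative term is modulated by p ** gamma_neg,
    so hard negatives (high p) dominate the negative part of the loss.
    gamma_neg is an assumed hyperparameter, not taken from the paper.
    """
    if y == 1:
        return -math.log(p + EPS)
    return -(p ** gamma_neg) * math.log(1.0 - p + EPS)

def distill_weight(q, tau=0.9):
    """Down-weight overconfident teacher probabilities q in distillation.

    Predictions with confidence <= tau keep full weight; above tau the
    weight decays linearly toward 0 (an assumed schedule for illustration).
    """
    conf = max(q, 1.0 - q)
    return 1.0 if conf <= tau else (1.0 - conf) / (1.0 - tau)

def online_relabel(memory_labels, probs, thresh=0.8):
    """Fill missing old-class labels in replay memory (None = unknown)
    with confident current-model predictions; known labels are kept."""
    return [y if y is not None else (1 if p >= thresh else 0)
            for y, p in zip(memory_labels, probs)]
```

For example, `online_relabel([1, None, None], [0.2, 0.9, 0.1])` keeps the known positive and resolves the two unknown entries to 1 and 0 from the model's confidence, restoring explicit negatives that the asymmetric loss can then learn from.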
Copyright 2024. This work is published under http://arxiv.org/licenses/nonexclusive-distrib/1.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
Discipline Physics
EISSN 2331-8422
Genre Working Paper/Pre-Print
IsOpenAccess true
IsPeerReviewed false
IsScholarly false
OpenAccessLink https://www.proquest.com/docview/3096438218
SubjectTerms Labels
Learning
Performance enhancement