Leveraging Implicit Relative Labeling-Importance Information for Effective Multi-label Learning

Bibliographic Details
Published in Proceedings (IEEE International Conference on Data Mining), pp. 251-260
Main Authors Li, Yu-Kun, Zhang, Min-Ling, Geng, Xin
Format Conference Proceeding; Journal Article
Language English
Published IEEE 01.11.2015
ISSN 1550-4786
DOI 10.1109/ICDM.2015.41

Abstract In multi-label learning, each training example is represented by a single instance while associated with multiple labels, and the task is to predict a set of relevant labels for the unseen instance. Existing approaches learn from multi-label data by assuming equal labeling-importance, i.e., all the associated labels are regarded as relevant while their relative importance for the training example is not differentiated. Nonetheless, this assumption fails to reflect the fact that the importance degree of each associated label is generally different, though the importance information is not explicitly accessible from the training examples. In this paper, we show that effective multi-label learning can be achieved by leveraging the implicit relative labeling-importance (RLI) information. Specifically, RLI degrees are formalized as a multinomial distribution over the label space, which is estimated by adapting an iterative label propagation procedure. After that, the multi-label prediction model is learned by fitting the estimated multinomial distribution, regularized with a popular multi-label empirical loss. Comprehensive experiments clearly validate the usefulness of leveraging implicit RLI information to learn from multi-label data.
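The abstract describes a two-stage procedure: first estimate RLI degrees by iterative label propagation, then train a predictor that fits the resulting distributions. As a rough illustration of the estimation stage only (a minimal sketch, not the authors' exact algorithm: the Gaussian-kernel similarity graph, the trade-off parameter alpha, and the fixed iteration count are assumptions introduced here for illustration), the propagation step can be written as follows.

import numpy as np

def estimate_rli(X, Y, alpha=0.5, sigma=1.0, n_iter=100):
    """Estimate relative labeling-importance (RLI) degrees.

    X : (n, d) feature matrix; Y : (n, q) binary label matrix.
    Returns U : (n, q) row-stochastic matrix, i.e. one multinomial
    distribution over the label space per training example.
    """
    # Fully connected similarity graph with a Gaussian kernel
    # (one common choice; the paper may construct the graph differently).
    sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    W = np.exp(-sq_dists / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)

    # Row-normalize the weights to obtain a propagation matrix P.
    P = W / W.sum(axis=1, keepdims=True)

    # Iteratively propagate: each step trades off importance received
    # from neighbors against the observed binary label assignments.
    F = Y.astype(float).copy()
    for _ in range(n_iter):
        F = alpha * (P @ F) + (1.0 - alpha) * Y

    # Normalize each row into a multinomial distribution over the labels.
    return F / (F.sum(axis=1, keepdims=True) + 1e-12)

# Toy usage (illustrative only):
# X = np.random.rand(20, 5)
# Y = (np.random.rand(20, 4) > 0.7).astype(float)
# U = estimate_rli(X, Y)   # each row of U sums to (approximately) 1

Per the abstract, the resulting per-example distributions would then serve as the fitting target for the multi-label prediction model, with a standard multi-label empirical loss acting as the regularizer; the specific graph construction, fitting criterion, and loss used in the paper are not given in this record.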
Author Yu-Kun Li
Xin Geng
Min-Ling Zhang
CODEN IEEPAD
ContentType Conference Proceeding; Journal Article
DOI 10.1109/ICDM.2015.41
DatabaseName IEEE Electronic Library (IEL) Conference Proceedings
IEEE Proceedings Order Plan All Online (POP All Online) 1998-present by volume
IEEE Xplore All Conference Proceedings
IEEE Electronic Library (IEL)
IEEE Proceedings Order Plans (POP All) 1998-Present
Computer and Information Systems Abstracts
Technology Research Database
ProQuest Computer Science Collection
Advanced Technologies Database with Aerospace
Computer and Information Systems Abstracts – Academic
Computer and Information Systems Abstracts Professional
DatabaseTitle Computer and Information Systems Abstracts
Technology Research Database
Computer and Information Systems Abstracts – Academic
Advanced Technologies Database with Aerospace
ProQuest Computer Science Collection
Computer and Information Systems Abstracts Professional
DatabaseTitleList Computer and Information Systems Abstracts
Discipline Computer Science
EISBN 9781467395038
146739503X
9781467395045
1467395048
EndPage 260
ExternalDocumentID 7373329
Genre orig-research
ISSN 1550-4786
IsPeerReviewed false
IsScholarly false
Language English
PQID 1793259246
PQPubID 23500
PageCount 10
PublicationCentury 2000
PublicationDate 20151101
PublicationDateYYYYMMDD 2015-11-01
PublicationDecade 2010
PublicationTitle Proceedings (IEEE International Conference on Data Mining)
PublicationTitleAbbrev ICDM
PublicationYear 2015
Publisher IEEE
SourceID proquest
ieee
SourceType Aggregation Database
Publisher
StartPage 251
SubjectTerms Accessibility
Conferences
Data mining
Estimation
label distribution
Labels
Learning
Mathematical models
multi-label learning
Predictive models
relative labeling-importance
Reliability
Semantics
Symmetric matrices
Tasks
Training
Yttrium
Title Leveraging Implicit Relative Labeling-Importance Information for Effective Multi-label Learning
URI https://ieeexplore.ieee.org/document/7373329
https://www.proquest.com/docview/1793259246