Multi-Target Regression via Robust Low-Rank Learning
Published in | IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 40, No. 2, pp. 497-504
Main Authors | Xiantong Zhen, Mengyang Yu, Xiaofei He, Shuo Li
Format | Journal Article
Language | English
Published | United States: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.02.2018
Subjects | multi-target regression; multi-layer learning; matrix elastic nets; robust low-rank learning
Online Access | Get full text
Abstract | Multi-target regression has recently regained great popularity due to its capability of simultaneously learning multiple relevant regression tasks and its wide applications in data mining, computer vision and medical image analysis, while great challenges arise from jointly handling inter-target correlations and input-output relationships. In this paper, we propose Multi-layer Multi-target Regression (MMR) which enables simultaneously modeling intrinsic inter-target correlations and nonlinear input-output relationships in a general framework via robust low-rank learning. Specifically, the MMR can explicitly encode inter-target correlations in a structure matrix by matrix elastic nets (MEN); the MMR can work in conjunction with the kernel trick to effectively disentangle highly complex nonlinear input-output relationships; the MMR can be efficiently solved by a new alternating optimization algorithm with guaranteed convergence. The MMR leverages the strength of kernel methods for nonlinear feature learning and the structural advantage of multi-layer learning architectures for inter-target correlation modeling. More importantly, it offers a new multi-layer learning paradigm for multi-target regression which is endowed with high generality, flexibility and expressive ability. Extensive experimental evaluation on 18 diverse real-world datasets demonstrates that our MMR can achieve consistently high performance and outperforms representative state-of-the-art algorithms, which shows its great effectiveness and generality for multivariate prediction.
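The abstract describes MMR only at a high level and this record contains no code. Purely as an illustration of the ingredients it names (a kernelized input layer, a structure matrix over the targets, a low-rank penalty, and alternating optimization), the sketch below fits a toy two-layer model Y ≈ K A S with ridge penalties and a singular-value soft-thresholding step standing in for the trace-norm half of the matrix elastic net. The function names, the toy data, and the simplified updates are assumptions for illustration; this is not the authors' MMR implementation, objective, or solver.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=2.0):
    """Gaussian RBF kernel matrix between the rows of X1 and X2."""
    sq = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2.0 * sigma ** 2))

def fit_mmr_sketch(X, Y, lam1=1e-2, lam2=1e-2, gamma=0.1, sigma=2.0, n_iter=30):
    """Alternating updates for Y ~ K A S (illustrative sketch, not the paper's algorithm).

    K : n x n RBF kernel of the inputs (nonlinear input-output layer)
    A : n x T kernel coefficients
    S : T x T structure matrix over the targets (inter-target layer)
    """
    n, T = Y.shape
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n, T))
    S = np.eye(T)
    for _ in range(n_iter):
        # A-step: ridge solve of ||Y - K A S||_F^2 + lam1 ||A||_F^2
        # via the vec/Kronecker identity vec(K A S) = (S^T kron K) vec(A).
        M = np.kron(S @ S.T, K.T @ K) + lam1 * np.eye(n * T)
        rhs = (K.T @ Y @ S.T).flatten(order="F")
        A = np.linalg.solve(M, rhs).reshape(n, T, order="F")
        # S-step: ridge solve of ||Y - Z S||_F^2 + lam2 ||S||_F^2 with Z = K A ...
        Z = K @ A
        S = np.linalg.solve(Z.T @ Z + lam2 * np.eye(T), Z.T @ Y)
        # ... followed by singular-value soft-thresholding, a proximal surrogate
        # for the trace-norm part of the matrix elastic net that pushes S toward low rank.
        U, sv, Vt = np.linalg.svd(S, full_matrices=False)
        S = U @ np.diag(np.maximum(sv - gamma, 0.0)) @ Vt
    return K, A, S

# Toy usage: 3 correlated targets generated nonlinearly from 5 inputs.
rng = np.random.default_rng(0)
X = rng.normal(size=(80, 5))
B = rng.normal(size=(5, 3))
Y = np.tanh(X @ B) + 0.05 * rng.normal(size=(80, 3))
K, A, S = fit_mmr_sketch(X, Y)
print("training MSE:", np.mean((Y - K @ A @ S) ** 2))
print("effective rank of S:", np.linalg.matrix_rank(S, tol=1e-3))
```

The soft-thresholding step only mimics the low-rank effect of the matrix elastic net; the paper's actual solver and its convergence guarantee are not reproduced here.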
Author | Xiantong Zhen (ORCID: 0000-0001-5213-0462), Mengyang Yu, Xiaofei He, Shuo Li
BackLink | https://www.ncbi.nlm.nih.gov/pubmed/28368816 (View this record in MEDLINE/PubMed)
CODEN | ITPIDJ |
ContentType | Journal Article |
Copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2018 |
DOI | 10.1109/TPAMI.2017.2688363 |
DatabaseName | IEEE Xplore (IEEE); IEEE All-Society Periodicals Package (ASPP) 1998-Present; IEEE Electronic Library (IEL); CrossRef; PubMed; Computer and Information Systems Abstracts; Electronics & Communications Abstracts; Technology Research Database; ProQuest Computer Science Collection; Advanced Technologies Database with Aerospace; Computer and Information Systems Abstracts Academic; Computer and Information Systems Abstracts Professional; MEDLINE - Academic
DatabaseTitle | CrossRef; PubMed; Technology Research Database; Computer and Information Systems Abstracts - Academic; Electronics & Communications Abstracts; ProQuest Computer Science Collection; Computer and Information Systems Abstracts; Advanced Technologies Database with Aerospace; Computer and Information Systems Abstracts Professional; MEDLINE - Academic
Discipline | Engineering; Computer Science
EISSN | 2160-9292; 1939-3539
EndPage | 504 |
ExternalDocumentID | 28368816 10_1109_TPAMI_2017_2688363 7888599 |
Genre | orig-research; Research Support, Non-U.S. Gov't; Journal Article
ISSN | 0162-8828; 1939-3539
IsDoiOpenAccess | false |
IsOpenAccess | true |
IsPeerReviewed | true |
IsScholarly | true |
Issue | 2 |
Language | English |
License | https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html https://doi.org/10.15223/policy-029 https://doi.org/10.15223/policy-037 |
ORCID | 0000-0001-5213-0462 |
PMID | 28368816 |
PQID | 2174509525 |
PQPubID | 85458 |
PageCount | 8 |
PublicationDate | 2018-02-01 |
PublicationPlace | United States |
PublicationTitle | IEEE transactions on pattern analysis and machine intelligence |
PublicationTitleAbbrev | TPAMI |
PublicationTitleAlternate | IEEE Trans Pattern Anal Mach Intell |
PublicationYear | 2018 |
Publisher | IEEE The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
StartPage | 497 |
SubjectTerms | Algorithms; Biomedical imaging; Computational modeling; Computer vision; Correlation; Data mining; Data models; Image analysis; Input output; Kernel; Kernels; Learning; matrix elastic nets; Medical imaging; Modelling; multi-layer learning; multi-target regression; Multilayers; Optimization; Regression analysis; Robust low-rank learning; Robustness; Robustness (mathematics); State of the art
Title | Multi-Target Regression via Robust Low-Rank Learning |
URI | https://ieeexplore.ieee.org/document/7888599 https://www.ncbi.nlm.nih.gov/pubmed/28368816 https://www.proquest.com/docview/2174509525 https://www.proquest.com/docview/1884166897 |
Volume | 40 |