Alternating DCA for reduced-rank multitask linear regression with covariance matrix estimation
Published in | Annals of Mathematics and Artificial Intelligence, Vol. 90, No. 7-9, pp. 809-829 |
Main Authors | Le Thi, Hoai An; Ho, Vinh Thanh |
Format | Journal Article |
Language | English |
Published | Cham: Springer International Publishing, 01.09.2022 (also listed under Springer, Springer Nature B.V., Springer Verlag) |
Abstract | We study a challenging problem in machine learning: reduced-rank multitask linear regression with covariance matrix estimation. The objective is to build a linear relationship between the multiple output variables and the input variables of a multitask learning process, taking into account, on the one hand, a general covariance structure for the errors of the regression model and, on the other hand, a reduced-rank regression model. The problem is formulated as minimizing a nonconvex function in two joint matrix variables (X, Θ) under a low-rank constraint on X and a positive-definiteness constraint on Θ. It is doubly difficult, owing to the nonconvexity of the objective function as well as the low-rank constraint. We investigate a nonconvex, nonsmooth optimization approach based on DC (Difference of Convex functions) programming and DCA (DC Algorithm) for this hard problem. A penalty reformulation is considered, which takes the form of a partial DC program. An alternating DCA and its inexact version are developed; both algorithms converge to a weak critical point of the considered problem. Numerical experiments are performed on several synthetic and benchmark real multitask linear regression datasets. The numerical results show the performance of the proposed algorithms and their superiority over three classical alternating/joint methods. |
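As a reading aid for the abstract, the following is a hedged sketch of the kind of criterion being minimized, in notation assumed here rather than taken from the paper: Y ∈ R^(n×q) stacks the n observations of the q output variables, Z ∈ R^(n×p) the inputs, X is the coefficient matrix, and Θ is the error precision matrix.

$$
\min_{X \in \mathbb{R}^{p \times q},\ \Theta \succ 0} \ \operatorname{Tr}\!\bigl[(Y - ZX)\,\Theta\,(Y - ZX)^{\top}\bigr] \;-\; n \log\det\Theta
\quad \text{s.t.} \quad \operatorname{rank}(X) \le r .
$$

The two difficulties named in the abstract are visible here: the trace term couples X and Θ and is not jointly convex, and the rank constraint defines a nonconvex feasible set.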
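The DCA mentioned in the abstract is the standard difference-of-convex scheme. For a generic DC program $\min_x\, g(x) - h(x)$ with $g, h$ convex, each iteration linearizes the concave part and solves a convex subproblem:

$$
y^{k} \in \partial h(x^{k}), \qquad x^{k+1} \in \operatorname*{arg\,min}_{x} \ \bigl\{\, g(x) - \langle x, y^{k} \rangle \,\bigr\}.
$$

According to the abstract, the paper applies such steps in an alternating, blockwise fashion to X and Θ on a penalized reformulation that takes the form of a partial DC program.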
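As context for the comparison reported at the end of the abstract, here is a minimal, hedged sketch of a classical alternating baseline of the kind the proposed algorithms are compared against (this is not the paper's alternating DCA). Function and variable names, the initialization, and the small ridge term are illustrative assumptions.

```python
import numpy as np

def alternating_rrr_with_precision(Y, Z, r, n_iter=50, ridge=1e-6):
    """Classical alternating scheme for reduced-rank regression with error
    precision estimation -- a baseline of the kind the paper compares
    against, NOT the proposed alternating DCA.

    Model: Y ~ Z X with rank(X) <= r and error precision matrix Theta > 0.
    """
    n, q = Y.shape
    p = Z.shape[1]
    Theta = np.eye(q)                       # start from identity precision
    ZtZ_pinv = np.linalg.pinv(Z.T @ Z)
    X = np.zeros((p, q))
    for _ in range(n_iter):
        # X-step (Theta fixed): weighted reduced-rank regression.
        # Whiten the outputs by Theta^{1/2}, fit OLS, truncate the fitted
        # values to rank r by SVD, then map back to the original scale.
        w, V = np.linalg.eigh(Theta)
        w = np.clip(w, ridge, None)
        T_half = V @ np.diag(np.sqrt(w)) @ V.T
        T_half_inv = V @ np.diag(1.0 / np.sqrt(w)) @ V.T
        B_ols = ZtZ_pinv @ Z.T @ Y
        U, s, Vt = np.linalg.svd(Z @ B_ols @ T_half, full_matrices=False)
        fit_r = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]   # rank-r fitted values
        X = ZtZ_pinv @ Z.T @ fit_r @ T_half_inv          # rank(X) <= r
        # Theta-step (X fixed): regularized precision MLE from residuals.
        R = Y - Z @ X
        S = R.T @ R / n
        Theta = np.linalg.inv(S + ridge * np.eye(q))
    return X, Theta
```

With Θ fixed, the X-step is the classical weighted reduced-rank regression solved by SVD truncation in the whitened output space; with X fixed, the Θ-step is the (ridge-stabilized) maximum-likelihood precision estimate.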
Audience | Academic |
Author | Le Thi, Hoai An; Ho, Vinh Thanh |
Author_xml | – sequence: 1; name: Le Thi, Hoai An; ORCID: 0000-0002-2239-2100; email: hoai-an.le-thi@univ-lorraine.fr; organization: Université de Lorraine, LGIPM, Le Département IA – sequence: 2; name: Ho, Vinh Thanh; organization: Université de Lorraine, LGIPM, Le Département IA |
BackLink | https://hal.univ-lorraine.fr/hal-03212805 (view record in HAL) |
CitedBy_id | Crossref: 10.1038/s41598-022-21889-8 |
ContentType | Journal Article |
Copyright | The Author(s), under exclusive licence to Springer Nature Switzerland AG part of Springer Nature 2021; COPYRIGHT 2022 Springer; Distributed under a Creative Commons Attribution 4.0 International License |
DOI | 10.1007/s10472-021-09732-8 |
Discipline | Mathematics; Computer Science |
EISSN | 1573-7470 |
EndPage | 829 |
ISSN | 1012-2443 |
IsPeerReviewed | true |
IsScholarly | true |
Issue | 7-9 |
Keywords | DCA; Reduced-rank multitask linear regression; DC programming; Partial DC program; Partial DC programming; Covariance matrix estimation; Alternating DCA; Mathematics Subject Classification (2010): 62J05, 90C26, 90C90 |
Language | English |
License | Distributed under a Creative Commons Attribution 4.0 International License: http://creativecommons.org/licenses/by/4.0 |
ORCID | 0000-0002-2239-2100 0009-0009-1416-5360 |
PageCount | 21 |
PublicationDate | 2022-09-01 |
PublicationPlace | Cham |
PublicationTitle | Annals of mathematics and artificial intelligence |
PublicationTitleAbbrev | Ann Math Artif Intell |
PublicationYear | 2022 |
Publisher | Springer International Publishing; Springer; Springer Nature B.V.; Springer Verlag |
StartPage | 809 |
SubjectTerms | Algorithms; Artificial Intelligence; Complex Systems; Computer Science; Convexity; Covariance matrix; Critical point; Machine learning; Mathematics; Regression analysis; Regression models |
URI | https://link.springer.com/article/10.1007/s10472-021-09732-8 https://www.proquest.com/docview/2918202283 https://hal.univ-lorraine.fr/hal-03212805 |
Volume | 90 |