A Newton-CG based barrier-augmented Lagrangian method for general nonconvex conic optimization
In this paper we consider finding an approximate second-order stationary point (SOSP) of general nonconvex conic optimization that minimizes a twice differentiable function subject to nonlinear equality constraints and also a convex conic constraint. In particular, we propose a Newton-conjugate grad...
Published in | Computational optimization and applications Vol. 89; no. 3; pp. 843-894 |
---|---|
Main Authors | He, Chuan; Huang, Heng; Lu, Zhaosong |
Format | Journal Article |
Language | English |
Published | New York: Springer US, 01.12.2024 |
Subjects | |
Abstract | In this paper we consider finding an approximate second-order stationary point (SOSP) of general nonconvex conic optimization that minimizes a twice differentiable function subject to nonlinear equality constraints and also a convex conic constraint. In particular, we propose a Newton-conjugate gradient (Newton-CG) based barrier-augmented Lagrangian method for finding an approximate SOSP of this problem. Under some mild assumptions, we show that our method enjoys a total inner iteration complexity of $\widetilde{\mathcal{O}}(\epsilon^{-11/2})$ and an operation complexity of $\widetilde{\mathcal{O}}(\epsilon^{-11/2}\min\{n,\epsilon^{-5/4}\})$ for finding an $(\epsilon,\sqrt{\epsilon})$-SOSP of general nonconvex conic optimization with high probability. Moreover, under a constraint qualification, these complexity bounds are improved to $\widetilde{\mathcal{O}}(\epsilon^{-7/2})$ and $\widetilde{\mathcal{O}}(\epsilon^{-7/2}\min\{n,\epsilon^{-3/4}\})$, respectively. To the best of our knowledge, this is the first study on the complexity of finding an approximate SOSP of general nonconvex conic optimization. Preliminary numerical results are presented to demonstrate superiority of the proposed method over first-order methods in terms of solution quality. |
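For a concrete picture of the algorithmic template the abstract describes, the following is a minimal, hypothetical sketch of a barrier-augmented Lagrangian outer loop with Newton-CG inner solves, specialized to the nonnegative-orthant cone. It is not the authors' algorithm or implementation: the toy problem, the barrier/penalty parameter schedule, the crude interior safeguard, and the use of SciPy's Newton-CG as a stand-in for a dedicated Newton-CG subproblem solver are all illustrative assumptions.

```python
# Hypothetical sketch only: a generic barrier-augmented Lagrangian loop
# for  min f(x)  s.t.  c(x) = 0,  x >= 0  (nonnegative-orthant cone),
# with SciPy's Newton-CG used to approximately solve each inner subproblem.
import numpy as np
from scipy.optimize import minimize


def barrier_aug_lagrangian(f, grad_f, c, jac_c, x0, lam0,
                           mu=1.0, rho=10.0, n_outer=20, tol=1e-6):
    x, lam = x0.astype(float).copy(), lam0.astype(float).copy()
    for _ in range(n_outer):
        def phi(z):
            zs = np.maximum(z, 1e-12)            # crude interior safeguard
            cz = c(z)
            return (f(z) - mu * np.sum(np.log(zs))       # log-barrier term
                    + lam @ cz + 0.5 * rho * (cz @ cz))  # augmented Lagrangian terms

        def grad_phi(z):
            zs = np.maximum(z, 1e-12)
            cz, J = c(z), jac_c(z)
            return grad_f(z) - mu / zs + J.T @ (lam + rho * cz)

        # Inner subproblem: approximately minimize the barrier-augmented
        # Lagrangian; SciPy's Newton-CG forms Hessian-vector products by
        # finite-differencing the supplied gradient.
        res = minimize(phi, x, jac=grad_phi, method="Newton-CG",
                       options={"xtol": 1e-8, "maxiter": 200})
        x = np.maximum(res.x, 1e-12)             # stay in the cone's interior
        lam = lam + rho * c(x)                   # first-order multiplier update
        mu *= 0.5                                # drive barrier parameter to zero
        rho *= 2.0                               # tighten the penalty parameter
        if np.linalg.norm(c(x)) < tol and mu < tol:
            break
    return x, lam


# Toy usage: a nonconvex quartic over the probability simplex (illustrative).
f = lambda x: np.sum(x ** 4) - np.sum(x ** 2)
grad_f = lambda x: 4 * x ** 3 - 2 * x
c = lambda x: np.array([np.sum(x) - 1.0])
jac_c = lambda x: np.ones((1, x.size))

x_star, lam_star = barrier_aug_lagrangian(f, grad_f, c, jac_c,
                                          x0=np.full(5, 0.2),
                                          lam0=np.zeros(1))
print("x* =", x_star, "lambda* =", lam_star)
```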
Author | Lu, Zhaosong; He, Chuan; Huang, Heng |
Author_xml | – sequence: 1 givenname: Chuan surname: He fullname: He, Chuan organization: Department of Computer Science and Engineering, University of Minnesota, Department of Mathematics, Linköping University – sequence: 2 givenname: Heng surname: Huang fullname: Huang, Heng organization: Department of Computer Science, University of Maryland – sequence: 3 givenname: Zhaosong orcidid: 0000-0003-3277-7853 surname: Lu fullname: Lu, Zhaosong email: zhaosong@umn.edu organization: Department of Industrial and Systems Engineering, University of Minnesota |
BackLink | https://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-207432 (View record from Swedish Publication Index)
ContentType | Journal Article |
Copyright | The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature 2024. Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law. |
DOI | 10.1007/s10589-024-00603-6 |
DatabaseName | CrossRef SwePub SwePub Articles SWEPUB Linköpings universitet |
DatabaseTitle | CrossRef |
DatabaseTitleList | |
Discipline | Engineering Statistics Mathematics |
EISSN | 1573-2894 |
EndPage | 894 |
ExternalDocumentID | oai_DiVA_org_liu_207432 10_1007_s10589_024_00603_6 |
GrantInformation_xml | – fundername: National Science Foundation grantid: IIS-2211491; IIS-2211492 funderid: http://dx.doi.org/10.13039/100000001 |
ISSN | 0926-6003 1573-2894 |
IsPeerReviewed | true |
IsScholarly | true |
Issue | 3 |
Keywords | Augmented Lagrangian method; 68Q25; Operation complexity; 90C26; Newton-conjugate gradient method; Iteration complexity; Barrier method; 90C30; Nonconvex conic optimization; 49M05; 49M15; Second-order stationary point; 90C60
Language | English |
ORCID | 0000-0003-3277-7853 |
PageCount | 52 |
ParticipantIDs | swepub_primary_oai_DiVA_org_liu_207432 crossref_primary_10_1007_s10589_024_00603_6 springer_journals_10_1007_s10589_024_00603_6 |
PublicationCentury | 2000 |
PublicationDate | 2024-12-01 |
PublicationDateYYYYMMDD | 2024-12-01 |
PublicationDate_xml | – month: 12 year: 2024 text: 2024-12-01 day: 01 |
PublicationDecade | 2020 |
PublicationPlace | New York |
PublicationPlace_xml | – name: New York |
PublicationSubtitle | An International Journal |
PublicationTitle | Computational optimization and applications |
PublicationTitleAbbrev | Comput Optim Appl |
PublicationYear | 2024 |
Publisher | Springer US |
Publisher_xml | – name: Springer US |
SourceID | swepub crossref springer |
SourceType | Open Access Repository Index Database Publisher |
StartPage | 843 |
SubjectTerms | Convex and Discrete Geometry Management Science Mathematics Mathematics and Statistics Operations Research Operations Research/Decision Theory Optimization Statistics |
Title | A Newton-CG based barrier-augmented Lagrangian method for general nonconvex conic optimization |
URI | https://link.springer.com/article/10.1007/s10589-024-00603-6 https://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-207432 |
Volume | 89 |