Acceleration Meets Inverse Maintenance: Faster \(\ell_{\infty}\)-Regression



Bibliographic Details
Published in: arXiv.org
Main Authors: Adil, Deeksha; Jiang, Shunhua; Kyng, Rasmus
Format: Paper
Language: English
Published: Ithaca: Cornell University Library, arXiv.org, 30.09.2024

Summary: We propose a randomized multiplicative weight update (MWU) algorithm for \(\ell_{\infty}\) regression that runs in \(\widetilde{O}\left(n^{2+1/22.5} \text{poly}(1/\epsilon)\right)\) time when \(\omega = 2+o(1)\), improving upon the previous best \(\widetilde{O}\left(n^{2+1/18} \text{poly}\log(1/\epsilon)\right)\) runtime in the low-accuracy regime. Our algorithm combines state-of-the-art inverse maintenance data structures with acceleration. In order to do so, we propose a novel acceleration scheme for MWU that exhibits {\it stability} and {\it robustness}, which are required for the efficient implementation of the inverse maintenance data structures. We also design a faster {\it deterministic} MWU algorithm that runs in \(\widetilde{O}\left(n^{2+1/12}\text{poly}(1/\epsilon)\right)\) time when \(\omega = 2+o(1)\), improving upon the previous best \(\widetilde{O}\left(n^{2+1/6} \text{poly}\log(1/\epsilon)\right)\) runtime in the low-accuracy regime. We achieve this by showing a novel stability result that goes beyond the previously known works based on interior point methods (IPMs). Our work is the first to use acceleration and inverse maintenance together efficiently, finally making the two most important building blocks of modern structured convex optimization compatible.
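For context, the starting point of this line of work is the classic (unaccelerated) MWU template for \(\ell_{\infty}\) regression: repeatedly solve a weighted least-squares problem and multiplicatively boost the weights of rows with large residuals. The sketch below is a minimal NumPy illustration of that textbook template only; it is NOT the paper's algorithm (it has no acceleration, no inverse maintenance, and no stability machinery), and the function name, step size `eta`, and iteration count are illustrative choices.

```python
import numpy as np

def mwu_linf_regression(A, b, iters=200, eta=0.5):
    """Textbook MWU sketch for min_x ||Ax - b||_inf (illustrative only).

    Each round solves a weighted least-squares problem, then multiplicatively
    increases the weights of rows with large residuals so later rounds focus
    on them. Modern algorithms such as the one in this paper speed up this
    loop with acceleration and inverse maintenance of the weighted system.
    """
    n, d = A.shape
    w = np.full(n, 1.0 / n)              # weights form a distribution over rows
    best_x, best_err = None, np.inf
    for _ in range(iters):
        # Weighted least squares: minimize sum_i w_i (a_i^T x - b_i)^2,
        # via the reweighted system (sqrt(W) A) x ~ sqrt(W) b.
        Aw = A * np.sqrt(w)[:, None]
        bw = b * np.sqrt(w)
        x, *_ = np.linalg.lstsq(Aw, bw, rcond=None)
        r = np.abs(A @ x - b)            # per-row residuals
        err = r.max()
        if err < best_err:
            best_x, best_err = x, err
        # Multiplicative update: rows with large residuals gain weight.
        w *= np.exp(eta * r / (r.max() + 1e-12))
        w /= w.sum()
    return best_x, best_err
```

The expensive step in each round is solving the weighted least-squares system; inverse maintenance, as discussed in the abstract, amortizes this cost across rounds when the weights change slowly, which is why stability of the update scheme matters.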
ISSN:2331-8422