Acceleration Meets Inverse Maintenance: Faster $\ell_{\infty}$-Regression
Format | Journal Article
Language | English
Published | 30.09.2024
Summary: We propose a randomized multiplicative weight update (MWU) algorithm for
$\ell_{\infty}$ regression that runs in $\widetilde{O}\left(n^{2+1/22.5}
\text{poly}(1/\epsilon)\right)$ time when $\omega = 2+o(1)$, improving upon the
previous best $\widetilde{O}\left(n^{2+1/18} \text{poly}
\log(1/\epsilon)\right)$ runtime in the low-accuracy regime. Our algorithm
combines state-of-the-art inverse maintenance data structures with
acceleration. To do so, we propose a novel acceleration scheme for MWU
that exhibits {\it stability} and {\it robustness}, which are required for
efficient implementation of the inverse maintenance data structures.
We also design a faster {\it deterministic} MWU algorithm that runs in
$\widetilde{O}\left(n^{2+1/12}\text{poly}(1/\epsilon)\right)$ time when
$\omega = 2+o(1)$, improving upon the previous best
$\widetilde{O}\left(n^{2+1/6} \text{poly} \log(1/\epsilon)\right)$ runtime in
the low-accuracy regime. We achieve this by proving a novel stability result
that goes beyond previously known works based on interior point methods
(IPMs).
Our work is the first to use acceleration and inverse maintenance together
efficiently, finally making the two most important building blocks of modern
structured convex optimization compatible.
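To illustrate the general MWU approach named in the abstract (not the paper's accelerated algorithm), the following is a minimal, hypothetical sketch: $\ell_{\infty}$ regression $\min_x \|Ax - b\|_{\infty}$ solved by maintaining weights over residual coordinates, repeatedly calling a weighted least-squares oracle, and multiplicatively boosting the weights of large residuals. All function names, step sizes, and iteration counts below are illustrative assumptions.

```python
import numpy as np

def linf_regression_mwu(A, b, num_iters=200, eta=0.5):
    """Illustrative MWU sketch (hypothetical; not the paper's method):
    approximately minimize ||Ax - b||_inf via weighted least squares."""
    n, d = A.shape
    w = np.ones(n) / n  # weights over the n residual coordinates
    iterates = []
    for _ in range(num_iters):
        # Weighted least-squares oracle: min_x sum_i w_i (a_i^T x - b_i)^2
        s = np.sqrt(w)
        x, *_ = np.linalg.lstsq(s[:, None] * A, s * b, rcond=None)
        iterates.append(x)
        # Multiplicative update: boost weight on coordinates with large residuals
        r = np.abs(A @ x - b)
        w *= np.exp(eta * r / (r.max() + 1e-12))
        w /= w.sum()
    # Return the average iterate, as is standard for MWU-style analyses
    return np.mean(iterates, axis=0)
```

The weighted least-squares solve plays the role of the "oracle" step; the paper's contribution concerns accelerating this outer loop while keeping the weight sequence stable enough for inverse maintenance, which this sketch does not attempt.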
DOI: 10.48550/arxiv.2409.20030