Random Multi-Constraint Projection: Stochastic Gradient Methods for Convex Optimization with Many Constraints
Format: Journal Article
Language: English
Published: 11.11.2015
DOI: 10.48550/arxiv.1511.03760
Summary: Consider convex optimization problems subject to a large number of constraints. We focus on stochastic problems in which the objective takes the form of an expected value and the feasible set is the intersection of a large number of convex sets. We propose a class of algorithms that perform stochastic gradient descent and random feasibility updates simultaneously. At every iteration, the algorithms sample a number of projection points onto randomly selected small subsets of all constraints. Three feasibility update schemes are considered: averaging over the random projected points, projecting onto the most distant sample, and projecting onto a special polyhedral set constructed from the sample points. We prove the almost sure convergence of these algorithms and analyze the iterates' feasibility error and optimality error, respectively. We provide new convergence rate benchmarks for stochastic first-order optimization with many constraints. The rate analysis and numerical experiments reveal that the algorithm using the polyhedral-set projection scheme is the most efficient among known algorithms.