Null space gradient flows for constrained optimization with applications to shape optimization
Published in | ESAIM. Control, optimisation and calculus of variations Vol. 26; p. 90 |
---|---|
Main Authors | |
Format | Journal Article |
Language | English |
Published | Les Ulis: EDP Sciences, 2020 |
Summary | The purpose of this article is to introduce a gradient-flow algorithm for solving equality and inequality constrained optimization problems, which is particularly well suited to shape optimization applications. We rely on a variant of the Ordinary Differential Equation (ODE) approach proposed by Yamashita (Math. Program. 18 (1980) 155–168) for equality constrained problems: the search direction is a combination of a null space step and a range space step, which aim to decrease the objective function and the violation of the constraints, respectively. Our first contribution is an extension of this ODE approach to optimization problems featuring both equality and inequality constraints. A common practice in the literature is to convert inequality constraints into equality constraints by introducing additional slack variables. Here, we instead handle their local combinatorial character directly by computing the projection of the gradient of the objective function onto the cone of feasible directions. This is achieved by solving a dual quadratic programming subproblem whose size equals the number of active or violated constraints. The solution of this subproblem identifies the inequality constraints to which the optimization trajectory should remain tangent. Our second contribution is a formulation of our gradient flow in the context of infinite-dimensional Hilbert spaces, and of even more general optimization sets such as sets of shapes, as occurs in shape optimization within the framework of Hadamard's boundary variation method. The cornerstone of this formulation is the classical operation of extension and regularization of shape derivatives. The numerical efficiency and ease of implementation of our algorithm are demonstrated on realistic shape optimization problems. |
ISSN | 1292-8119, 1262-3377 |
DOI | 10.1051/cocv/2020015 |
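
To make the construction described in the summary concrete, below is a minimal finite-dimensional NumPy sketch of one explicit Euler step of such a flow, under the assumption of smooth constraints whose retained Jacobian has full row rank. It simply retains every active or violated inequality constraint instead of selecting them through the dual quadratic programming subproblem mentioned above, and it works in R^n rather than in the Hilbert or shape spaces of the paper. The names `nullspace_step`, `alpha_J`, `alpha_C` and the step sizes are illustrative, not the authors' implementation.

```python
# Minimal sketch (not the authors' code) of one explicit Euler step of a
# null space gradient flow for
#     min J(x)  subject to  h(x) = 0,  g(x) <= 0.
# The update combines a null space step xi_J (decrease J while remaining
# tangent to the retained constraints) and a range space step xi_C
# (Gauss-Newton decrease of the constraint violation). For simplicity,
# every active or violated inequality constraint is retained, instead of
# selecting them via the dual quadratic subproblem described in the paper.
import numpy as np

def nullspace_step(x, grad_J, h, Dh, g, Dg, alpha_J=0.05, alpha_C=0.05, tol=1e-8):
    hx, gx = h(x), g(x)
    active = gx > -tol                        # active or violated inequalities
    A = np.vstack([Dh(x), Dg(x)[active]])     # Jacobian of the retained constraints
    C = np.concatenate([hx, gx[active]])      # their values (constraint violation)
    if A.shape[0] == 0:                       # no retained constraint: plain gradient step
        return x - alpha_J * grad_J(x)
    AAt = A @ A.T                             # assumed invertible (full row rank)
    xi_J = grad_J(x) - A.T @ np.linalg.solve(AAt, A @ grad_J(x))  # projection onto ker(A)
    xi_C = A.T @ np.linalg.solve(AAt, C)                          # Gauss-Newton feasibility step
    return x - alpha_J * xi_J - alpha_C * xi_C

# Toy usage: minimize x0 + x1 over the unit disk x0^2 + x1^2 <= 1
# (no equality constraints); the iterates should approach
# (-1/sqrt(2), -1/sqrt(2)) up to the explicit Euler discretization error.
grad_J = lambda x: np.array([1.0, 1.0])
h, Dh = (lambda x: np.empty(0)), (lambda x: np.empty((0, 2)))
g, Dg = (lambda x: np.array([x @ x - 1.0])), (lambda x: 2.0 * x[None, :])
x = np.array([0.9, 0.3])
for _ in range(1000):
    x = nullspace_step(x, grad_J, h, Dh, g, Dg)
print(x)
```

In this simplified setting the null space step vanishes exactly at a point where the negative objective gradient lies in the span of the retained constraint gradients, and the range space step vanishes on the feasible boundary, so constrained minimizers are equilibria of the flow, consistent with the behaviour described in the summary.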