Superquantiles at Work: Machine Learning Applications and Efficient Subgradient Computation
| Main Authors | |
| --- | --- |
| Format | Journal Article |
| Language | English |
| Published | 03.01.2022 |
| Subjects | |
| DOI | 10.48550/arxiv.2201.00508 |
Summary: R. Tyrrell Rockafellar and collaborators introduced, in a series of works, new regression modeling methods based on the notion of the superquantile (or conditional value-at-risk). These methods have been influential in economics, finance, management science, and operations research in general. Recently, they have been the subject of renewed interest in machine learning, where they are used to address issues of distributional robustness and fair allocation. In this paper, we review some of these new applications of the superquantile, with references to recent developments. These applications involve nonsmooth superquantile-based objective functions that admit explicit subgradient calculations. To make these superquantile-based functions amenable to the gradient-based algorithms popular in machine learning, we show how to smooth them by infimal convolution, and we describe numerical procedures to compute the gradients of the smooth approximations. We put the approach into perspective by comparing it to other smoothing techniques and by illustrating it on toy examples.
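
For reference, the standard Rockafellar–Uryasev characterization behind these methods (notation ours, not necessarily the paper's) writes the superquantile at level $p \in (0,1)$ of a random loss $Y$ as the average of the quantiles $Q_{p'}(Y)$ above $p$, or equivalently as a one-dimensional minimization:

$$
\bar{Q}_p(Y) \;=\; \frac{1}{1-p}\int_p^1 Q_{p'}(Y)\,\mathrm{d}p'
\;=\; \min_{\eta \in \mathbb{R}}\left\{\eta + \frac{1}{1-p}\,\mathbb{E}\big[\max(Y-\eta,\,0)\big]\right\}.
$$

The minimization form is what yields explicit subgradients, and the nonsmooth $\max(\cdot,0)$ term is what the infimal-convolution smoothing targets.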
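
To make the subgradient and smoothing computations concrete, here is a minimal NumPy sketch, ours rather than the authors' reference code: `superquantile_subgrad` returns the empirical superquantile and one subgradient via the dual capped-simplex representation, and `smoothed_superquantile_grad` computes the gradient of one possible infimal-convolution smoothing (a Euclidean penalty with parameter `mu`; the function names and the default `mu` are illustrative assumptions).

```python
import numpy as np

def superquantile_subgrad(losses, p):
    """Empirical superquantile (CVaR) at level p of a loss sample,
    together with one subgradient w.r.t. the loss vector.

    Over n losses, the superquantile is the support function of the
    "capped simplex" Q = {w : 0 <= w_i <= 1/((1-p)*n), sum_i w_i = 1},
    so any maximizer of <w, losses> over Q is a valid subgradient.
    """
    losses = np.asarray(losses, dtype=float)
    n = losses.size
    cap = 1.0 / ((1.0 - p) * n)          # per-sample weight cap
    w = np.zeros(n)
    budget = 1.0
    for i in np.argsort(losses)[::-1]:   # visit largest losses first
        w[i] = min(cap, budget)          # saturate the cap greedily
        budget -= w[i]
        if budget <= 0.0:
            break
    return float(w @ losses), w

def smoothed_superquantile_grad(losses, p, mu=0.1, tol=1e-10):
    """Gradient of a Euclidean-smoothed superquantile: the dual weights
    are penalized by (mu/2) * ||w - uniform||^2, so the gradient is the
    Euclidean projection of uniform + losses/mu onto the capped simplex.
    The projection's threshold is located by bisection.
    """
    losses = np.asarray(losses, dtype=float)
    n = losses.size
    cap = 1.0 / ((1.0 - p) * n)
    v = 1.0 / n + losses / mu            # point to project onto Q
    lo, hi = v.min() - cap, v.max()      # weights sum to 1/(1-p) at lo, 0 at hi
    while hi - lo > tol:
        lam = 0.5 * (lo + hi)
        if np.clip(v - lam, 0.0, cap).sum() > 1.0:
            lo = lam                     # total weight too heavy: raise threshold
        else:
            hi = lam
    return np.clip(v - 0.5 * (lo + hi), 0.0, cap)

# Example: tail of a standard normal loss sample.
rng = np.random.default_rng(0)
losses = rng.standard_normal(1000)
val, g = superquantile_subgrad(losses, p=0.9)          # ~ mean of the top 10% of losses
g_smooth = smoothed_superquantile_grad(losses, p=0.9)  # dense, smooth surrogate gradient
```

Note the qualitative difference: the subgradient concentrates weight on exactly the top $(1-p)$ fraction of samples, while the smoothed gradient spreads weight continuously across the tail, which is what makes the surrogate amenable to gradient-based training.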