Superquantiles at Work: Machine Learning Applications and Efficient Subgradient Computation
| Published in | *Set-Valued and Variational Analysis*, Vol. 29, no. 4, pp. 967–996 |
|---|---|
| Main Authors | , , , |
| Format | Journal Article |
| Language | English |
| Published | Dordrecht: Springer Netherlands, 01.12.2021 |
| ISSN | 1877-0533; 1877-0541 |
| DOI | 10.1007/s11228-021-00609-w |
Summary: R. Tyrrell Rockafellar and his collaborators introduced, in a series of works, new regression modeling methods based on the notion of the superquantile (or conditional value-at-risk). These methods have been influential in economics, finance, management science, and operations research in general. Recently, they have been the subject of renewed interest in machine learning, where they are used to address issues of distributional robustness and fair allocation. In this paper, we review some of these new applications of the superquantile, with references to recent developments. These applications involve nonsmooth superquantile-based objective functions that admit explicit subgradient calculations. To make these superquantile-based functions amenable to the gradient-based algorithms popular in machine learning, we show how to smooth them by infimal convolution and detail numerical procedures to compute the gradients of the smooth approximations. We put the approach into perspective by comparing it to other smoothing techniques and by illustrating it on toy examples.
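To make the "explicit subgradient calculations" mentioned in the summary concrete, here is a minimal sketch on an empirical loss distribution. It computes the p-superquantile through the standard Rockafellar–Uryasev formula, CVaR_p(X) = min over eta of { eta + E[(X − eta)_+] / (1 − p) }, whose minimizer is the p-quantile, and returns one subgradient with respect to the loss vector. The function names are illustrative, not taken from the paper.

```python
import numpy as np

def superquantile(losses, p):
    """Empirical p-superquantile (CVaR): average of the worst (1 - p)
    fraction of losses, via the Rockafellar-Uryasev formula evaluated
    at a minimizing eta (the p-quantile)."""
    losses = np.asarray(losses, dtype=float)
    eta = np.quantile(losses, p)
    return eta + np.mean(np.maximum(losses - eta, 0.0)) / (1.0 - p)

def superquantile_subgradient(losses, p):
    """One subgradient of the empirical superquantile w.r.t. the loss
    vector: weight 1/((1 - p) n) on losses strictly above the p-quantile,
    leftover mass spread uniformly over ties at the quantile."""
    losses = np.asarray(losses, dtype=float)
    n = losses.size
    eta = np.quantile(losses, p)
    g = np.zeros(n)
    g[losses > eta] = 1.0 / ((1.0 - p) * n)
    ties = np.isclose(losses, eta)
    if ties.any():
        g[ties] = (1.0 - g.sum()) / ties.sum()  # make the weights sum to 1
    return g

rng = np.random.default_rng(0)
x = rng.standard_normal(1000)
print(superquantile(x, 0.9))                    # tail mean of the worst 10%
print(superquantile_subgradient(x, 0.9).sum())  # ~ 1.0
```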
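The smoothing step can be sketched in the same spirit. The snippet below replaces the hinge t ↦ max(t, 0) in the Rockafellar–Uryasev objective with a softplus, a generic smoothing chosen for brevity rather than the infimal-convolution construction the paper details; the point is only that the resulting surrogate is differentiable and can be handed to the gradient-based solvers the summary refers to.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def smoothed_superquantile(losses, p, mu=0.01):
    """Smooth surrogate for the p-superquantile: the hinge in the
    Rockafellar-Uryasev formula is replaced by a softplus (illustrative
    smoothing; the paper smooths by infimal convolution instead)."""
    losses = np.asarray(losses, dtype=float)

    def objective(eta):
        # mu * log(1 + exp((loss - eta) / mu)), computed stably
        softplus = mu * np.logaddexp(0.0, (losses - eta) / mu)
        return eta + softplus.mean() / (1.0 - p)

    lo, hi = losses.min() - 1.0, losses.max() + 1.0
    return minimize_scalar(objective, bounds=(lo, hi), method="bounded").fun

rng = np.random.default_rng(0)
x = rng.standard_normal(1000)
print(smoothed_superquantile(x, 0.9))  # close to the nonsmooth tail mean
```

Smaller values of mu tighten the approximation but make the surrogate stiffer, the usual accuracy/smoothness trade-off with such smoothed envelopes.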