Regression analysis: likelihood, error and entropy

Bibliographic Details
Published in: Mathematical Programming, Vol. 174, No. 1-2, pp. 145-166
Main Authors: Grechuk, Bogdan; Zabarankin, Michael
Format: Journal Article
Language: English
Published: Berlin/Heidelberg: Springer Berlin Heidelberg (Springer Nature B.V.), 01.03.2019

Summary: In a regression with independent and identically distributed normal residuals, the log-likelihood function yields an empirical form of the L_2-norm, whereas the normal distribution can be obtained as a solution of differential entropy maximization subject to a constraint on the L_2-norm of a random variable. The L_1-norm and the double-exponential (Laplace) distribution are related in a similar way. These are examples of an “inter-regenerative” relationship. In fact, the L_2-norm and the L_1-norm are just particular cases of general error measures introduced by Rockafellar et al. (Finance Stoch 10(1):51-74, 2006) on a space of random variables. General error measures are not necessarily symmetric with respect to ups and downs of a random variable, which is a desired property in finance applications where gains and losses should be treated differently. This work identifies the set of all error measures, denoted by E, and the set of all probability density functions (PDFs) that form “inter-regenerative” relationships (through log-likelihood and entropy maximization). It also shows that M-estimators, which arise in robust regression but, in general, are not error measures, form “inter-regenerative” relationships with all PDFs. In fact, the set of M-estimators which are error measures coincides with E. On the other hand, M-estimators are a particular case of L-estimators, which also arise in robust regression. A set of L-estimators which are error measures is identified; it contains E and the so-called trimmed L_p-norms.
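
The following display is a minimal sketch of the “inter-regenerative” pairing described in the summary, using only standard textbook facts rather than material from the paper itself: for a linear regression with i.i.d. N(0, σ²) residuals, maximizing the likelihood is equivalent to minimizing the empirical L_2-norm of the residuals, and the normal density in turn maximizes differential entropy under a second-moment constraint (the Laplace/L_1 pair is analogous).

```latex
% Log-likelihood of i.i.d. normal residuals reduces to the empirical L_2-norm:
-\log L(\beta,\sigma)
  = \frac{n}{2}\log(2\pi\sigma^2)
  + \frac{1}{2\sigma^2}\sum_{i=1}^{n}\bigl(y_i - x_i^{\top}\beta\bigr)^2,
\qquad
\hat\beta_{\mathrm{ML}} = \arg\min_{\beta}\sum_{i=1}^{n}\bigl(y_i - x_i^{\top}\beta\bigr)^2.

% Conversely, among densities f with \int x^2 f(x)\,dx \le \sigma^2,
% differential entropy h(f) = -\int f(x)\log f(x)\,dx is maximized by the normal density:
f^{*}(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-x^2/(2\sigma^2)}.

% Analogous pair: Laplace residuals give the empirical L_1-norm
% (least absolute deviations), and the Laplace density maximizes
% differential entropy subject to a constraint on E|X|.
```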
ISSN: 0025-5610, 1436-4646
DOI: 10.1007/s10107-018-1256-6