BETTER INPUT MODELING VIA MODEL AVERAGING
Published in: 2018 Winter Simulation Conference (WSC), pp. 1575-1586
Format: Conference Proceeding
Language: English
Publisher: IEEE, 01.12.2018
Summary: Rather than the standard practice of selecting a single "best-fit" distribution from a candidate set, frequentist model averaging (FMA) forms a mixture distribution that is a weighted average of the candidate distributions, with the weights tuned by cross-validation. In previous work we showed theoretically and empirically that FMA in the probability space leads to higher-fidelity input distributions. In this paper we show that FMA can also be implemented in the quantile space, leading to fits that emphasize tail behavior. We also describe an R package for FMA that is easy to use and available for download.
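The probability-space version of FMA described in the summary can be sketched as follows: fit each candidate distribution by maximum likelihood, then pick mixture weights on the simplex that maximize the cross-validated log-likelihood of held-out data. This is a minimal illustrative sketch in Python, not the paper's R implementation; the candidate families, the `floc=0` fitting simplification, and the grid search over weights are all assumptions made for the example.

```python
import numpy as np
from itertools import product
from scipy import stats

def fma_fit(data, candidates=(stats.gamma, stats.lognorm, stats.weibull_min),
            n_folds=5, grid_step=0.1, seed=0):
    """Mixture (weighted-average) fit with weights chosen by cross-validation."""
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(data)), n_folds)
    # Candidate densities on each held-out fold: the per-fold MLE fits do not
    # depend on the weights, so compute them once up front.
    fold_dens = []
    for k in range(n_folds):
        test = data[folds[k]]
        train = data[np.concatenate([f for j, f in enumerate(folds) if j != k])]
        # floc=0 pins the location parameter, a simplification for positive data.
        fold_dens.append(np.column_stack(
            [dist.pdf(test, *dist.fit(train, floc=0)) for dist in candidates]))
    # Enumerate weight vectors on the simplex with the given grid step.
    steps = int(round(1 / grid_step))
    grid = [np.array(w) / steps
            for w in product(range(steps + 1), repeat=len(candidates))
            if sum(w) == steps]
    def cv_loglik(w):
        # Held-out log-likelihood of the mixture density sum_i w_i * f_i(x).
        return sum(np.sum(np.log(np.maximum(d @ w, 1e-300))) for d in fold_dens)
    best_w = max(grid, key=cv_loglik)
    # Refit each candidate on all the data; the FMA density is the weighted
    # average of the fitted candidate densities.
    fits = [dist.fit(data, floc=0) for dist in candidates]
    def mixture_pdf(x):
        return sum(wi * dist.pdf(x, *p)
                   for wi, dist, p in zip(best_w, candidates, fits))
    return best_w, mixture_pdf

# Toy example: gamma-distributed service times. Note the averaged fit need
# not put all its weight on a single "best" family.
data = stats.gamma(a=2.0, scale=1.5).rvs(size=400, random_state=1)
weights, pdf = fma_fit(data)
```

The quantile-space variant mentioned in the summary would average fitted quantile functions rather than densities, which weights the candidates' tail shapes directly; its exact form is specified in the paper, not here.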
ISSN: 1558-4305
DOI: 10.1109/WSC.2018.8632239