BETTER INPUT MODELING VIA MODEL AVERAGING

Bibliographic Details
Published in: 2018 Winter Simulation Conference (WSC), pp. 1575–1586
Main Authors: Jiang, Wendy Xi; Nelson, Barry L.
Format: Conference Proceeding
Language: English
Published: IEEE, 01.12.2018
Summary: Rather than the standard practice of selecting a single "best-fit" distribution from a candidate set, frequentist model averaging (FMA) forms a mixture distribution that is a weighted average of the candidate distributions, with the weights tuned by cross-validation. In previous work we showed theoretically and empirically that FMA in the probability space leads to higher-fidelity input distributions. In this paper we show that FMA can also be implemented in the quantile space, leading to fits that emphasize tail behavior. We also describe an R package for FMA that is easy to use and available for download.
ISSN: 1558-4305
DOI: 10.1109/WSC.2018.8632239
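The idea summarized above can be sketched in a few lines of code. The following is a minimal illustration in Python (the paper's own package is in R), not the authors' implementation: it fits several candidate parametric families, tunes simplex weights by maximizing cross-validated held-out log-likelihood with a crude grid search, and then forms both the probability-space average of CDFs and the quantile-space average of inverse CDFs. The candidate families, fold count, and grid resolution are all illustrative choices, not taken from the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
data = rng.gamma(shape=2.0, scale=1.5, size=500)  # synthetic "real-world" input data

# Hypothetical candidate families for illustration.
families = [stats.expon, stats.gamma, stats.lognorm]

# K-fold cross-validation: fit each candidate on the training split and
# record its log-density at every held-out point.
K = 5
folds = np.array_split(rng.permutation(data), K)
heldout = []  # rows: held-out points, cols: candidates
for k in range(K):
    test = folds[k]
    train = np.concatenate([folds[j] for j in range(K) if j != k])
    cols = [fam.logpdf(test, *fam.fit(train)) for fam in families]
    heldout.append(np.column_stack(cols))
L = np.vstack(heldout)

# Tune simplex weights by maximizing held-out mixture log-likelihood.
# A coarse grid search stands in for the paper's weight-tuning step.
best_w, best_ll = None, -np.inf
grid = np.linspace(0.0, 1.0, 21)
for w1 in grid:
    for w2 in grid:
        if w1 + w2 > 1.0:
            continue
        w = np.array([w1, w2, 1.0 - w1 - w2])
        ll = np.log(np.exp(L) @ w + 1e-300).sum()
        if ll > best_ll:
            best_w, best_ll = w, ll

# Refit each candidate on all data; average in two spaces.
fits = [fam.fit(data) for fam in families]

def fma_cdf(x):
    # Probability-space FMA: F(x) = sum_m w_m F_m(x).
    return sum(w * fam.cdf(x, *p) for w, fam, p in zip(best_w, families, fits))

def fma_quantile(u):
    # Quantile-space FMA: Q(u) = sum_m w_m F_m^{-1}(u), emphasizing tails.
    return sum(w * fam.ppf(u, *p) for w, fam, p in zip(best_w, families, fits))

print("weights:", np.round(best_w, 3))
print("F(median):", round(float(fma_cdf(np.median(data))), 3))
print("Q(0.99):", round(float(fma_quantile(0.99)), 3))
```

Note that the two averages generally give different distributions: averaging CDFs mixes probabilities at a fixed value, while averaging inverse CDFs mixes values at a fixed probability, which is why the quantile-space version weights tail behavior differently.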