Convergence Rates of Finite-Difference Sensitivity Estimates for Stochastic Systems

Bibliographic Details
Published in: Operations Research, Vol. 41, No. 4, pp. 694-703
Main Authors: Zazanis, Michael A.; Suri, Rajan
Format: Journal Article
Language: English
Published: Linthicum, MD: INFORMS (Operations Research Society of America; Institute for Operations Research and the Management Sciences), 01.07.1993

Summary: A mean square error analysis of finite-difference sensitivity estimators for stochastic systems is presented, and an expression for the optimal size of the increment is derived. The asymptotic behavior of the optimal increments and of the corresponding optimal finite-difference (FD) estimators is investigated for finite-horizon experiments. Steady-state estimation is also considered for regenerative systems, and in this context a convergence analysis of ratio estimators is presented. The use of variance reduction techniques for these FD estimates, such as common random numbers in simulation experiments, is not considered here. In this case, direct gradient estimation techniques (such as perturbation analysis and likelihood ratio methods), whenever applicable, are shown to converge asymptotically faster than the optimal FD estimators.
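
To make the increment trade-off concrete, the following is a minimal sketch (not taken from the paper) of a forward finite-difference sensitivity estimate for a stochastic system, using independent replications at the two design points, i.e. without common random numbers, as in the setting described above. The example performance measure f(theta, W) = exp(-theta * W) with W ~ Exp(1) is a hypothetical stand-in chosen because its true sensitivity, d/dtheta E[exp(-theta W)] = -1/(1 + theta)^2, is available in closed form, so the empirical mean square error can be observed as the increment h varies: a large h inflates the bias, a small h inflates the variance.

import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n):
    """n independent replications of the performance measure f(theta, W)."""
    w = rng.exponential(1.0, size=n)
    return np.exp(-theta * w)

def fd_estimate(theta, h, n):
    """Forward finite-difference sensitivity estimate with increment h,
    using independent samples at theta and theta + h (no common random numbers)."""
    return (simulate(theta + h, n).mean() - simulate(theta, n).mean()) / h

theta, n, reps = 1.0, 10_000, 200
true_grad = -1.0 / (1.0 + theta) ** 2   # closed-form sensitivity for this toy system

for h in (1.0, 0.3, 0.1, 0.03, 0.01):
    est = np.array([fd_estimate(theta, h, n) for _ in range(reps)])
    mse = np.mean((est - true_grad) ** 2)   # empirical MSE = bias^2 + variance
    print(f"h = {h:5.2f}: empirical MSE = {mse:.2e}")

Running the sketch shows the MSE first falling and then rising again as h shrinks, which is the bias-variance trade-off whose optimal balance the paper analyzes; the specific system, sample sizes, and increments here are illustrative choices, not values from the article.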
ISSN: 0030-364X; 1526-5463
DOI: 10.1287/opre.41.4.694