Approximate Dynamic Programming for Selective Maintenance in Series-Parallel Systems
| Published in | IEEE Transactions on Reliability, Vol. 69, No. 3, pp. 1147-1164 |
|---|---|
| Main Authors | , |
| Format | Journal Article |
| Language | English |
| Published | New York: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.09.2020 |
Summary: The problem of allocating limited resources to maintain components of a multicomponent system, known as selective maintenance, is naturally formulated as a high-dimensional Markov decision process (MDP). Unfortunately, these problems are difficult to solve exactly for realistically sized systems. With this motivation, we contribute an approximate dynamic programming (ADP) algorithm for solving the selective maintenance problem for a series-parallel system with binary-state components. To the best of our knowledge, this paper describes the first application of ADP to maintain multicomponent systems. Our ADP is compared, using a numerical example from the literature, against exact solutions to the corresponding MDP. We then summarize the results of a more comprehensive set of experiments that demonstrate the ADP's favorable performance on larger instances in comparison to both the exact (but computationally intensive) MDP approach and the heuristic (but computationally faster) one-step-lookahead approach. Finally, we demonstrate that the ADP is capable of solving an extension of the basic selective maintenance problem in which maintenance resources are permitted to be shared across stages.

ISSN: 0018-9529, 1558-1721
DOI: 10.1109/TR.2019.2916898
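
The abstract frames selective maintenance as choosing which failed components of a binary-state series-parallel system to repair, under a limited budget, so as to maximize reliability over the next mission, and it cites a one-step-lookahead heuristic as a baseline for the ADP. The Python sketch below illustrates only that problem setting and a greedy one-step-lookahead baseline, not the paper's ADP algorithm; the system structure, component reliabilities, repair costs, and budget are hypothetical values chosen for illustration.

```python
# Minimal sketch (not the authors' ADP algorithm): a one-step-lookahead
# selective-maintenance rule for a series-parallel system with binary-state
# components. All reliabilities, repair costs, and the budget are hypothetical.
from itertools import combinations

# System structure: subsystems in series; each subsystem lists its parallel
# components by index. Component state is 1 (working) or 0 (failed).
structure = [[0, 1], [2, 3, 4], [5]]
reliability = [0.9, 0.8, 0.85, 0.7, 0.95, 0.9]   # P(survive next mission) if working
repair_cost = [2, 3, 1, 2, 4, 3]                 # cost to repair a failed component
budget = 5                                       # maintenance budget for this break

def mission_reliability(state):
    """Series-parallel reliability of the next mission given binary states."""
    prob = 1.0
    for subsystem in structure:
        # A subsystem fails only if every one of its components fails;
        # a component that is already failed fails with probability 1.
        fail = 1.0
        for c in subsystem:
            fail *= 1.0 - (reliability[c] if state[c] == 1 else 0.0)
        prob *= 1.0 - fail
    return prob

def one_step_lookahead(state):
    """Enumerate budget-feasible repair sets and pick the one maximizing
    next-mission reliability (a single-stage heuristic, not a multi-stage
    ADP policy)."""
    failed = [c for c in range(len(state)) if state[c] == 0]
    best_set, best_value = (), mission_reliability(state)
    for r in range(1, len(failed) + 1):
        for subset in combinations(failed, r):
            if sum(repair_cost[c] for c in subset) > budget:
                continue
            repaired = list(state)
            for c in subset:
                repaired[c] = 1
            value = mission_reliability(repaired)
            if value > best_value:
                best_set, best_value = subset, value
    return best_set, best_value

if __name__ == "__main__":
    current_state = [1, 0, 0, 1, 0, 1]  # components 1, 2, and 4 have failed
    repairs, value = one_step_lookahead(current_state)
    print(f"Repair components {repairs}; next-mission reliability = {value:.3f}")
```

An ADP approach along the lines described in the abstract would replace the single-stage objective inside `one_step_lookahead` with an approximation of the value of the post-maintenance state over the remaining stages, which is what allows it to outperform the myopic heuristic on multi-stage instances.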