Bridging POMDPs and Bayesian decision making for robust maintenance planning under model uncertainty: An application to railway systems

Cited by: 11
Authors
Arcieri, Giacomo [1 ]
Hoelzl, Cyprien [1 ]
Schwery, Oliver [2 ]
Straub, Daniel [3 ]
Papakonstantinou, Konstantinos G. [4 ]
Chatzi, Eleni [1 ]
Affiliations
[1] Swiss Fed Inst Technol, Inst Struct Engn, CH-8093 Zurich, Switzerland
[2] Swiss Fed Railways SBB, CH-3000 Bern, Switzerland
[3] Tech Univ Munich, Engn Risk Anal Grp, D-80333 Munich, Germany
[4] Penn State Univ, Dept Civil & Environm Engn, University Pk, PA 16802 USA
Keywords
Partially observable Markov decision processes; Bayesian inference; Optimal maintenance planning; Model uncertainty; Hidden Markov models; Dynamic programming; Structural inspection; Markov; Policies
DOI
10.1016/j.ress.2023.109496
Chinese Library Classification (CLC) code
T [Industrial Technology];
Subject classification code
08;
Abstract
Structural Health Monitoring (SHM) describes a process for inferring quantifiable metrics of structural condition, which can serve as input to support decisions on the operation and maintenance of infrastructure assets. Given the long lifespan of critical structures, this problem can be cast as a sequential decision making problem over prescribed horizons. Partially Observable Markov Decision Processes (POMDPs) offer a formal framework to solve the underlying optimal planning task. However, two issues can undermine POMDP solutions: first, the need for a model that adequately describes the evolution of the structural condition under deterioration or corrective actions; and second, the non-trivial task of recovering the observation process parameters from available monitoring data. Despite these potential challenges, the adopted POMDP models do not typically account for uncertainty in model parameters, leading to solutions that can be unrealistically confident. In this work, we address both key issues. We present a framework to estimate POMDP transition and observation model parameters directly from available data, via Markov Chain Monte Carlo (MCMC) sampling of a Hidden Markov Model (HMM) conditioned on actions. The MCMC inference yields distributions of the involved model parameters. We then form and solve the POMDP problem by exploiting the inferred distributions, to derive solutions that are robust to model uncertainty. We successfully apply our approach to maintenance planning for railway track assets on the basis of a "fractal value" indicator, which is computed from actual railway monitoring data.
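To make the inference step described above concrete, the following minimal Python sketch illustrates the general idea of estimating a parameter of an action-conditioned HMM via MCMC. Everything specific here is an assumption introduced for illustration only: the three condition states, the two actions, the Gaussian emission model, the single unknown deterioration probability, the flat prior, and the synthetic data are hypothetical and do not reproduce the paper's model, which infers full transition and observation parameters from railway "fractal value" monitoring data.

import numpy as np

rng = np.random.default_rng(0)

N_STATES = 3  # hypothetical condition states: good / worn / critical

def transition_matrices(p):
    """Return one transition matrix per action; only the deterioration
    probability p under 'do nothing' is treated as unknown here."""
    do_nothing = np.array([[1 - p, p,     0.0],
                           [0.0,   1 - p, p  ],
                           [0.0,   0.0,   1.0]])
    repair = np.array([[1.0, 0.0, 0.0],   # repair assumed to restore state 0
                       [1.0, 0.0, 0.0],
                       [1.0, 0.0, 0.0]])
    return np.stack([do_nothing, repair])

OBS_MEANS = np.array([1.0, 2.0, 3.5])  # assumed Gaussian emission means per state
OBS_SIGMA = 0.5                        # assumed known observation noise

def forward_loglik(obs, actions, p):
    """Forward-algorithm log marginal likelihood of the action-conditioned HMM,
    marginalising over the hidden condition states."""
    trans = transition_matrices(p)
    alpha = np.full(N_STATES, 1.0 / N_STATES)  # uniform initial state distribution
    loglik = 0.0
    for t, y in enumerate(obs):
        if t > 0:
            alpha = alpha @ trans[actions[t - 1]]  # transition under the applied action
        emis = np.exp(-0.5 * ((y - OBS_MEANS) / OBS_SIGMA) ** 2) / (OBS_SIGMA * np.sqrt(2 * np.pi))
        alpha = alpha * emis
        c = alpha.sum()
        loglik += np.log(c + 1e-300)
        alpha = alpha / (c + 1e-300)  # rescale to avoid underflow
    return loglik

def metropolis(obs, actions, n_iter=5000, step=0.05):
    """Random-walk Metropolis over p with a flat prior on (0, 1)."""
    p = 0.2
    ll = forward_loglik(obs, actions, p)
    samples = []
    for _ in range(n_iter):
        prop = p + step * rng.normal()
        if 0.0 < prop < 1.0:               # proposals outside the prior support are rejected
            ll_prop = forward_loglik(obs, actions, prop)
            if np.log(rng.uniform()) < ll_prop - ll:
                p, ll = prop, ll_prop
        samples.append(p)
    return np.array(samples)

# Synthetic usage example with made-up observations and actions
# (actions[t] is the action taken between observations t and t+1):
obs = np.array([1.1, 1.0, 2.1, 2.0, 3.4, 1.0, 0.9, 2.2])
actions = np.array([0, 0, 0, 0, 1, 0, 0])
posterior_p = metropolis(obs, actions)
print(posterior_p[1000:].mean(), posterior_p[1000:].std())  # posterior summary after burn-in

The resulting posterior samples of the transition parameter, rather than a single point estimate, would then be carried into the planning stage, which is how the paper's robust POMDP solution accounts for model uncertainty.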
Pages: 17