Planning treatment of ischemic heart disease with partially observable Markov decision processes

Cited by: 90
Authors
Hauskrecht, M
Fraser, H
Affiliations
[1] Brown Univ, Dept Comp Sci, Providence, RI 02912 USA
[2] Tufts Univ, New England Med Ctr, Boston, MA 02111 USA
Keywords
dynamic decision making; partially observable Markov decision process; medical therapy planning; ischemic heart disease
DOI
10.1016/S0933-3657(99)00042-1
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Diagnosis of a disease and its treatment are not separate, one-shot activities. Instead, they are very often interdependent and interleaved over time. This is due mostly to uncertainty about the underlying disease, uncertainty about the patient's response to treatment, and the varying costs of different diagnostic (investigative) and treatment procedures. The framework of partially observable Markov decision processes (POMDPs), developed and used in the operations research, control theory, and artificial intelligence communities, is particularly suitable for modeling such a complex decision process. In this paper, we show how the POMDP framework can be used to model and solve the problem of managing patients with ischemic heart disease (IHD), and we demonstrate the modeling advantages of the framework over standard decision formalisms. (C) 2000 Elsevier Science B.V. All rights reserved.
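The central computation in a POMDP model of this kind is the belief-state update: after each action (e.g. a treatment or an investigative test) and each observation (e.g. a test result), the probability distribution over hidden disease states is revised by Bayes' rule. The following is a minimal illustrative sketch of that update in Python/NumPy; the state set, action, and all probabilities are hypothetical placeholders and are not the model parameters used by Hauskrecht and Fraser.

```python
# Minimal POMDP belief-update sketch (illustrative only; hypothetical numbers).
import numpy as np

# Hypothetical hidden disease states.
states = ["no_ihd", "mild_ischemia", "severe_ischemia"]

# Transition model T[s, s'] = P(s' | s, a) for one hypothetical action ("medication").
T = np.array([
    [0.95, 0.04, 0.01],
    [0.20, 0.70, 0.10],
    [0.05, 0.30, 0.65],
])

# Observation model O[s', o] = P(o | s') for a hypothetical stress-test result
# with outcomes: 0 = negative, 1 = positive.
O = np.array([
    [0.90, 0.10],   # no_ihd
    [0.40, 0.60],   # mild_ischemia
    [0.10, 0.90],   # severe_ischemia
])

def belief_update(belief, T, O, obs_index):
    """Bayesian belief update after acting and observing obs_index."""
    predicted = T.T @ belief                     # predict next-state distribution
    unnormalized = O[:, obs_index] * predicted   # weight by observation likelihood
    return unnormalized / unnormalized.sum()     # renormalize to a distribution

# Start from a uniform belief and update on a positive stress test.
b0 = np.ones(len(states)) / len(states)
b1 = belief_update(b0, T, O, obs_index=1)
print(dict(zip(states, np.round(b1, 3))))
```

A treatment policy then maps such belief states, rather than a single assumed diagnosis, to actions, which is what lets the model interleave investigation and therapy over time.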
Pages: 221-244 (24 pages)