Planning for Target System Striking Based on Markov Decision Process

Cited: 0
Authors
Lei Ting [1]
Zhu Cheng [1]
Zhang Weiming [1]
Institutions
[1] Natl Univ Def Technol, Sci & Technol Informat Syst Engn Lab, Changsha, Hunan, Peoples R China
Source
2013 IEEE INTERNATIONAL CONFERENCE ON SERVICE OPERATIONS AND LOGISTICS, AND INFORMATICS (SOLI) | 2013
Keywords
Target System; Markov Decision Process; Recovering Mechanism
DOI
Not available
Chinese Library Classification (CLC)
TP39 [Computer Applications]
Subject Classification Codes
081203; 0835
Abstract
Planning target selection and striking is an important part of the military decision process. To optimize the striking process against a military target system, in which targets are interrelated and each target can recover from damage, this paper models the structure and the recovery mechanism of the target system. To generate a multi-phase striking plan that can destroy the target system efficiently, a planning method based on the Markov Decision Process is proposed, and a heuristic is computed to reduce the search space of the problem. The efficiency of the model is demonstrated by a case study, which shows that commanders can improve their decision efficiency with this method.
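The abstract describes modeling a target system whose targets can recover from damage and planning multi-phase strikes with a Markov Decision Process. The Python sketch below is only a hedged illustration of that general idea under simplifying assumptions (independent targets, one strike per phase, hypothetical N_TARGETS, HORIZON, P_KILL, and P_RECOVER values, plain backward-induction value iteration); it does not reproduce the paper's model, its inter-target relationships, or its search-space-reducing heuristic.

```python
# Minimal illustrative sketch, not the paper's model: a finite-horizon MDP for
# multi-phase strike planning against a small target system whose targets can
# recover from damage. All constants below are hypothetical assumptions.
from itertools import product

N_TARGETS = 3     # assumed number of targets in the system
HORIZON = 4       # assumed number of striking phases
P_KILL = 0.7      # assumed probability that a strike damages its target
P_RECOVER = 0.2   # assumed per-phase probability that a damaged target recovers

# A state is a tuple of flags, one per target: 1 = operational, 0 = damaged.
STATES = list(product((0, 1), repeat=N_TARGETS))
ACTIONS = range(N_TARGETS)  # strike exactly one target per phase


def transitions(state, action):
    """Enumerate (next_state, probability) pairs under the assumed dynamics."""
    outcomes = [(state, 1.0)]
    for idx in range(N_TARGETS):
        expanded = []
        for s, p in outcomes:
            if idx == action and s[idx] == 1:
                # The struck target is damaged with probability P_KILL.
                damaged = s[:idx] + (0,) + s[idx + 1:]
                expanded.append((damaged, p * P_KILL))
                expanded.append((s, p * (1.0 - P_KILL)))
            elif s[idx] == 0:
                # A damaged target recovers independently with probability P_RECOVER.
                recovered = s[:idx] + (1,) + s[idx + 1:]
                expanded.append((recovered, p * P_RECOVER))
                expanded.append((s, p * (1.0 - P_RECOVER)))
            else:
                # An operational, unstruck target stays operational.
                expanded.append((s, p))
        outcomes = expanded
    return outcomes


def reward(state):
    """Per-phase reward: number of targets currently damaged."""
    return state.count(0)


# Backward induction (finite-horizon value iteration) yields a phase-by-phase policy.
values = {s: 0.0 for s in STATES}
policy = {}
for phase in reversed(range(HORIZON)):
    updated = {}
    for s in STATES:
        best_action, best_value = None, float("-inf")
        for a in ACTIONS:
            v = sum(p * (reward(ns) + values[ns]) for ns, p in transitions(s, a))
            if v > best_value:
                best_action, best_value = a, v
        updated[s] = best_value
        policy[(phase, s)] = best_action
    values = updated

all_operational = (1,) * N_TARGETS
print("Phase-0 strike from the all-operational state: target", policy[(0, all_operational)])
```

Exhaustive enumeration of the state space, as above, grows as 2^N in the number of targets, which is why a heuristic to prune the search space, as the abstract mentions, matters for larger target systems.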
Pages: 154-159
Page count: 6
Related Papers
50 items in total
  • [31] A Markov Decision Process-Based Opportunistic Spectral Access
    Arunthavanathan, Senthuran
    Kandeepan, Sithamparanathan
Evans, Robin J.
    IEEE WIRELESS COMMUNICATIONS LETTERS, 2016, 5 (05) : 544 - 547
  • [32] Optimal Replacement Policy of Services Based on Markov Decision Process
    Pillai, Sandhya S.
    Narendra, Nanjangud C.
    2009 IEEE INTERNATIONAL CONFERENCE ON SERVICES COMPUTING, 2009, : 176 - +
  • [33] Multimodality treatment planning using the Markov decision process: a comprehensive study of applications and challenges
Singh, P.
Singh, S.
Mishra, A.
Mishra, S. K.
    Research on Biomedical Engineering, 2024, 40 (02) : 435 - 450
  • [34] Quality Control for Express Items Based on Markov Decision Process
    Han, Xu
    Li, Yisong
    2016 INTERNATIONAL CONFERENCE ON LOGISTICS, INFORMATICS AND SERVICE SCIENCES (LISS' 2016), 2016,
  • [35] An Integrated Inventory and Workforce Planning Markov Decision Process Model with a Variable Production Rate
    AlDurgam, Mohammad M.
    IFAC PAPERSONLINE, 2019, 52 (13): : 2792 - 2797
  • [36] A Markov Decision Process Approach to Dynamic Power Management in a Cluster System
    Okamura, Hiroyuki
    Miyata, Satoshi
    Dohi, Tadashi
    IEEE ACCESS, 2015, 3 : 3039 - 3047
  • [37] A Real-Time Path Planning Algorithm Based on the Markov Decision Process in a Dynamic Environment for Wheeled Mobile Robots
    Chen, Yu-Ju
    Jhong, Bing-Gang
    Chen, Mei-Yung
    ACTUATORS, 2023, 12 (04)
  • [38] Service Migration Algorithm Based on Markov Decision Process with Multiple Service Types and Multiple System Factors
    Ma, Anhua
    Pan, Su
    Zhou, Weiwei
    CHINESE JOURNAL OF ELECTRONICS, 2024, 33 (06) : 1515 - 1525
  • [39] A Finite Horizon Markov Decision Process Based Reinforcement Learning Control of a Rapid Thermal Processing system
    Pradeep, D. John
    Noel, Mathew Mithra
    JOURNAL OF PROCESS CONTROL, 2018, 68 : 218 - 225
  • [40] Markov Decision Process-Based Collision Avoidance Method for Multirotor Small Unmanned Aircraft System
    Sato, Gaku
    Yokoi, Hiroshi
    Toratani, Daichi
    Koga, Tadashi
    2022 61ST ANNUAL CONFERENCE OF THE SOCIETY OF INSTRUMENT AND CONTROL ENGINEERS (SICE), 2022, : 1329 - 1334