Speed and convergence properties of gradient algorithms for optimization of IMRT

Cited by: 43
Authors
Zhang, XD
Liu, H
Wang, XC
Dong, L
Wu, QW
Mohan, R
Affiliations
[1] Univ Texas, MD Anderson Canc Ctr, Dept Radiat Phys, Houston, TX 77030 USA
[2] William Beaumont Hosp, Dept Radiat Oncol, Royal Oak, MI 48073 USA
Keywords
IMRT; optimization; multi-objective; local minima; gradient;
DOI
10.1118/1.1688214
Chinese Library Classification
R8 [Special Medicine]; R445 [Diagnostic Imaging]
Subject Classification Code
1002; 100207; 1009
Abstract
Gradient algorithms are the most commonly employed search methods in the routine optimization of IMRT plans. It is well known that local minima can exist for dose-volume-based and biology-based objective functions. The purpose of this paper is to compare the relative speed of different gradient algorithms, to investigate strategies for accelerating the optimization process, to assess the validity of these strategies, and to study the convergence properties of these algorithms for dose-volume and biological objective functions. With these aims in mind, we implemented Newton's, conjugate gradient (CG), and steepest descent (SD) algorithms for dose-volume- and EUD-based objective functions. Our implementation of Newton's algorithm approximates the second derivative matrix (Hessian) by its diagonal. The standard SD algorithm and the CG algorithm with "line minimization" were also implemented. In addition, we investigated the use of a variation of the CG algorithm, called the "scaled conjugate gradient" (SCG) algorithm. To accelerate the optimization process, we investigated the validity of a "hybrid optimization" strategy, in which approximations to calculated dose distributions are used during most of the iterations. Published studies have indicated that getting trapped in local minima is not a significant problem. To investigate this issue further, we first obtained, by trial and error and starting with uniform intensity distributions, the parameters of the dose-volume- or EUD-based objective functions that produced IMRT plans satisfying the clinical requirements. Using the resulting optimized intensity distributions as the initial guess, we investigated the possibility of getting trapped in a local minimum. For most of the results presented, we used a lung cancer case. To illustrate the generality of our methods, results for a prostate case are also presented. For both dose-volume- and EUD-based objective functions, Newton's method far outperforms the other algorithms in terms of speed. The SCG algorithm, which avoids expensive "line minimization," can speed up the standard CG algorithm by at least a factor of 2. For the same initial conditions, all algorithms converge essentially to the same plan. However, we demonstrate that for any of the algorithms studied, starting with previously optimized intensity distributions as the initial guess but with different objective function parameters, the solution frequently gets trapped in local minima. We found that an initial intensity distribution obtained from IMRT optimization using objective function parameters that favor a specific anatomic structure leads to a local minimum corresponding to that structure. Our results indicate that, among the gradient algorithms tested, Newton's method is the fastest by far. Different gradient algorithms have the same convergence properties for dose-volume- and EUD-based objective functions. The hybrid dose calculation strategy is valid and can significantly accelerate the optimization process. The degree of acceleration achieved depends on the type of optimization problem being addressed (e.g., IMRT optimization, intensity-modulated beam configuration optimization, or objective function parameter optimization). Under special conditions, gradient algorithms will get trapped in local minima, and reoptimization, starting with the results of a previous optimization, will lead to solutions that are generally not significantly different from the local minimum.
(C) 2004 American Association of Physicists in Medicine.
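To make the diagonal-Hessian Newton update described in the abstract concrete, the following is a minimal illustrative sketch in Python, not the authors' code: it applies one Newton-type step to a simple quadratic dose objective, with a hypothetical dose-deposition matrix D, beamlet-intensity vector w, and prescribed dose d_presc as placeholders. The paper's dose-volume and EUD objectives would replace the quadratic term, but the diagonal approximation of the Hessian works the same way.

import numpy as np

# Illustrative sketch only (assumed setup, not the paper's implementation).
# Objective: f(w) = 0.5 * || D @ w - d_presc ||^2, with D of shape
# (voxels x beamlets) and w the non-negative beamlet intensities.
# The full Hessian D^T D is approximated by its diagonal, as the paper
# describes for its Newton method.

def diagonal_newton_step(D, w, d_presc, step=1.0, eps=1e-12):
    residual = D @ w - d_presc
    grad = D.T @ residual                # gradient of the quadratic objective
    hess_diag = np.sum(D * D, axis=0)    # diagonal of D^T D, cheap to compute
    w_new = w - step * grad / (hess_diag + eps)
    return np.maximum(w_new, 0.0)        # keep beamlet intensities non-negative

# Toy usage with random data (hypothetical sizes and prescription):
rng = np.random.default_rng(0)
D = rng.random((200, 50))                # 200 voxels, 50 beamlets
d_presc = np.full(200, 60.0)             # uniform 60 Gy prescription (toy value)
w = np.ones(50)
for _ in range(20):
    w = diagonal_newton_step(D, w, d_presc)

Approximating the Hessian by its diagonal avoids forming and inverting the full beamlets-by-beamlets matrix, so each iteration costs about as much as a gradient step while still scaling each beamlet update by its own curvature, which is consistent with the speed advantage the abstract reports for Newton's method.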
Pages: 1141-1152
Page count: 12
Related Papers
50 records in total
  • [11] Linear Convergence of Asynchronous Gradient Push Algorithm for Distributed Optimization
    Li, Huaqing
    Cheng, Huqiang
    Lu, Qingguo
    Wang, Zheng
    Huang, Tingwen
    IEEE TRANSACTIONS ON SYSTEMS MAN CYBERNETICS-SYSTEMS, 2025, 55 (03): : 2147 - 2159
  • [12] Simultaneous optimization of sequential IMRT plans
    Popple, RA
    Prellop, PB
    Spencer, SA
    De Los Santos, JF
    Duan, J
    Fiveash, JB
    Brezovich, IA
    MEDICAL PHYSICS, 2005, 32 (11) : 3257 - 3266
  • [13] Convergence Rates of Gradient Descent and MM Algorithms for Bradley-Terry Models
    Vojnovic, Milan
    Yun, Se-Young
    Zhou, Kaifang
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 108, 2020, 108
  • [14] Convergence Analysis of Distributed Gradient Descent Algorithms With One and Two Momentum Terms
    Liu, Bing
    Chai, Li
    Yi, Jingwen
    IEEE TRANSACTIONS ON CYBERNETICS, 2024, 54 (03) : 1511 - 1522
  • [15] An Improved NSGA-II Algorithm for the Optimization of IMRT Inverse Planning
    Li Guoli
    Lin Lin
    Li Zhizhong
    PROCEEDINGS OF THE 2009 2ND INTERNATIONAL CONFERENCE ON BIOMEDICAL ENGINEERING AND INFORMATICS, VOLS 1-4, 2009, : 936 - 938
  • [16] Convergence and Privacy of Decentralized Nonconvex Optimization With Gradient Clipping and Communication Compression
    Li, Boyue
    Chi, Yuejie
    IEEE JOURNAL OF SELECTED TOPICS IN SIGNAL PROCESSING, 2025, 19 (01) : 273 - 282
  • [17] A New Modification of Conjugate Gradient Method with Global Convergence Properties
    Dawahdeh, Mahmoud
    Mamat, Mustafa
    Alhawarat, Ahmad
    Rivaie, Mohd
    PROCEEDINGS OF THE INTERNATIONAL CONFERENCE ON MATHEMATICS, ENGINEERING AND INDUSTRIAL APPLICATIONS 2018 (ICOMEIA 2018), 2018, 2013
  • [18] Hybrid Riemannian conjugate gradient methods with global convergence properties
    Sakai, Hiroyuki
    Iiduka, Hideaki
    COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2020, 77 (03) : 811 - 830
  • [19] Convergence Properties of Monotone and Nonmonotone Proximal Gradient Methods Revisited
    Kanzow, Christian
    Mehlitz, Patrick
    JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS, 2022, 195 (02) : 624 - 646
  • [20] Convergence Properties of Genetic Algorithms in a Wide Variety of Noisy Environments
    Nakama, Takehiko
    CMC-COMPUTERS MATERIALS & CONTINUA, 2009, 14 (01): : 35 - 60