Two "well-known" properties of subgradient optimization

Cited by: 44
Authors
Anstreicher, Kurt M. [1 ]
Wolsey, Laurence A. [2 ]
Affiliations
[1] Univ Iowa, Dept Management Sci, Iowa City, IA 52242 USA
[2] Univ Catholique Louvain, Ctr Operat Res & Econometr, B-1348 Louvain, Belgium
Keywords
Subgradient optimization; Divergent series; Lagrangian relaxation; Primal recovery; Volume algorithm; Primal solutions; Linear programs; Convergence
DOI
10.1007/s10107-007-0148-y
CLC Classification
TP31 [Computer Software];
Subject Classification
081202; 0835;
Abstract
The subgradient method is both a heavily employed and widely studied algorithm for non-differentiable optimization. Nevertheless, there are some basic properties of subgradient optimization that, while "well known" to specialists, seem to be rather poorly known in the larger optimization community. This note concerns two such properties, both applicable to subgradient optimization using the divergent series steplength rule. The first involves convergence of the iterative process, and the second deals with the construction of primal estimates when subgradient optimization is applied to maximize the Lagrangian dual of a linear program. The two topics are related in that convergence of the iterates is required to prove correctness of the primal construction scheme.
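The abstract's two ingredients can be illustrated concretely. Below is a minimal Python sketch, not taken from the paper: it maximizes the Lagrangian dual of a small equality-constrained LP (min c^T x s.t. Ax = b, 0 <= x <= u) using the divergent series steplength rule t_k = 1/k, and builds a primal estimate as a steplength-weighted average of the Lagrangian subproblem solutions. The problem data and names such as solve_subproblem are illustrative assumptions, and the weighted-average recovery is one standard scheme of this kind rather than a reproduction of the paper's exact construction.

```python
import numpy as np

# Illustrative LP data (hypothetical): min c^T x  s.t.  A x = b,  0 <= x <= u.
rng = np.random.default_rng(0)
m, n = 3, 6
A = rng.standard_normal((m, n))
x_feas = rng.uniform(0.2, 0.8, size=n)  # ensures the LP is feasible
b = A @ x_feas
c = rng.standard_normal(n)
u = np.ones(n)

def solve_subproblem(lam):
    # Lagrangian subproblem: min (c - A^T lam)^T x over the box 0 <= x <= u.
    # The minimum is attained componentwise at a bound of the box.
    reduced_cost = c - A.T @ lam
    return np.where(reduced_cost < 0, u, 0.0)

lam = np.zeros(m)        # dual multipliers (unrestricted, since Ax = b)
x_bar = np.zeros(n)      # running primal estimate
step_sum = 0.0
for k in range(1, 50001):
    x_k = solve_subproblem(lam)
    g = b - A @ x_k                  # subgradient of the dual function at lam
    t = 1.0 / k                      # divergent series rule: t_k -> 0, sum t_k = inf
    lam = lam + t * g                # subgradient ascent step on the dual
    step_sum += t
    x_bar += (t / step_sum) * (x_k - x_bar)  # steplength-weighted average

dual_value = lam @ b + np.minimum(c - A.T @ lam, 0.0) @ u
print("dual value:", dual_value)
print("infeasibility of primal average:", np.linalg.norm(A @ x_bar - b))
```

The individual subproblem solutions x_k jump between vertices of the box and are typically infeasible for Ax = b; it is the averaged iterate x_bar whose infeasibility shrinks as the dual iterates converge, which is the link between the two properties the abstract describes.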
Pages: 213-220
Page count: 8