2ND DERIVATIVES FOR OPTIMIZING EIGENVALUES OF SYMMETRIC MATRICES

Cited: 61
Authors
OVERTON, ML [1 ]
WOMERSLEY, RS [1 ]
Affiliations
[1] UNIV NEW S WALES,SCH MATH,KENSINGTON,NSW 2033,AUSTRALIA
Keywords
NONSMOOTH OPTIMIZATION; MULTIPLE EIGENVALUES
DOI
10.1137/S089547989324598X
Chinese Library Classification
O29 [Applied Mathematics]
Discipline code
070104
Abstract
Let A denote an n x n real symmetric matrix-valued function depending on a vector of real parameters, x ∈ R^m. Assume that A is a twice continuously differentiable function of x, with the second derivative satisfying a Lipschitz condition. Consider the following optimization problem: minimize the largest eigenvalue of A(x). Let x* denote a minimum. Typically, the maximum eigenvalue of A(x*) is multiple, so the objective function is not differentiable at x*, and straightforward application of Newton's method is not possible. Nonetheless, the formulation of a method with local quadratic convergence is possible. The main idea is to minimize the maximum eigenvalue subject to a constraint that this eigenvalue has a certain multiplicity. The manifold Omega of matrices with such multiple eigenvalues is parameterized using a matrix exponential representation, leading to the definition of an appropriate Lagrangian function. Consideration of the Hessian of this Lagrangian function leads to the second derivative matrix used by Newton's method. The convergence proof is nonstandard because the parameterization of Omega is explicitly known only in the limit. In the special case of multiplicity one, the maximum eigenvalue is a smooth function and the method reduces to a standard Newton iteration.
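As a small illustration of the multiplicity-one case mentioned at the end of the abstract: when the largest eigenvalue of A(x) is simple, f(x) = lambda_max(A(x)) is smooth, and its partial derivatives are given by the standard first-order perturbation formula q^T (dA/dx_i) q, where q is the corresponding unit eigenvector. The sketch below checks this formula numerically for a hypothetical affine family A(x) = A0 + x_0 B0 + x_1 B1; the matrices A0, B0, B1 are illustrative, not from the paper, and this is not the authors' Newton method, only the smooth-case gradient it reduces to.

```python
import numpy as np

# Hypothetical affine symmetric family A(x) = A0 + x[0]*B0 + x[1]*B1.
# For a simple (multiplicity-one) largest eigenvalue, f(x) = lambda_max(A(x))
# is differentiable with grad_i f = q^T (dA/dx_i) q, q the unit eigenvector.

rng = np.random.default_rng(0)

def sym(M):
    # Symmetrize a square matrix.
    return (M + M.T) / 2

A0, B0, B1 = (sym(rng.standard_normal((4, 4))) for _ in range(3))

def A(x):
    return A0 + x[0] * B0 + x[1] * B1

def lambda_max_and_grad(x):
    # eigh returns eigenvalues in ascending order; take the largest pair.
    eigvals, eigvecs = np.linalg.eigh(A(x))
    lam, q = eigvals[-1], eigvecs[:, -1]
    # First-order perturbation formula for a simple eigenvalue.
    grad = np.array([q @ B0 @ q, q @ B1 @ q])
    return lam, grad
```

A forward finite difference on f(x) reproduces this gradient to high accuracy at a generic point, since the largest eigenvalue of a random symmetric matrix is almost surely simple.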
Pages: 697-718
Page count: 22