Comparison of advanced large-scale minimization algorithms for the solution of inverse ill-posed problems

Cited by: 32
Authors
Alekseev, A. K. [2 ]
Navon, I. M. [1 ]
Steward, J. L. [1 ]
Affiliations
[1] Florida State Univ, Dept Comp Sci, Tallahassee, FL 32306 USA
[2] ENERGIA, RSC, Dept Aerodynam & Heat Transfer, Kaliningrad, Moscow Region, Russia
Keywords
large-scale minimization methods; inverse problems; adjoint parameter estimation; ill-posed problems; VARIATIONAL DATA ASSIMILATION; CONJUGATE-GRADIENT METHOD; QUASI-NEWTON METHODS; UNCONSTRAINED OPTIMIZATION; GUARANTEED DESCENT;
DOI
10.1080/10556780802370746
Chinese Library Classification
TP31 [Computer software];
Discipline codes
081202 ; 0835 ;
Abstract
We compare the performance of several robust large-scale minimization algorithms for the unconstrained minimization of an ill-posed inverse problem. The parabolized Navier-Stokes equation model was used for adjoint parameter estimation. The methods compared consist of three versions of the nonlinear conjugate-gradient (CG) method, the quasi-Newton Broyden-Fletcher-Goldfarb-Shanno (BFGS) method, the limited-memory quasi-Newton (L-BFGS) method [D.C. Liu and J. Nocedal, On the limited memory BFGS method for large scale minimization, Math. Program. 45 (1989), pp. 503-528], the truncated Newton (T-N) method [S.G. Nash, Preconditioning of truncated Newton methods, SIAM J. Sci. Stat. Comput. 6 (1985), pp. 599-616; S.G. Nash, Newton-type minimization via the Lanczos method, SIAM J. Numer. Anal. 21 (1984), pp. 770-788] and a new hybrid algorithm proposed by Morales and Nocedal [J.L. Morales and J. Nocedal, Enriched methods for large-scale unconstrained optimization, Comput. Optim. Appl. 21 (2002), pp. 143-154]. For all the methods employed and tested, the gradient of the cost function is obtained via an adjoint method. A detailed description of the algorithmic form of the minimization algorithms employed in the comparison is provided. For the inviscid case, the CG-descent method of Hager [W.W. Hager and H. Zhang, A new conjugate gradient method with guaranteed descent and efficient line search, SIAM J. Optim. 16 (1) (2005), pp. 170-192] performed the best, followed closely by the hybrid method [J.L. Morales and J. Nocedal, Enriched methods for large-scale unconstrained optimization, Comput. Optim. Appl. 21 (2002), pp. 143-154], while in the viscous case, the hybrid method emerged as the best performer, followed by CG [D.F. Shanno and K.H. Phua, Remark on algorithm 500: Minimization of unconstrained multivariate functions, ACM Trans. Math. Softw. 6 (1980), pp. 618-622] and CG-descent [W.W. Hager and H. Zhang, A new conjugate gradient method with guaranteed descent and efficient line search, SIAM J. Optim. 16 (1) (2005), pp. 170-192]. This required an adequate choice of parameters in the CG-descent method as well as controlling the number of L-BFGS and T-N iterations to be interlaced in the hybrid method.
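As an illustrative sketch (not the authors' code), the L-BFGS method cited in the abstract can be realized with the Liu-Nocedal two-loop recursion; the test problem below is a hypothetical ill-conditioned quadratic standing in for an ill-posed inverse problem, and the Armijo backtracking line search is a simplification of the line searches the paper's solvers use:

```python
import numpy as np

def lbfgs(f, grad_f, x0, m=5, tol=1e-6, max_iter=500):
    """Minimal L-BFGS: two-loop recursion + Armijo backtracking line search."""
    x = np.asarray(x0, dtype=float)
    s_list, y_list = [], []          # last m curvature pairs (s_k, y_k)
    g = grad_f(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Two-loop recursion: apply the implicit inverse Hessian to g.
        q = g.copy()
        alphas = []
        for s, y in zip(reversed(s_list), reversed(y_list)):
            a = (s @ q) / (y @ s)
            alphas.append(a)
            q -= a * y
        if y_list:                   # scale by gamma_k (initial Hessian guess)
            q *= (s_list[-1] @ y_list[-1]) / (y_list[-1] @ y_list[-1])
        for (s, y), a in zip(zip(s_list, y_list), reversed(alphas)):
            b = (y @ q) / (y @ s)
            q += (a - b) * s
        d = -q                       # quasi-Newton search direction
        # Armijo backtracking line search.
        t, fx, slope = 1.0, f(x), g @ d
        while f(x + t * d) > fx + 1e-4 * t * slope and t > 1e-12:
            t *= 0.5
        x_new = x + t * d
        g_new = grad_f(x_new)
        s, yv = x_new - x, g_new - g
        if s @ yv > 1e-12:           # keep pair only if curvature is positive
            s_list.append(s)
            y_list.append(yv)
            if len(s_list) > m:      # limited memory: drop the oldest pair
                s_list.pop(0)
                y_list.pop(0)
        x, g = x_new, g_new
    return x

# Hypothetical ill-conditioned quadratic, f(x) = 0.5 x'Hx - b'x, with
# condition number 1e4, mimicking the conditioning of an ill-posed problem.
n = 20
h = np.logspace(0, 4, n)             # eigenvalues of the diagonal Hessian H
b = np.ones(n)
f = lambda x: 0.5 * x @ (h * x) - b @ x
grad = lambda x: h * x - b
x_star = b / h                       # exact minimizer for reference
x = lbfgs(f, grad, np.zeros(n))
print("error:", np.linalg.norm(x - x_star))
```

With limited memory m = 5, the recursion never forms the dense inverse Hessian; this is the property that makes L-BFGS (and the L-BFGS/T-N hybrid) attractive for the large-scale adjoint problems the paper compares.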
Pages: 63-87
Page count: 25
References
37 in total
[1]  
ADAMS LM, 1996, P AMS IMS SIAM SUMM
[2]  
Alekseev AK, 2001, INT J NUMER METH FL, V36, P971, DOI 10.1002/fld.163
[3]   The analysis of an ill-posed problem using multi-scale resolution and second-order adjoint techniques [J].
Alekseev, AK ;
Navon, IM .
COMPUTER METHODS IN APPLIED MECHANICS AND ENGINEERING, 2001, 190 (15-17) :1937-1953
[4]  
[Anonymous], 1980, ACM Trans Math Softw (TOMS), DOI DOI 10.1145/355921.355933
[5]  
[Anonymous], 1972, NUMERICAL METHODS NO
[6]  
[Anonymous], 1996, EXTREME METHODS SOLV
[7]   An analysis of a hybrid optimization method for variational data assimilation [J].
Daescu, DN ;
Navon, IM .
INTERNATIONAL JOURNAL OF COMPUTATIONAL FLUID DYNAMICS, 2003, 17 (04) :299-306
[8]   Performance of hybrid methods for large-scale unconstrained optimization as applied to models of proteins [J].
Das, B ;
Meirovitch, H ;
Navon, IM .
JOURNAL OF COMPUTATIONAL CHEMISTRY, 2003, 24 (10) :1222-1231
[9]   VARIABLE METRIC METHOD FOR MINIMIZATION [J].
Davidon, William C. .
SIAM JOURNAL ON OPTIMIZATION, 1991, 1 (01) :1-17
[10]   QUASI-NEWTON METHODS, MOTIVATION AND THEORY [J].
DENNIS, JE ;
MORE, JJ .
SIAM REVIEW, 1977, 19 (01) :46-89