MANIFOLD LEARNING FOR ACCELERATING COARSE-GRAINED OPTIMIZATION

Cited by: 3
Authors
Pozharskiy, Dmitry [1]
Wichrowski, Noah J. [2]
Duncan, Andrew B. [3]
Pavliotis, Grigorios A. [3]
Kevrekidis, Ioannis G. [4,5]
Affiliations
[1] Princeton Univ, Dept Chem & Biol Engn, Princeton, NJ 08544 USA
[2] Johns Hopkins Univ, Dept Appl Math & Stat AMS, Baltimore, MD 21218 USA
[3] Imperial Coll London, Dept Math, London SW7 2AZ, England
[4] Johns Hopkins Univ, Dept Chem & Biomol Engn, Baltimore, MD 21218 USA
[5] Johns Hopkins Univ, AMS, Baltimore, MD 21218 USA
Source
JOURNAL OF COMPUTATIONAL DYNAMICS | 2020, Vol. 7, No. 2
Funding
Engineering and Physical Sciences Research Council (EPSRC), UK
Keywords
Manifold learning; derivative-free optimization; Langevin equation; simulated annealing; diffusion maps; geometric harmonics
Keywords Plus
GLOBAL OPTIMIZATION; GENERAL-METHOD; TIME-SERIES; DIFFUSIONS; MODELS; ALGORITHMS; TOOL
DOI
10.3934/jcd.2020021
Chinese Library Classification
O29 [Applied Mathematics]
Subject Classification Code
070104
Abstract
Algorithms proposed for solving high-dimensional optimization problems with no derivative information frequently encounter the "curse of dimensionality," becoming ineffective as the dimension of the parameter space grows. One feature of a subclass of such problems that are effectively low-dimensional is that only a few parameters (or combinations thereof) are important for the optimization and must be explored in detail. Knowing these parameters/combinations in advance would greatly simplify the problem and its solution. We propose the data-driven construction of an effective (coarse-grained, "trend") optimizer, based on data obtained from ensembles of brief simulation bursts with an "inner" optimization algorithm, that has the potential to accelerate the exploration of the parameter space. The trajectories of this "effective optimizer" quickly become attracted onto a slow manifold parameterized by the few relevant parameter combinations. We obtain the parameterization of this low-dimensional, effective optimization manifold on the fly using data mining/manifold learning techniques on the results of simulation (inner optimizer iteration) burst ensembles and exploit it locally to "jump" forward along this manifold. As a result, we can bias the exploration of the parameter space toward the few important directions and, through this "wrapper algorithm," speed up the convergence of traditional optimization algorithms.
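The abstract describes the wrapper scheme only at a high level: run an ensemble of short bursts of an inner optimizer, extract a slow coordinate from the resulting point cloud with diffusion maps, and "jump" along the recovered direction. The Python sketch below illustrates one plausible realization on a toy quadratic with one slow and many stiff directions; the objective f, the burst and tail lengths, and the linear jump rule (a crude stand-in for the paper's geometric-harmonics extension) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy objective: one slow, important direction (x[0]) and many stiff,
# quickly-equilibrating ones, so the problem is effectively one-dimensional.
KAPPA = 100.0

def f(x):
    return 0.5 * (x[0] - 3.0) ** 2 + 0.5 * KAPPA * np.sum(x[1:] ** 2)

def grad(x):
    g = KAPPA * x.copy()
    g[0] = x[0] - 3.0
    return g

def inner_burst(x0, n_steps=100, h=1e-3, temp=1e-6):
    """Brief 'inner optimizer' run: noisy gradient descent (overdamped Langevin)."""
    x, traj = x0.copy(), np.empty((n_steps + 1, x0.size))
    traj[0] = x
    for k in range(n_steps):
        x = x - h * grad(x) + np.sqrt(2.0 * temp * h) * rng.standard_normal(x.size)
        traj[k + 1] = x
    return traj

def leading_diffusion_coordinate(X):
    """First nontrivial diffusion-map coordinate of the point cloud X."""
    D2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-D2 / np.median(D2))            # Gaussian kernel, median bandwidth
    d = K.sum(axis=1)
    A = K / np.sqrt(np.outer(d, d))            # symmetrized Markov normalization
    _, vecs = np.linalg.eigh(A)                # eigenvalues in ascending order
    psi = vecs / np.sqrt(d)[:, None]           # right eigenvectors of D^-1 K
    return psi[:, -2]                          # [:, -1] is the trivial constant mode

def coarse_step(x, n_replicas=15, tail=50):
    """One wrapper iteration: burst ensemble -> manifold coordinate -> jump."""
    bursts = [inner_burst(x + 1e-2 * rng.standard_normal(x.size))
              for _ in range(n_replicas)]
    X = np.vstack([b[tail:] for b in bursts])  # keep post-transient points only
    phi = leading_diffusion_coordinate(X)
    # A local linear model of the states against phi gives a tangent direction
    # along the slow manifold (a crude stand-in for geometric harmonics).
    phi_c = phi - phi.mean()
    v = phi_c @ (X - X.mean(axis=0)) / (phi_c @ phi_c)
    v /= np.linalg.norm(v)
    xb = X[np.argmin([f(p) for p in X])]       # best point sampled so far
    jumps = [xb + s * v for s in (-4, -2, -1, -0.5, 0.5, 1, 2, 4)]
    return min([xb] + jumps, key=f)            # accept the best trial jump

x = np.full(10, 5.0)
for it in range(8):
    x = coarse_step(x)
    print(f"iteration {it}: f = {f(x):.6f}")
```

Swapping the quadratic toy for an expensive black-box objective, and the noisy gradient step for any derivative-free inner optimizer (e.g., simulated annealing, as the paper's keywords suggest), would leave this outer loop unchanged; that separation is the point of the "wrapper" construction.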
Pages: 511-536
Page count: 26