Robust Black-Box Optimization for Stochastic Search and Episodic Reinforcement Learning

Cited: 0
Authors
Huttenrauch, Maximilian [1]
Neumann, Gerhard [1]
Affiliations
[1] Karlsruhe Inst Technol, Dept Comp Sci, Karlsruhe, Germany
Keywords
black-box optimization; stochastic search; derivative-free optimization; evolution strategies; episodic reinforcement learning; EVOLUTIONARY
DOI
Not available
Chinese Library Classification
TP [Automation technology, computer technology]
Discipline code
0812
Abstract
Black-box optimization is a versatile approach to solving complex problems where the objective function is not explicitly known and no higher-order information is available. Due to its general nature, it finds widespread application in function optimization as well as machine learning, especially in episodic reinforcement learning tasks. While traditional black-box optimizers such as CMA-ES may falter in noisy scenarios due to their reliance on ranking-based transformations, a promising alternative is the Model-based Relative Entropy Stochastic Search (MORE) algorithm. MORE can be derived from natural policy gradients and compatible function approximation, and it directly optimizes the expected fitness without resorting to rankings. However, in its original formulation, MORE often cannot achieve state-of-the-art performance. In this paper, we improve MORE by decoupling the updates of the search distribution's mean and covariance, by an improved entropy-scheduling technique based on an evolution path that yields faster convergence, and by a simplified model-learning approach compared to the original paper. We show that our algorithm performs comparably to state-of-the-art black-box optimizers on standard benchmark functions. Furthermore, on episodic reinforcement learning tasks it clearly outperforms ranking-based methods, other policy-gradient-based black-box algorithms, and state-of-the-art deep reinforcement learning algorithms.
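To make the abstract's setting concrete, the following is a simplified, illustrative sketch of a Gaussian stochastic-search loop that optimizes expected fitness via a weighted transformation of raw fitness values rather than rankings. This is not the authors' MORE algorithm: it fits no surrogate model and uses no trust-region or entropy constraint; the exponential fitness weighting, the test function, and all names here are assumptions made for illustration only.

```python
import numpy as np

def sphere(x):
    """Toy objective (maximization): negative squared norm, optimum at 0."""
    return -np.sum(x ** 2)

def gaussian_stochastic_search(f, dim, iters=200, pop=32, seed=0):
    """Minimal Gaussian search-distribution loop.

    Samples a population from N(mean, cov), weights samples by an
    exponential transformation of their (raw, not ranked) fitness,
    then updates mean and covariance as weighted moments.
    """
    rng = np.random.default_rng(seed)
    mean = rng.normal(size=dim)
    cov = np.eye(dim)
    for _ in range(iters):
        samples = rng.multivariate_normal(mean, cov, size=pop)
        fitness = np.array([f(x) for x in samples])
        # Weights from raw fitness values (not ranking-based).
        w = np.exp((fitness - fitness.max()) / (fitness.std() + 1e-9))
        w /= w.sum()
        # Mean update: weighted recombination of the samples.
        new_mean = w @ samples
        # Covariance update: weighted spread around the *old* mean,
        # plus a small jitter to keep the matrix positive definite.
        diff = samples - mean
        new_cov = (w[:, None] * diff).T @ diff + 1e-6 * np.eye(dim)
        mean, cov = new_mean, new_cov
    return mean

x_star = gaussian_stochastic_search(sphere, dim=5)
```

Because the update acts on raw fitness values, noise enters the weights directly instead of being flattened into ranks; the abstract's point is that rank-based schemes discard exactly this magnitude information, which MORE retains.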
Pages: 1-44
Page count: 44
Related papers
50 records in total
[21] Seifi, Farshad; Azizi, Mohammad Javad; Niaki, Seyed Taghi Akhavan. A data-driven robust optimization algorithm for black-box cases: An application to hyper-parameter optimization of machine learning algorithms. Computers & Industrial Engineering, 2021, 160.
[22] Wang, Hongyan; Xu, Hua; Yuan, Yuan; Deng, Junhui; Sun, Xiaomin. Noisy Multiobjective Black-Box Optimization using Bayesian Optimization. Proceedings of the 2019 Genetic and Evolutionary Computation Conference Companion (GECCO '19 Companion), 2019: 239-240.
[23] Kieslich, Chris A.; Boukouvala, Fani; Floudas, Christodoulos A. Optimization of black-box problems using Smolyak grids and polynomial approximations. Journal of Global Optimization, 2018, 71(4): 845-869.
[24] Kudva, Akshay; Sorourifar, Farshud; Paulson, Joel A. Constrained robust Bayesian optimization of expensive noisy black-box functions with guaranteed regret bounds. AIChE Journal, 2022, 68(12).
[25] Papageorgiou, Dimitri J.; Kronqvist, Jan; Ramanujam, Asha; Kor, James; Kim, Youngdae; Li, Can. Solution polishing via path relinking for continuous black-box optimization. Optimization Letters, 2025, 19(3): 463-504.
[26] Mak, Simon; Wu, C. F. Jeff. Analysis-of-Marginal-Tail-Means (ATM): A Robust Method for Discrete Black-Box Optimization. Technometrics, 2019, 61(4): 545-559.
[27] Dedoncker, Sander; Desmet, Wim; Naets, Frank. Generating set search using simplex gradients for bound-constrained black-box optimization. Computational Optimization and Applications, 2021, 79(1): 35-65.
[29] Vicente, Luis Nunes. Implicitly and densely discrete black-box optimization problems. Optimization Letters, 2009, 3(3): 475-482.