Automatic MILP solver configuration by learning problem similarities

Cited by: 1
Authors
Hosny, Abdelrahman [1]
Reda, Sherief [1,2]
Affiliations
[1] Brown Univ, Dept Comp Sci, Providence, RI 02912 USA
[2] Brown Univ, Sch Engn, Providence, RI 02912 USA
Keywords
Mixed integer linear programming; Algorithm configuration; Metric learning; Deep learning; ALGORITHM;
DOI
10.1007/s10479-023-05508-x
CLC classification
C93 [Management Science]; O22 [Operations Research]
Discipline classification codes
070105; 12; 1201; 1202; 120202
Abstract
A large number of real-world optimization problems can be formulated as Mixed Integer Linear Programs (MILP). MILP solvers expose numerous configuration parameters to control their internal algorithms. Solutions, and their associated costs or runtimes, are significantly affected by the choice of the configuration parameters, even when problem instances have the same number of decision variables and constraints. On one hand, using the default solver configuration leads to suboptimal solutions. On the other hand, searching and evaluating a large number of configurations for every problem instance is time-consuming and, in some cases, infeasible. In this study, we aim to predict configuration parameters for unseen problem instances that yield lower-cost solutions without the time overhead of searching and evaluating configurations at solving time. Toward that goal, we first investigate the cost correlation of MILP problem instances that come from the same distribution when solved using different configurations. We show that instances that have similar costs using one solver configuration also have similar costs using another solver configuration in the same runtime environment. After that, we present a methodology based on Deep Metric Learning to learn MILP similarities that correlate with their final solutions' costs. At inference time, given a new problem instance, it is first projected into the learned metric space using the trained model, and configuration parameters are instantly predicted using previously explored configurations from the nearest neighbor instance in the learned embedding space. Empirical results on real-world problem benchmarks show that our method predicts configuration parameters that improve solutions' costs by up to 38% compared to existing approaches.
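The inference step described in the abstract can be illustrated with a short sketch. The code below is not the authors' implementation; it assumes a trained encoder (the metric-learning model), precomputed embeddings of previously explored instances, and a table of their best-known solver configurations, all of which are illustrative stand-ins for components the paper describes but does not specify here.

```python
# Minimal sketch of nearest-neighbor configuration retrieval in a learned metric
# space. The encoder, feature vectors, and configuration dictionaries are
# hypothetical placeholders, not the authors' actual model or parameter names.
import numpy as np

def predict_configuration(new_features, encoder, train_embeddings, train_configs):
    """Embed an unseen MILP instance and reuse the configuration of its
    nearest neighbor in the learned embedding space."""
    z = encoder(new_features)                             # project into the metric space
    dists = np.linalg.norm(train_embeddings - z, axis=1)  # distances to known instances
    return train_configs[int(np.argmin(dists))]           # config of the most similar one

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    W = rng.standard_normal((8, 4))                       # placeholder for a trained network
    encoder = lambda x: x @ W
    # Three previously explored instances, each paired with the best configuration
    # found for it offline (parameter names are made up for illustration).
    train_embeddings = np.vstack([encoder(rng.standard_normal(8)) for _ in range(3)])
    train_configs = [{"presolve": "on", "heuristic_effort": 0.1},
                     {"presolve": "off", "heuristic_effort": 0.8},
                     {"presolve": "on", "heuristic_effort": 0.5}]
    print(predict_configuration(rng.standard_normal(8), encoder,
                                train_embeddings, train_configs))
```

In the paper, the embedding is produced by a model trained with a deep metric learning objective so that instances with similar configuration-cost behavior land close together; the sketch only shows how such an embedding would be used at inference time.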
Pages: 909-936
Number of pages: 28