Flexible Differentiable Optimization via Model Transformations

Cited by: 0
Authors
Besancon, Mathieu [1 ]
Garcia, Joaquim Dias [2 ,3 ]
Legat, Benoit [4 ]
Sharma, Akshay [5 ]
Affiliations
[1] Zuse Inst Berlin, D-14195 Berlin, Germany
[2] PSR, BR-22250040 Rio De Janeiro, RJ, Brazil
[3] Pontificia Univ Catolica Rio de Janeiro, BR-22451900 Rio De Janeiro, RJ, Brazil
[4] Katholieke Univ Leuven, STADIUS Ctr Dynam Syst Signal Proc & Data Analyt, Dept Elect Engn ESAT, B-3001 Leuven, Belgium
[5] Columbia Univ, New York, NY 10027 USA
Funding
European Research Council;
Keywords
differentiable optimization; implicit differentiation; automatic differentiation; convex optimization; conic optimization; EQUATIONS;
DOI
10.1287/ijoc.2022.0283
Chinese Library Classification
TP39 [Computer applications];
Discipline codes
081203 ; 0835 ;
Abstract
We introduce DiffOpt.jl, a Julia library to differentiate through the solution of optimization problems with respect to arbitrary parameters present in the objective and/or constraints. The library builds upon MathOptInterface, thus leveraging the rich ecosystem of solvers and composing well with modeling languages like JuMP. DiffOpt offers both forward and reverse differentiation modes, enabling multiple use cases from hyperparameter optimization to backpropagation and sensitivity analysis, bridging constrained optimization with end-to-end differentiable programming. DiffOpt is built on two known rules for differentiating the quadratic programming and conic programming standard forms. However, thanks to its ability to differentiate through model transformations, the user is not limited to these forms and can differentiate with respect to the parameters of any model that can be reformulated into these standard forms. This notably includes programs mixing affine conic constraints with convex quadratic constraints or a convex quadratic objective.
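The workflow the abstract describes can be sketched in Julia. The snippet below is an illustrative assumption based on DiffOpt's documented interface (`DiffOpt.diff_optimizer`, `DiffOpt.reverse_differentiate!`, and the `ReverseVariablePrimal`/`ReverseConstraintFunction` attributes), not an excerpt from the paper; exact API names may differ across library versions.

```julia
# Sketch (assumed DiffOpt API): differentiate the solution of a small LP
# with respect to its constraint data, using reverse mode.
using JuMP, DiffOpt, HiGHS

# Wrap any MathOptInterface-compatible solver in DiffOpt's
# differentiable optimizer; model transformations to the supported
# standard forms are handled internally.
model = Model(() -> DiffOpt.diff_optimizer(HiGHS.Optimizer))
@variable(model, x)
@constraint(model, cons, x >= 3.0)
@objective(model, Min, 2x)
optimize!(model)  # optimal solution: x = 3

# Reverse mode: seed the sensitivity of the optimal x, then pull it
# back onto the constraint function (backpropagation through the solve).
MOI.set(model, DiffOpt.ReverseVariablePrimal(), x, 1.0)
DiffOpt.reverse_differentiate!(model)
dcons = MOI.get(model, DiffOpt.ReverseConstraintFunction(), cons)
```

Reverse mode fits backpropagation-style use cases (many parameters, one scalar seed), while the forward mode mentioned in the abstract is preferable when only a few parameters are perturbed.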
Pages: 456 / 478
Page count: 24