A 0/1 Constrained Optimization Solving Sample Average Approximation for Chance Constrained Programming

Cited by: 0
Authors
Zhou, Shenglong [1 ]
Pan, Lili [2 ]
Xiu, Naihua [1 ]
Li, Geoffrey Ye [3 ]
Affiliations
[1] Beijing Jiaotong Univ, Sch Math & Stat, Beijing 100044, Peoples R China
[2] Shandong Univ Technol, Dept Math, Shandong 255000, Peoples R China
[3] Imperial Coll London, Dept Elect & Elect Engn, London SW7 2AZ, England
Funding
National Key R&D Program of China; National Natural Science Foundation of China;
Keywords
0/1 constrained optimization; sample average approximation; chance constrained programming; optimality conditions; semismooth Newton method; local convergence rate; PROBABILITY FUNCTIONS; CONVEX APPROXIMATIONS; GRADIENT FORMULAS; NEWTON; REGULARIZATION; CONVERGENCE; MANAGEMENT; ALGORITHM; NETWORKS; SYSTEMS;
DOI
10.1287/moor.2023.0149
Chinese Library Classification (CLC)
C93 [Management]; O22 [Operations Research];
Discipline Classification Codes
070105 ; 12 ; 1201 ; 1202 ; 120202 ;
Abstract
Sample average approximation (SAA) is a tractable approach for dealing with chance constrained programming, a challenging stochastic optimization problem. The constraint of SAA is characterized by the 0/1 loss function, which creates considerable difficulty in devising numerical algorithms. Most existing methods are built on reformulations of SAA, such as binary integer programs or relaxed problems; viable methods that tackle SAA directly, let alone ones with theoretical guarantees, remain elusive. In this paper, we investigate a general 0/1 constrained optimization problem, providing a new way to address SAA itself rather than its reformulations. Specifically, starting by deriving the Bouligand tangent and Fréchet normal cones of the 0/1 constraint, we establish several optimality conditions. One of them can be equivalently expressed as a system of equations, enabling the development of a semismooth Newton-type algorithm. The algorithm achieves a locally superlinear or quadratic convergence rate under standard assumptions and exhibits strong numerical performance compared with several leading solvers.
Pages: 29
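To illustrate the setting described in the abstract, a minimal sketch of the standard chance constrained program and its sample average approximation follows. The notation (f, g, \xi, \alpha, N, s) is generic textbook notation assumed here for illustration; the exact formulation used in the paper may differ.

\[
\min_{x \in \mathbb{R}^n} \ f(x) \quad \text{s.t.} \quad \mathbb{P}\big( g(x,\xi) \le 0 \big) \ \ge\ 1 - \alpha,
\]
where \(\xi\) is a random vector and \(\alpha \in (0,1)\) is a prescribed risk level. Given samples \(\xi^1,\dots,\xi^N\), the SAA replaces the probability with an empirical frequency built from the 0/1 loss:
\[
\min_{x \in \mathbb{R}^n} \ f(x) \quad \text{s.t.} \quad \frac{1}{N}\sum_{i=1}^{N} \mathbf{1}\big( g(x,\xi^i) > 0 \big) \ \le\ \alpha .
\]
Writing \(z(x) = \big(g(x,\xi^1),\dots,g(x,\xi^N)\big)\) and \(s = \lfloor N\alpha \rfloor\), the SAA constraint can be stated as the 0/1 constraint \(\|z_+(x)\|_0 \le s\), that is, at most \(s\) of the sampled constraints may be violated. The discontinuity of the indicator \(\mathbf{1}(\cdot)\) (equivalently, of \(\|\cdot\|_0\)) is the source of the algorithmic difficulty noted in the abstract, and the tangent and normal cones of this constraint set are what underpin the optimality conditions and the semismooth Newton-type method.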