A 0/1 Constrained Optimization Solving Sample Average Approximation for Chance Constrained Programming

Times Cited: 0
Authors
Zhou, Shenglong [1]
Pan, Lili [2 ]
Xiu, Naihua [1 ]
Li, Geoffrey Ye [3 ]
Affiliations
[1] Beijing Jiaotong Univ, Sch Math & Stat, Beijing 100044, Peoples R China
[2] Shandong Univ Technol, Dept Math, Shandong 255000, Peoples R China
[3] Imperial Coll London, Dept Elect & Elect Engn, London SW7 2AZ, England
Funding
National Key R&D Program of China; National Natural Science Foundation of China;
Keywords
0/1 constrained optimization; sample average approximation; chance constrained programming; optimality conditions; semismooth Newton method; local convergence rate; PROBABILITY FUNCTIONS; CONVEX APPROXIMATIONS; GRADIENT FORMULAS; NEWTON; REGULARIZATION; CONVERGENCE; MANAGEMENT; ALGORITHM; NETWORKS; SYSTEMS;
DOI
10.1287/moor.2023.0149
Chinese Library Classification (CLC)
C93 [Management Science]; O22 [Operations Research];
Discipline Classification Codes
070105; 12; 1201; 1202; 120202;
Abstract
Sample average approximation (SAA) is a tractable approach for dealing with chance constrained programming, a challenging stochastic optimization problem. The constraint of SAA is characterized by the 0/1 loss function, which results in considerable complexity when devising numerical algorithms. Most existing methods have been built on reformulations of SAA, such as binary integer programs or relaxed problems. However, viable methods that directly tackle SAA remain elusive, let alone ones with theoretical guarantees. In this paper, we investigate a general 0/1 constrained optimization problem, providing a new way to address SAA itself rather than its reformulations. Specifically, starting with deriving the Bouligand tangent and Fréchet normal cones of the 0/1 constraint, we establish several optimality conditions. One of them can be equivalently expressed as a system of equations, enabling the development of a semismooth Newton-type algorithm. The algorithm achieves a locally superlinear or quadratic convergence rate under standard assumptions, along with favorable numerical performance compared with several leading solvers.
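To make the abstract's setup concrete, the following sketch (not the paper's algorithm) shows how SAA replaces a chance constraint P(g(x, ξ) ≤ 0) ≥ 1 − α with an empirical average of the 0/1 loss over N samples; the function names and the toy constraint g are illustrative assumptions.

```python
import numpy as np

# SAA of the chance constraint P(g(x, xi) <= 0) >= 1 - alpha:
# replace the probability with the empirical 0/1 loss over samples,
#     (1/N) * sum_i 1{ g(x, xi_i) > 0 } <= alpha.
# This discontinuous indicator is what makes SAA hard to optimize directly.

def saa_violation_rate(x, samples, g):
    """Empirical fraction of samples violating g(x, xi) <= 0 (the 0/1 loss)."""
    violations = np.array([g(x, xi) > 0 for xi in samples])
    return violations.mean()

def saa_feasible(x, samples, g, alpha):
    """True if x satisfies the SAA chance constraint at level alpha."""
    return saa_violation_rate(x, samples, g) <= alpha

# Toy example (assumed for illustration): g(x, xi) = xi - x with xi ~ N(0, 1),
# i.e., x must exceed the random outcome with probability at least 1 - alpha.
rng = np.random.default_rng(0)
samples = rng.standard_normal(1000)
g = lambda x, xi: xi - x
print(saa_feasible(2.0, samples, g, alpha=0.05))   # a sufficiently large x is feasible
```

The feasible set is defined through a sum of indicator functions, so it is nonconvex and discontinuous in x; the paper treats this 0/1 constraint directly instead of relaxing or linearizing it.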
Pages: 29
Related Papers
50 records in total
  • [31] Nonlinear fuzzy chance constrained programming problem
    Panda G.
    Dash J.K.
    OPSEARCH, 2014, 51 (2) : 270 - 279
  • [32] A Linear Programming Approximation of Distributionally Robust Chance-Constrained Dispatch With Wasserstein Distance
    Zhou, Anping
    Yang, Ming
    Wang, Mingqiang
    Zhang, Yuming
    IEEE TRANSACTIONS ON POWER SYSTEMS, 2020, 35 (05) : 3366 - 3377
  • [33] A high performance neural network model for solving chance constrained optimization problems
    Nazemi, Alireza
    Tahmasbi, Narges
    NEUROCOMPUTING, 2013, 121 : 540 - 550
  • [34] A simulated annealing approach for reliability-based chance-constrained programming
    Sakalli, Umit Sami
    APPLIED STOCHASTIC MODELS IN BUSINESS AND INDUSTRY, 2014, 30 (04) : 497 - 508
  • [35] Optimal blending under general uncertainties: A chance-constrained programming approach
    Yang, Yu
    COMPUTERS & CHEMICAL ENGINEERING, 2023, 171
  • [36] Managing Data Quality of the Data Warehouse: A Chance-Constrained Programming Approach
    Liu, Qi
    Feng, Gengzhong
    Tayi, Giri Kumar
    Tian, Jun
    INFORMATION SYSTEMS FRONTIERS, 2021, 23 (02) : 375 - 389
  • [37] EVALUATING VARIANCE REDUCTION TECHNIQUES WITHIN A SAMPLE AVERAGE APPROXIMATION METHOD FOR A CONSTRAINED INVENTORY POLICY OPTIMIZATION PROBLEM
    Uenlue, Yasin
    Rossetti, Manuel D.
    PROCEEDINGS OF THE 2011 WINTER SIMULATION CONFERENCE (WSC), 2011, : 1624 - 1635
  • [38] Chance-Constrained Programming: A Tool for Solving Linear Eddy Current Inverse Problem
    Zorgati, Riadh
    Duchene, Bernard
    ELECTROMAGNETIC NONDESTRUCTIVE EVALUATION (XII), 2009, 32 : 305 - 312
  • [39] Rectangular chance constrained geometric optimization
    Liu, Jia
    Peng, Shen
    Lisser, Abdel
    Chen, Zhiping
    OPTIMIZATION AND ENGINEERING, 2020, 21 (02) : 537 - 566
  • [40] An Augmented Lagrangian Decomposition Method for Chance-Constrained Optimization Problems
    Bai, Xiaodi
    Sun, Jie
    Zheng, Xiaojin
    INFORMS JOURNAL ON COMPUTING, 2021, 33 (03) : 1056 - 1069