A Stochastic Primal-Dual Method for Optimization with Conditional Value at Risk Constraints

Cited by: 0
Authors
Avinash N. Madavan
Subhonmesh Bose
Affiliations
[1] University of Illinois at Urbana-Champaign
Keywords
Primal-dual optimization; Stochastic optimization; Risk-sensitive optimization; Conditional value at risk; 90C15; 90C25; 90C30
DOI: not available
Abstract
We study a first-order primal-dual subgradient method for risk-constrained, risk-penalized optimization problems, where risk is modeled via the popular conditional value at risk (CVaR) measure. The algorithm processes independent and identically distributed samples from the underlying uncertainty in an online fashion and produces an η/√K-approximately feasible and η/√K-approximately optimal point within K iterations with a constant step size, where η increases with the tunable risk parameters of CVaR. We find optimized step sizes using our bounds and precisely characterize the computational cost of risk aversion as revealed by the growth in η. Our proposed algorithm makes a simple modification to a typical primal-dual stochastic subgradient algorithm. With this mild change, our analysis surprisingly obviates the need to impose a priori bounds or complex adaptive bounding schemes for the dual variables in order to execute the algorithm, as assumed in many prior works.
We also draw interesting parallels between our sample complexity and that derived in the literature for chance-constrained programs, which rely on a very different solution architecture.
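To make the flavor of such a method concrete, the following is a minimal sketch (not the paper's algorithm verbatim) of a primal-dual stochastic subgradient iteration on a toy CVaR-constrained problem. The problem instance, step size, iteration count, and all names are our own illustrative assumptions; CVaR is handled through the standard Rockafellar–Uryasev representation CVaR_α(Z) = min_t { t + E[(Z − t)₊]/(1 − α) }, with one i.i.d. sample drawn per iteration and a constant step size, as in the setting the abstract describes.

```python
import numpy as np

def cvar_primal_dual(alpha=0.9, step=0.01, K=200_000, seed=0):
    """Toy instance: minimize x subject to CVaR_alpha(xi - x) <= 0, xi ~ N(0,1).

    Uses the Rockafellar-Uryasev auxiliary variable t, so each iteration
    takes subgradient steps on the sampled Lagrangian
        L = x + lam * (t + max(xi - x - t, 0) / (1 - alpha)),
    descending in the primal pair (x, t) and ascending in the dual lam.
    """
    rng = np.random.default_rng(seed)
    x, t, lam = 0.0, 0.0, 0.0
    x_avg = 0.0
    for k in range(K):
        xi = rng.standard_normal()                # one i.i.d. sample per step
        exceed = 1.0 if xi - x - t > 0 else 0.0   # indicator of the (.)_+ kink
        gx = 1.0 - lam * exceed / (1.0 - alpha)   # subgradient of L in x
        gt = lam * (1.0 - exceed / (1.0 - alpha)) # subgradient of L in t
        gl = t + max(xi - x - t, 0.0) / (1.0 - alpha)  # sampled constraint value
        x -= step * gx
        t -= step * gt
        lam = max(lam + step * gl, 0.0)           # project dual onto [0, inf)
        x_avg += (x - x_avg) / (k + 1)            # running average of iterates
    return x_avg

# The averaged iterate should approach CVaR_0.9 of N(0,1), roughly 1.75.
print(cvar_primal_dual())
```

Note that the dual variable `lam` is only projected onto the nonnegative orthant; no a priori upper bound on it is imposed, which mirrors the feature of the analysis highlighted in the abstract.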
Pages: 428-460 (32 pages)
Related papers (50 total)
  • [1] Madavan, Avinash N.; Bose, Subhonmesh. A Stochastic Primal-Dual Method for Optimization with Conditional Value at Risk Constraints. Journal of Optimization Theory and Applications, 2021, 190(2): 428-460.
  • [2] Du, Kai-Xin; Chen, Xing-Min. A Primal-Dual Algorithm for Distributed Stochastic Optimization with Equality Constraints. Proceedings of the 40th Chinese Control Conference (CCC), 2021: 5586-5591.
  • [3] Su, Yanxu; Wang, Qingling; Sun, Changyin. Distributed Primal-Dual Method for Convex Optimization With Coupled Constraints. IEEE Transactions on Signal Processing, 2022, 70: 523-535.
  • [4] Jin, Lingzi; Wang, Xiao. A stochastic primal-dual method for a class of nonconvex constrained optimization. Computational Optimization and Applications, 2022, 83(1): 143-180.
  • [5] Haskell, William B.; Shanthikumar, J. George; Shen, Z. Max. Primal-Dual Algorithms for Optimization with Stochastic Dominance. SIAM Journal on Optimization, 2017, 27(1): 34-66.
  • [6] Wang, Xin; Giannakis, Georgios B. Stochastic primal-dual scheduling subject to rate constraints. 2007 IEEE Wireless Communications & Networking Conference, 2007: 1529-1533.
  • [7] Hajinezhad, Davood; Hong, Mingyi; Zhao, Tuo; Wang, Zhaoran. NESTT: A Nonconvex Primal-Dual Splitting Method for Distributed and Stochastic Optimization. Advances in Neural Information Processing Systems 29 (NIPS 2016), 2016.
  • [8] Wu, Songwei; Yu, Hang; Dauwels, Justin. Primal-Dual Stochastic Subgradient Method for Log-Determinant Optimization. 2020 IEEE International Conference on Acoustics, Speech, and Signal Processing, 2020: 3117-3121.
  • [9] Bayandina, A. S.; Gasnikov, A. V.; Gasnikova, E. V.; Matsievskii, S. V. Primal-Dual Mirror Descent Method for Constraint Stochastic Optimization Problems. Computational Mathematics and Mathematical Physics, 2018, 58(11): 1728-1736.